Immersive Visualization / IQ-Station Wiki
This site hosts information on virtual reality systems that are geared toward scientific visualization, and as such often toward VR on Linux-based systems. Thus, pages here cover various software (and sometimes hardware) technologies that enable virtual reality operation on Linux.
The original IQ-station effort was to create low-cost (for the time) VR systems that used 3DTV displays to produce CAVE/Fishtank-style VR. That effort pre-dated the rise of consumer HMD VR systems; however, the realm of midrange-cost large-fishtank systems remains important, and it has transitioned from 3DTV-based displays to short-throw projectors.
Portable VR Setup
The current large-fishtank/CAVE-style portable VR system developed at NIST continues the tradition of "lower-cost" VR systems from the Desert Research Institute, UC-Davis, Indiana University, and the Idaho National Laboratory. The available technology has changed over the years, so we take the lessons learned from the past and apply them to the technologies available today (as of this writing, 2022).
Tracking
One of the big cost savings in the modern IQ-station-style VR system comes from using consumer position-tracking technology that was designed for HMD systems but can also be used for fishtank/CAVE systems. In particular, for position tracking we use the Lighthouse system from Vive/Valve, including a tracking puck for the head and one or more of the hand-held controllers.
Projection Screen
We use the 7.5' wide dual-sided (front or rear projection) screen from ScreenWorks. It has a hinged metal frame that can easily be assembled by two people -- doable, but less easy, for one person.
Projector
(add details)
Computing Platform
For immersive visualization applications, the most popular operating system remains Linux, and thus the software we work with is primarily Linux-based. We are presently using the Rocky-8 Linux distribution, which has taken over the position that CentOS-8 held as a downstream rebuild of Red Hat Enterprise Linux.
Software
(add details)
Setup Process
This process is generally in the order in which the setup should proceed.
Projection Screen
The best steps for assembling the screen:
- Unfold and lock in the two side stands
- NOTE: the labels for the stands go on the back side of the screen; the "stage right" stand is on the right side when standing behind the screen, looking out toward the "audience" side.
- Loop the security cable around the frame stand
- Choose the stand on the side on which the computer will be placed
- Unfold and lock the screen frame
- Lean the two stands 7.5' apart (with the "stage-right" one on the right)
- If setting up as a single person, you'll have to find something to lean the stands against
- Place the screen frame on the stands with the bottom of the frame set to be 18" above the floor
- Screw the frame attachment screws through the stand into the frame
- NOTE: There are 3 screws for each side
- NOTE: Add the S-hooks or S-carabiner to the screen screw shroud prior to attaching the frame
- Unfold the screen itself and snap it onto the frame
- NOTE: ideally the screen should not touch varnished surfaces, and preferably not floor surfaces either
- NOTE also: the skirt is generally pre-attached to the screen to make it easier for setup and take-down
- Raise the screen
- Mount the power strip onto the stand near where the computer will sit
- Mount the two Lighthouse tracker units on the top of each frame using the Magic-Arm
- Point the Lighthouses mostly out toward the audience, but slightly tilted inward
Setup Projector
The BenQ projector with the Screenworks screen can operate on either side of the screen, but for a VR display where users can walk up to the screen, the obvious setup is to place the projector behind the screen.
- Place the projector on the floor (or perhaps a thin flat surface) centered behind the screen
- Plug the projector power cable into the power strip
- Connect the Volfoni RF emitter to the 3-pin mini-DIN jack
- Connect the projector to the computer
- (either HDMI or DisplayPort)
- Power on the projector
- Set the projector to its Grid test pattern
- Adjust the projector (and perhaps the screen) so that the pattern essentially fills the screen and its lines are parallel to the screen edges.
- Return the projector to the computer input (vs. the test pattern)
(add more details)
Lighthouse Tracking Hardware
(add details)
Tracking Configuration
The Lighthouse tracking system (add details)
- Start vrmenu
- Run Vrui VR Device Daemon (on vrmenu)
- Run vruiddtest (on vrmenu)
- Place a tracking controller or puck at the front end of the leg for each of the screen stands
  - (For best results, place the trackers in the same way on both sides.)
  - Record the location on the left side and record the location on the right side
  - Create a 3-vector (delta) by subtracting the left side from the right side
    - NOTE: The Y-value of the vector should be essentially zero
  - Calculate the angle of rotation needed so that movement parallel to the screen, from left to right, lies along the positive X-axis (a consolidated script for this calculation is sketched after this list):
    - One tool for making this calculation is Python3 with the math library imported:
      >>> import math
      >>> x_delta = x_left - x_right
      >>> z_delta = z_left - z_right
      >>> math.atan2(x_delta, z_delta) * (180.0 / math.pi) - 45.0
    - NOTE: sometimes "-45.0" isn't quite right, and you may need to tweak in 45° increments; it's not yet clear why
    - Maybe try the following instead:
      >>> math.atan2(-z_delta, -x_delta) * (180.0 / math.pi)
      (which of course would be simplified by calculating the delta in the opposite direction)
  - Edit the rc_sample_vruiview FreeVR configuration file
    - Under inputdevice vruidd-input-vive, set the tracking rotation to be: t2rw_rotate = 0.0, 1.0, 0.0, value;
      (in the default rc_sample_vruiview this will be under the comment label: #### Define the position trackers)
    - Save this, quit, and restart FreeVR inputs
- Set the tracker on the floor, centered between the left and right sides, and 3 feet out from the screen
  - NOTE: This will be the FreeVR origin
  - NOTE: In a typical CAVE this might be the center of the floor; for the NIST CAVE it is 4 feet from the front and left walls
- Edit (again) the vruiserver FreeVR configuration file
  - Under inputdevice vruidd-input-vive, set the tracking translation to be: t2rw_translate *= -X, -Y, -Z;
  - Save this, quit, and restart FreeVR inputs
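For convenience, the angle calculation above can be collected into a single short Python3 script. This is a minimal sketch: the tracker positions are hypothetical hand-typed placeholders (transcribed from the vruiddtest readings), not values read from the hardware, and the resulting angle still needs the same 45° sanity-checking described in the steps above.

import math

# Hypothetical tracker positions (x, y, z) transcribed from vruiddtest, with a
# tracker placed at the front of each screen-stand leg; placeholder numbers only.
x_left, y_left, z_left = 1.42, 0.03, -0.87
x_right, y_right, z_right = -0.66, 0.05, 1.15

# Delta vector between the two readings; the Y component should be essentially zero.
x_delta = x_left - x_right
y_delta = y_left - y_right
z_delta = z_left - z_right
if abs(y_delta) > 0.1:
    print("Warning: the two trackers were not at (nearly) the same height")

# Rotation (degrees) that puts left-to-right motion along the positive X-axis,
# using the formula from the steps above; the trailing -45.0 may need tweaking
# in 45-degree increments.
angle_a = math.atan2(x_delta, z_delta) * (180.0 / math.pi) - 45.0

# Alternative form given in the steps above.
angle_b = math.atan2(-z_delta, -x_delta) * (180.0 / math.pi)

print("Candidate rotation angles (degrees):", angle_a, angle_b)

Whichever angle proves correct is then used as value in the t2rw_rotate = 0.0, 1.0, 0.0, value; line of the FreeVR configuration file.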
Test the Tracking Configuration
- Ensure FreeVR inputs is running
- Move the tracking controller around the edge of the physical projection screen; this should match the rose-outlined screen in the inputs window.
- Run vrpntest from vrmenu
  - As the trackers are moved around, the values reported should be in ParaView coordinates, which are the same as the FreeVR coordinates except with the Y-value (height) of the origin set to be 5 feet off the ground (see the conversion sketch after this list)
- Quit vrpntest with Ctrl-C
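If the relationship above is read as a simple vertical offset of the origin, the conversion between the two conventions can be sketched in Python3 as follows. The function name, the sign convention, and the assumption that both systems use feet are illustrative guesses for checking vrpntest readings by hand, not something defined by vrpntest itself.

# Minimal sketch of the FreeVR-to-ParaView relationship described above,
# assuming both use feet and that the only difference is a 5-foot vertical
# offset of the origin; names and sign convention are illustrative assumptions.
PARAVIEW_ORIGIN_HEIGHT = 5.0  # feet above the floor (where the FreeVR origin sits)

def freevr_to_paraview(x, y, z):
    # Shift the height so that y = 0 sits 5 feet off the ground.
    return (x, y - PARAVIEW_ORIGIN_HEIGHT, z)

# Example: a tracker held about 5.5 feet above the FreeVR origin should report
# a Y value of roughly +0.5 in ParaView coordinates.
print(freevr_to_paraview(0.0, 5.5, 0.0))  # -> (0.0, 0.5, 0.0)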
Test ParaView
(add details here)