Immersive Visualization / IQ-Station Wiki

This site hosts information on virtual reality systems that are geared toward scientific visualization, and as such often toward VR on Linux-based systems. Thus, pages here cover various software (and sometimes hardware) technologies that enable virtual reality operation on Linux.

The original IQ-station effort was to create low-cost (for the time) VR systems making use of 3DTV displays to produce CAVE/Fishtank-style VR displays. That effort pre-dated the rise of consumer HMD VR systems; however, the realm of midrange-cost large-fishtank systems remains important, and it has transitioned from 3DTV-based systems to short-throw projectors.

Portable VR Setup

The current large-fishtank/CAVE-style portable VR system developed at NIST continues the tradition of "lower-cost" VR systems from the Desert Research Institute, UC-Davis, Indiana University, and the Idaho National Laboratory. The available technology has changed over the years, so we take the lessons learned from the past and apply them to the technologies available today (as of 2022-2024).

Tracking

One of the big cost savings in the modern IQ-station-style VR system comes from the use of consumer position-tracking technology designed for HMD systems, which can also be used for fishtank/CAVE systems. In particular, for position tracking we use the Vive Lighthouse system from HTC/Valve, including a tracking puck for the head and one or more of the hand-held controllers.

Projection Screen

We use the 7.5'-wide dual-sided (front or rear projection) screen from ScreenWorks. It has a hinged metal frame that can easily be put together by two people -- doable, but less easy, for one person.

Projector

(add details)

Computing Platform

For immersive visualization applications the most popular operating system remains Linux, and thus the software we work with is primarily Linux-based. Presently we are using the Rocky-8 Linux distribution, which has taken over the role that CentOS-8 held as a community rebuild tracking Red Hat Enterprise Linux.
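
As a quick sanity check when preparing a machine, the distribution and version can be confirmed by reading /etc/os-release; on Rocky-8 the ID field is "rocky" and the VERSION_ID starts with "8". A minimal Python3 sketch (not part of the original notes):

    # Sketch: confirm the host is running Rocky Linux 8 via /etc/os-release.
    fields = {}
    with open("/etc/os-release") as f:
        for line in f:
            line = line.strip()
            if "=" in line:
                key, _, value = line.partition("=")
                fields[key] = value.strip('"')

    print(fields.get("NAME"), fields.get("VERSION_ID"))
    if fields.get("ID") == "rocky" and fields.get("VERSION_ID", "").startswith("8"):
        print("OK: Rocky-8 detected")
    else:
        print("Warning: expected Rocky-8")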

Software

(add details)

Setup Process

These steps are listed in the general order in which the setup should proceed.

Projection Screen

The best steps for assembling the screen:

  1. Unfold and lock in the two side stands
    NOTE: the labels for the stands go on the back side of the screen; the "stage right" stand is on the right side when standing behind the screen, looking out toward the "audience" side.
  2. Loop the security cable around the frame stand
    Choose the stand on the side on which the computer will be placed
  3. Unfold and lock the screen frame
  4. Lean the two stands 7.5' apart (with the "stage-right" one on the right)
    If doing this as a one-person setup, you'll have to find something to lean the stands against
  5. Place the screen frame on the stands with the bottom of the frame set to be 18" above the floor
  6. Screw the frame attachment screws through the stand into the frame
    NOTE: There are 3 screws for each side
    NOTE: Add the S-hooks or S-carabiner to the screen screw shroud prior to attaching the frame
  7. Unfold the screen itself and snap it onto the frame
    NOTE: ideally the screen should not touch varnished surfaces, and preferably not floor surfaces either
    NOTE also: the skirt is generally pre-attached to the screen to make setup and take-down easier
  8. Raise the screen
  9. Mount the power strip onto the stand near where the computer will sit
  10. Mount the two Lighthouse tracker units on the top of each frame using the Magic-Arm
    Point the Lighthouses mostly out toward the audience, but slightly tilted inward

Setup Projector

The BenQ projector can operate on either side of the ScreenWorks screen, but for a VR display where users can walk up to the screen, the obvious setup is to place the projector behind the screen.

  1. Place the projector on the floor (or perhaps a thin flat surface) centered behind the screen
  2. Plug the projector power cord into the power strip
  3. Connect the Volfoni RF emitter to the 3-pin mini-DIN jack
  4. Connect the projector to the computer
    (either HDMI or DisplayPort)
  5. Power on the projector
  6. Set the projector to the Grid test pattern
  7. Adjust the projector (and perhaps the screen) such that the screen is essentially filled by the pattern and the lines are parallel to the edges.
  8. Return the projector to the computer input (vs. the test pattern)

(add more details)

Lighthouse Tracking Hardware

(add details)

Tracking Configuration

The Lighthouse tracking system (add details)

  1. Start vrmenu
  2. Run Vrui VR Device Daemon (on vrmenu)
  3. Run vruiddtest (on vrmenu)
  4. Place a tracking controller or puck at the front end of the leg for each of the screen stands
    (For best results, place the trackers in the same way on both sides.)
    1. Record the location on the left side and then record the location on the right side
    2. Create a 3-vector (delta) by subtracting the left side from the right side
      • NOTE: The Y-value of the vector should be essentially zero
    3. Calculate the angle of rotation needed so that movement parallel to the screen, from left to right, lies along the positive X-axis:
      • One tool for this calculation is Python3 with the math library imported
      • >>> import math
      • >>> x_delta = x_left - x_right
      • >>> z_delta = z_left - z_right
      • >>> math.atan2(x_delta, z_delta) * (180.0 / math.pi) - 45.0
      NOTE: sometimes "-45.0" isn't quite right, and you may need to tweak in 45° increments — it's not yet clear why
      Maybe try the following instead (in fact I think this may be the correct solution):
      • >>> math.atan2(-z_delta, -x_delta) * (180.0 / math.pi)
      (which of course would be simplified by calculating the delta in the opposite direction; see the consolidated sketch after this list)
    4. Edit the rc_sample_vruiview FreeVR configuration file
      • Under inputdevice vruidd-input-vive, set the tracking rotation to be (where value is the angle in degrees calculated above):
        • t2rw_rotate = 0.0, 1.0, 0.0, value;
          (in the default rc_sample_vruiview this will be under a comment label: #### Define the position trackers)
        • Save this, then quit and restart FreeVR inputs
    5. Set the tracker on the floor, centered between the left and right side, and 3 feet out from the screen
      NOTE: This will be the FreeVR origin
      NOTE: In a typical CAVE this might be the center of the floor; for the NIST CAVE it is 4 feet from the front and left walls
    6. Edit (again) the vruiserver FreeVR configuration file
      • Under inputdevice vruidd-input-vive, set the tracking translation to be (where X, Y, Z are the location reported by the tracker at the origin point):
        • t2rw_translate *= -X, -Y, -Z;
        • Save this, then quit and restart FreeVR inputs
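
The snippets above can be consolidated into one small Python3 sketch. This is only a sketch: the tracker readings are placeholder values (substitute the numbers recorded with vruiddtest), and both candidate angle formulas from the notes above are printed so that the one that behaves correctly can be chosen:

    import math

    # --- Pass 1: rotation ------------------------------------------------
    # Placeholder readings from vruiddtest with a tracker resting at the
    # front of each screen-stand leg; substitute the recorded values.
    x_left, y_left, z_left = -3.75, 0.2, -1.0     # left stand
    x_right, y_right, z_right = 3.75, 0.2, -1.2   # right stand

    x_delta = x_left - x_right
    z_delta = z_left - z_right
    print("Y difference (should be essentially zero):", y_left - y_right)

    # The two candidate formulas from the notes above; use whichever makes
    # left-to-right motion track the positive X-axis, tweaking in 45-degree
    # increments if necessary.
    angle_a = math.atan2(x_delta, z_delta) * (180.0 / math.pi) - 45.0
    angle_b = math.atan2(-z_delta, -x_delta) * (180.0 / math.pi)
    print("t2rw_rotate = 0.0, 1.0, 0.0, %.2f;   (alternate: %.2f)" % (angle_a, angle_b))

    # --- Pass 2: translation ---------------------------------------------
    # After the rotation is in the configuration file and FreeVR inputs has
    # been restarted, place the tracker on the floor at the desired origin
    # and enter its reported location here.
    x_origin, y_origin, z_origin = 0.1, 0.0, 3.0  # placeholder reading
    print("t2rw_translate *= %.2f, %.2f, %.2f;" % (-x_origin, -y_origin, -z_origin))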

Test the Tracking Configuration

  1. Ensure FreeVR inputs is running
  2. Move the tracking controller around the edge of the physical projection screen; the motion should match the rose-outlined screen in the inputs window.
  3. Run vrpntest from vrmenu
    • As the trackers are moved around, the reported values should be in ParaView coordinates, which are the same as the FreeVR coordinates except that the Y-value (height) of the origin is set to be 5 feet off the ground (see the conversion sketch after this list)
    • Quit vrpntest with Ctrl-C
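
To make the coordinate relationship concrete, here is a minimal sketch, assuming the note above means that the ParaView origin sits 5 feet above the FreeVR (floor-level) origin so that only the height value shifts:

    # Convert between FreeVR and ParaView coordinates under the assumption
    # (from the note above) that the frames differ only by a 5-foot shift
    # of the origin height.
    ORIGIN_HEIGHT_FT = 5.0

    def freevr_to_paraview(x, y, z):
        # Same X and Z; height is measured from a point 5 feet up.
        return (x, y - ORIGIN_HEIGHT_FT, z)

    def paraview_to_freevr(x, y, z):
        return (x, y + ORIGIN_HEIGHT_FT, z)

    # Example: a tracker held 5 feet off the floor at the FreeVR origin
    # should read roughly (0, 0, 0) in ParaView coordinates.
    print(freevr_to_paraview(0.0, 5.0, 0.0))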

Test ParaView

(add details here)