Immersive Visualization / IQ-Station Wiki

This site hosts information on virtual reality systems that are geared toward scientific visualization, and as such often toward VR on Linux-based systems. Thus, pages here cover various software (and sometimes hardware) technologies that enable virtual reality operation on Linux.

The original IQ-station effort was to create low-cost (for the time) VR systems that used 3DTV displays to produce CAVE/Fishtank-style VR. That effort pre-dated the rise of consumer HMD VR systems; however, the realm of midrange-cost large-fishtank systems is still important, and it has transitioned from 3DTV-based systems to short-throw projectors.

Streaming


This page describes my work developing test code (and some working code) to stream video from a running VR application to a web page, and ultimately to a web page generating WebXR_Linux output that can be viewed in an HMD.

This document divides the streaming tasks into their component elements and then brings them together at the end. Those components are:

  • Generating a test video stream (generally using FFmpeg tools)
  • Serving the WebRTC stream through a Janus WebRTC server
  • Viewing a WebRTC stream with FFmpeg tools
  • Viewing a WebRTC stream in WebGL
  • Viewing a WebRTC stream in WebXR
  • Generating a real video stream (from a VR application)

Generating a test video stream

The goal is to generate the video stream from within the VR application, but when testing the Janus WebRTC server, it's convenient to just have a test stream running. Here are some examples.

Stream parameters

In order to establish communications between the software generating the stream and the software receiving it, there must be agreement (or a means to come to agreement) on which port(s) to use for the streams. Of course these can be hard-coded into the tools (or given on the command line), but usually they are part of the configuration.

In many of the examples below, the ports used reflect how they are configured in the Janus server:

  • Web traffic: 8000 (HTTP)
  • Video port: 8004 (RTP)
  • Audio port: 8005 (RTP)
  • Janus queries (admin API): 7088 (JSON)
  • Janus signalling (WebSocket) port: 8088 (JSON)
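For reference, a Streaming-plugin mountpoint matching these ports might look like the sketch below, using the media-list syntax of the janus.plugin.streaming.jcfg.sample file that ships with Janus 1.x. The mountpoint name echoes the "h264-sample" stream seen in the Janus log later on this page, but the id and payload types here are illustrative assumptions, not a copy of my actual configuration:

# Sketch of a mountpoint in $JANUS_DIR/etc/janus/janus.plugin.streaming.jcfg
h264-sample: {
    type = "rtp"
    id = 1
    description = "Test H.264/Opus stream"
    media = (
        {
            # video arriving on the RTP video port listed above
            type = "video"
            mid = "v"
            port = 8004
            pt = 96
            codec = "h264"
        },
        {
            # audio arriving on the RTP audio port listed above
            type = "audio"
            mid = "a"
            port = 8005
            pt = 111
            codec = "opus"
        }
    )
}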

Playing from a video file

This method takes an existing video file (here a video of an early immersive volume-visualization tool), converts it to the H.264 (video) and Opus (audio) codecs, and streams them on ports 8004 and 8005 respectively:

% ffmpeg -re -i ~/Videos/crumbs.mp4 -an -c:v libx264 -profile:v baseline -b:v 1M -flags global_header -bsf dump_extra -f rtp rtp://localhost:8004 -vn -codec:a libopus -f rtp rtp://localhost:8005

This method takes the same video file, but this time streams only the video portion of the movie file, and also loops so the video will continue to stream indefinitely:

% ffmpeg -re -stream_loop -1 -i ~/Videos/crumbs.mp4 -s 426x240 -c:v libx264 -profile:v baseline -b:v 1M -r 24 -g 60 -an -f rtp rtp://127.0.0.1:8004
[ASIDE: Locally, I generally use the ffmpeg / ffplay options -hide_banner -loglevel error to reduce the (usually) extraneous output from these tools. However, for clarity, I do not include them on this page so the important options are highlighted.]

Generating a video pattern

This method generates a SMPTE color bar pattern:

% ffmpeg -re -f lavfi -i "smptehdbars=rate=30:size=1920x1080" -c:v libx264 -profile:v baseline -f rtp -sdp_file streamdata.sdp rtp://localhost:8004

This command line adds to the SMPTE color bars a message overlay that includes the local time (so you can see that the stream is changing):

% ffmpeg -re -f lavfi -i "smptehdbars=rate=30:size=1920x1080" -vf drawtext="text='YOUR MESSAGE %{localtime\:%X}':rate=30:x=(w-tw)/2:y=(h-lh)/2:fontsize=48:fontcolor=white:box=1:boxcolor=black" -c:v libx264 -profile:v baseline -pix_fmt yuv420p -preset ultrafast -tune zerolatency -crf 28 -g 60 -f rtp rtp://localhost:8004

There are additional tweaks to the H.264 codec that can be controlled by the -x264-params option:

  • keyint=<number of frames> — send an IDR (Instantaneous Decoder Refresh) frame every N frames
  • scenecut=<0 or 1> — 0 disables scene-cut detection, ensuring keyframes appear at regular intervals
  • repeat-headers=<0 or 1> — 0 stops sending SPS/PPS with every keyframe (which means the receiving software must be running first so it gets that information)
  • no-slice-header-reinsertion=<0 or 1> — 1 prevents reinserting SPS/PPS with every IDR frame
For example:

% ffmpeg -re -f lavfi -i "smptehdbars=rate=30:size=1920x1080" -vf drawtext="text='YOUR MESSAGE %{localtime\:%X}':rate=30:x=(w-tw)/2:y=(h-lh)/2:fontsize=48:fontcolor=white:box=1:boxcolor=black" -c:v libx264 -profile:v baseline -pix_fmt yuv420p -preset ultrafast -tune zerolatency -x264-params "keyint=15:scenecut=0" -crf 28 -g 60 -f rtp rtp://localhost:8004

Receiving an A/V stream

Prior to running Janus (or some other relay tool), it's possible to confirm that the RTP streams are working using ffplay (or ffmpeg). To provide the RTP information, an SDP (Session Description Protocol) file is needed. The SDP file contains the parameters of a multimedia session (such as information about an RTP stream). When using Janus, a separate port is used to convey this information to a WebRTC client, but ffplay and ffmpeg need the SDP file for this information.

Here is a basic ffplay command to play an RTP stream:

% ffplay -hide_banner -loglevel error -protocol_whitelist "file,udp,rtp" -i streamdata.sdp

To create the SDP file, an extra option can be added to the ffmpeg command that is generating the stream:

  • -sdp_file streamdata.sdp — add to the ffmpeg stream-output command line

An SDP file is a simple ASCII file with individual parameters set on each line. Comments are indicated with a semicolon (';') character. Here is a sample SDP file with comments:

; SDP file for RTP stream
; This file describes the video stream parameters for the client.

v=0                      ; Version of the SDP protocol
o=- 0 0 IN IP4 127.0.0.1 ; Origin: session ID, version, and address type (IPv4 here)
s=My Stream              ; Session name (can be any string)
c=IN IP4 127.0.0.1       ; Connection information: IP address of the stream source
t=0 0                    ; Timing: start time and end time (0 means unlimited)
m=video 8004 RTP/AVP 96  ; Media description: video, port 8004, RTP with payload type 96
a=rtpmap:96 H264/90000   ; Attribute: payload type 96 corresponds to H.264 video at 90 kHz
a=framerate:25           ; Attribute: specifies the frame rate (25 FPS)

Using Janus to relay an A/V stream

To re-transmit a video stream (and/or an audio stream) as a WebRTC (Web Real-Time Communication) connection, the Janus tool is one workable open-source solution. Details on building and configuring Janus are contained in that independent Wiki entry.

For our purposes, we'll assume that Janus is properly compiled and installed on the system, and furthermore that an appropriate Modules configuration has been created.

% module load janus
Janus version 1.2.4 loaded.
Janus configuration files are in $JANUS_DIR/etc/janus
% janus
Janus version: 1204 (1.2.4)
[...]
[h264-sample] New video stream! (#1, ssrc=1471769988, index 0)
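
If the Modules package is not available, the equivalent environment can be set by hand. Here is a minimal sketch, assuming an install prefix of /opt/janus (adjust the path to your installation):

% export JANUS_DIR=/opt/janus
% export PATH="$JANUS_DIR/bin:$PATH"
% janus -F $JANUS_DIR/etc/janus

The -F (aka --configs-folder) option points Janus at the directory holding its configuration files.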

Confirming Janus is operational

Once the Janus server is running, it is possible to start connecting both the incoming A/V stream and the outgoing WebRTC stream. However, before doing that, it is possible to send commands to the Janus server to ensure that it is operational.

This can be done via a URL in a browser, or by sending commands via the curl command-line tool:

% curl -X POST http://localhost:7088/admin/streaming -H "Content-Type: application/json" -d '{ "janus": "info", "transaction": "MyRequest", "admin_secret": "pwd"}'

The result of either of these should be a longish JSON response, either in the browser window (in the case of providing the URL to the browser) or in the terminal shell (when using the curl method). This JSON output will include version information, the configuration of Janus, and what transports and plugins are available.
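
For a quicker check that does not require the admin secret, the plain (non-admin) Janus HTTP API will also answer a simple GET "info" request; this assumes the HTTP transport is enabled on its default port of 8088:

% curl http://localhost:8088/janus/info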

If you want to see what ports Janus is using, use this command:

% ss -tunlp | grep janus

Playing the Janus WebRTC stream

Testing the Janus WebRTC stream

Before testing the WebRTC stream on a web page, ideally we can first confirm that the WebRTC stream is relayed through Janus using curl on the command line.

Presently, the details of how to accomplish a WebRTC connection via ffplay are being worked out, and will be added here once that is figured out.
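
In the meantime, the signalling side (though not the media path itself) can be exercised with curl through the regular Janus HTTP API: create a session, attach a Streaming-plugin handle, and ask the plugin to list its mountpoints. This sketch assumes the HTTP transport on port 8088; the <session_id> and <handle_id> placeholders must be filled in by hand from the "id" fields of the JSON returned by the preceding steps:

% curl -X POST http://localhost:8088/janus -d '{"janus": "create", "transaction": "t1"}'
% curl -X POST http://localhost:8088/janus/<session_id> -d '{"janus": "attach", "plugin": "janus.plugin.streaming", "transaction": "t2"}'
% curl -X POST http://localhost:8088/janus/<session_id>/<handle_id> -d '{"janus": "message", "body": {"request": "list"}, "transaction": "t3"}'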

Playing the Janus WebRTC stream in WebGL

The next step is to stream the WebRTC content into an HTML page, and I have developed a test application that streams both into an HTML canvas and onto a WebGL texture map (using Three.js).

There are a handful of JavaScript libraries that will need to be included/imported into the HTML document:

  • webrtc-adapter — used by the Janus library
  • janus.js — for connecting to the Janus server
  • three.min.js — for the WebGL texturing

NOTE: In the near future, I will post the code for my HTML document.
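
In the meantime, note that it is simplest to serve the HTML document over HTTP rather than opening it as a local file; any static web server will do. For example (the directory here is hypothetical):

% cd ~/public_html/webrtc-test
% python3 -m http.server 8080

The page can then be loaded from http://localhost:8080/ in a WebRTC-capable browser.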

Playing the Janus WebRTC stream in WebXR

(Coming Soon)

SSH Tunneling (when necessary)

Depending on firewalls, etc. between the stream-generating computer and the stream-consuming computer, there may be a need for tunneling ports through ssh.

Because there are video (and sometimes audio) streams plus control signals, there are a handful of tunnels that will need to be created:

  • -L 8000:localhost:8000 — Forwards HTTP signaling traffic.
  • -L 8088:localhost:8088 — Forwards WebSocket signaling traffic.
  • -R 8004:localhost:8004 — Forwards RTP video traffic from the client to the server.
  • -R 8005:localhost:8005 — Forwards RTP audio traffic from the client to the server.

For example:

remote-computer% ssh -L 8000:localhost:8000 -L 8088:localhost:8088 -R 8004:localhost:8004 -R 8005:localhost:8005 janus-server-computer

Note that some (most) of the streams are actually UDP traffic, while ssh tunneling only carries TCP connections, so you may need to use the Linux socat tool (to "cat" data-streams through sockets). For example, to convert one audio and one video stream from UDP to TCP on one computer, and then back to UDP on the "remote" computer, try this:

local-computer% socat -d -d UDP4-LISTEN:8004,fork TCP:127.0.0.1:8004
local-computer% socat -d -d UDP4-LISTEN:8005,fork TCP:127.0.0.1:8005

remote-computer% socat -d -d TCP4-LISTEN:8004,fork UDP:127.0.0.1:8004
remote-computer% socat -d -d TCP4-LISTEN:8005,fork UDP:127.0.0.1:8005

See Also

(Coming Soon)