Connecting FNIR to Vizard

 

Note: Currently tested and working with Vizard 6

 

Description from Biopac on FNIR: "An fNIR System provides researchers with real-time monitoring of tissue oxygenation in the brain as subjects take tests, perform tasks, view advertisements, experience ergonomic layouts, or receive stimulation. It allows researchers to quantitatively assess brain functions—such as attention, memory, planning, and problem solving—while individuals perform cognitive tasks. Monitor cognitive state of the subject in natural environments".

 

“Easily sync with stimuli presentation systems and integrate with other physiological and neurobehavioral measures that assess human brain activity, including eye tracking, pupil reflex, respiration, and electrodermal activity”.

 

After connecting your FNIR hardware, you will need to start up COBI Studio to stream the FNIR data to Vizard.

 

Under Data Source, add your hardware.

Under Marker Source, you can add markers that you can send to sync with events, such as key presses.

 

In the Network Broadcast section, add a TCP protocol broadcast and note the port number and host IP address.

 

 

Click on Prepare and make sure your hardware is set up and ready to go. Place the FNIR device under your headset and connect the left and right side connectors.

 

Click on “Experiment”. Here you can enter experiment information to catalog the data. If you are just checking the data or demoing the system, you can click “No Experiment”

 

Click on Record and then click the red triangle to begin recording your data

Next you will run a Vizard script to collect and save the data.

 

Vizard Code

 

Contact WorldViz for the “cobibiopac.py” module that is needed to send FNIR data to Vizard.

 

Use the following code as an example to send data to Vizard. Make sure to place the cobibiopac.py module in the same folder as your code (or add its location to the path).
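If you want to see the shape of that connection before you have the module, here is a minimal, hypothetical sketch that simply opens the TCP broadcast configured in COBI Studio and prints each incoming line from inside a Vizard task. The host, port, and newline-delimited packet parsing are assumptions; the cobibiopac.py module handles these details for you.

    # Hypothetical sketch: read COBI Studio's TCP broadcast from a Vizard script.
    # The host and port must match the values entered under Network Broadcast in
    # COBI Studio, and the newline-delimited parsing is an assumption.
    import socket
    import viz
    import viztask

    COBI_HOST = '127.0.0.1'   # assumption: COBI Studio runs on the same machine
    COBI_PORT = 3000          # assumption: port entered in the Network Broadcast section

    viz.go()

    def readFNIR():
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.connect((COBI_HOST, COBI_PORT))
        sock.setblocking(False)
        buffer = b''
        while True:
            try:
                buffer += sock.recv(4096)
            except socket.error:
                pass   # no new data this frame
            while b'\n' in buffer:
                line, buffer = buffer.split(b'\n', 1)
                print('fNIR sample: ' + line.decode('ascii', 'ignore').strip())
            yield viztask.waitTime(0.05)   # poll roughly 20 times per second

    viztask.schedule(readFNIR())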

For further analysis of the data, use the fNIRSoft software.

 

How to Connect your Oculus Quest to Vizard and Vizible using the new Oculus Link Update

 

  1. Make sure that your Oculus software on your PC is up to date, as well as the firmware on the Quest itself.

  2. Turn on your Quest.

  3. Plug your USB 3.0 to USB-C cable into a USB 3.0 port on your PC, then plug the USB-C end into your headset (note: the cable that comes with the Quest will not work, so you should either purchase one from Amazon, such as this one, or purchase the official Oculus Link cable from Oculus here).

  4. In Oculus, click on “Add Headset” and add the Quest, following the instructions that appear.

  5. Once plugged in, you’ll be prompted to enable Oculus Link (beta) on the Quest; select “Enable” to begin remote rendering via Oculus Link.

  6. Once connected, you can run the VizBox/Oculus touch preset from vizconnect to connect it to Vizard, or connect in the same way that you do for the Oculus Rift. See this page in the Vizard Documentation for more details. For Vizible, simply select the Oculus Preset from your choice of hardware.

Some things to consider when using Oculus Link to connect the Quest to your PC:

  • You may have some performance issues due to the compression methods used to deliver the content to the Quest

  • The Quest actually has a slightly higher resolution than the Rift S (1440x1600 per eye for the Quest vs. 1280x1440 per eye for the Rift S)

  • The Quest has a slightly lower refresh rate (72 Hz vs. 80 Hz for the Rift S)

  • The Rift S has 5 cameras for tracking, while the Quest has just 4

  • The Quest is slightly heavier at 571 grams, vs. 500 grams for the Rift S

  • If Oculus says you need to update the Quest drivers but the Quest says “no updates available”, make sure that you are not on the “public test channel” for Oculus.

How to set up the Vive Pro Eye


 

 

  • If you do not have SteamVR installed, you will need to do that first. Make a Steam account and download the Steam App from this link.

 

 

  • Plug in your headset and when prompted, install SteamVR

 

  • Run SteamVR and do a room setup

  • Download and install the Vive Pro Eye eye tracking software (SR_Runtime) from this link

 

  • Run SR_Runtime from the shortcut

 

  • Calibrate within SteamVR by clicking on the menu button on the Vive controller while wearing the headset. This should bring up the settings menu. From here click on the button that says “Vive Pro Eye”

 

  • If it is the first time you are running the Vive Pro Eye, you will need to agree to some terms and turn on eye tracking by clicking on the agreement that comes up in the headset.

 

  • Next, click “calibrate” and follow the instructions. To verify eye tracking is working, open up Vizard, go to File- Quick Open, type in “Vivepro”, and double click on viveProEyeExample.py

 

  • This example allows you to verify that the eye tracking is working. When you look at the array of balls the ones you are looking at should change color. You can also toggle a gaze intersect by pressing the “g” key.
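If you want to pull gaze data into your own script rather than the bundled example, the minimal sketch below follows the pattern used in the Vizard documentation for the Vive Pro Eye; treat the extension and method names as assumptions to verify against your Vizard version.

    # Minimal sketch of reading Vive Pro Eye gaze data in Vizard.
    # Assumes SteamVR is set up and SR_Runtime is running.
    import viz
    import vizact
    import steamvr

    viz.go()

    # Connect the headset through SteamVR and drive the main viewpoint with it
    hmd = steamvr.HMD()
    viz.link(hmd.getSensor(), viz.MainView)

    # Load the Vive Pro eye tracking extension (names assumed from the Vizard docs)
    VivePro = viz.add('VivePro.dle')
    eyeTracker = VivePro.addEyeTracker()

    def printGaze():
        # The gaze matrix is reported relative to the HMD; print its forward direction
        gazeMat = eyeTracker.getMatrix()
        print('Gaze direction (HMD space): %s' % str(gazeMat.getForward()))

    vizact.ontimer(1.0, printGaze)   # print once per second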

 

For more information on how you can use the Vive Pro Eye in your research and help with setting up studies, contact sales@worldviz.com

 

Connecting Vizard to BIOPAC AcqKnowledge

By Jeff Loomis

BIOPAC provides a large range of monitoring devices for detecting physical or physiological changes over time. The BIOPAC virtual reality platform provides:

  • Controlled and replicative experimental setups

  • Manipulation of the environment (and avatars) that would be impossible or prohibitively expensive in the real world

  • Synchronization of the events from the virtual world with physiological data

  • Biofeedback data transfer protocol for real-time digital communication, which allows the participant’s physiological feedback to control the VR world

Using physiological data with Vizard

Vizard supports interaction with the data provided from these devices using the biopacndt.py module, a Python API for connecting to BIOPAC’s AcqKnowledge software with Network Data Transfer (NDT).

Note: biopacndt.py is included with the NDT license; contact support@biopac.com if needed.

The connection to AcqKnowledge can be made locally or over the network. As long as the Respond to auto-discovery requests option is enabled in AcqKnowledge there is no need to specify a remote machine’s IP address in Vizard.

If you have AcqKnowledge installed, the biopacndt.py file is found in the following location: C:\Program Files\BIOPAC Systems, Inc\AcqKnowledge\Network Data Transfer Examples\Python 2\source. Copy and paste the file to the Vizard Python folder (C:\Program Files\WorldViz\Vizard\python). Placing files in the Vizard Python folder ensures that every Vizard project can load them.

Software setup in AcqKnowledge

Enable the Network Data Transfer (NDT) protocol in AcqKnowledge in order to stream data to and from Vizard. (NDT is a licensed feature add-on.)

  1. In AcqKnowledge, go to Display > Preferences > Networking.

  2. Select the checkboxes for Enable network data transfer and Respond to auto-discovery requests.

  3. Restart the AcqKnowledge software to activate the Network Data Transfer.

This configuration only needs to be done once.

To see event markers in AcqKnowledge, make sure the events bar is visible.

You can also toggle to show the “Event Palette” to see a timeline of events.

Vizard to AcqKnowledge

In the code below, Vizard sends a signal to drop an event marker in AcqKnowledge:
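As a rough sketch, the snippet below connects with biopacndt's auto-discovery helper and drops a global event whenever the spacebar is pressed; the event-insertion call and its arguments are assumptions to verify against the biopacndt.py and NDT documentation that ship with AcqKnowledge.

    # Sketch: send an event marker from Vizard to AcqKnowledge over NDT.
    # The insertGlobalEvent call is an assumption; verify the exact method name
    # and arguments in the biopacndt.py shipped with your AcqKnowledge license.
    import viz
    import vizact
    import biopacndt

    viz.go()

    # Auto-discover a running AcqKnowledge server (local machine or network)
    acqServer = biopacndt.AcqNdtQuickConnect()

    def dropMarker():
        # Assumed call: insert a global "flag" event at the current acquisition time
        acqServer.insertGlobalEvent('Stimulus onset', 'flag')
        print('Event marker sent to AcqKnowledge')

    # Drop a marker whenever the spacebar is pressed in the Vizard window
    vizact.onkeydown(' ', dropMarker)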

AcqKnowledge to Vizard

In the following code, Vizard prints out the data streaming from AcqKnowledge:
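A sketch of that pattern is shown below: it requests all enabled channels and registers a callback that prints each block of samples as it arrives. The class and method names follow BIOPAC's NDT example scripts and should be treated as assumptions to verify locally.

    # Sketch: print data streaming from AcqKnowledge into Vizard via NDT.
    # Class and method names follow BIOPAC's Network Data Transfer examples;
    # verify them against the biopacndt.py included with AcqKnowledge.
    import viz
    import biopacndt

    viz.go()

    # Auto-discover the AcqKnowledge server and request all enabled channels
    acqServer = biopacndt.AcqNdtQuickConnect()
    channels = acqServer.DeliverAllEnabledChannels()

    def printData(index, frame, channelsInSlice):
        # Called for each frame of samples received from AcqKnowledge
        print('Sample %d: %s' % (index, str(frame)))

    # Start a data server on the single-connection port and register the callback
    dataServer = biopacndt.AcqNdtDataServer(acqServer.getSingleConnectionModePort(), channels)
    dataServer.RegisterCallback('PrintData', printData)
    dataServer.Start()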

How to Set Up an Eye Tracking Experiment using the WorldViz VR Eye Tracking Analytics Lab

 

This article will introduce you to the most important features of the WorldViz VR Eye Tracking Analytics Lab and show you how you can easily modify the included template to use your own 3D assets. Once set up, you can collect eye tracking data related to your 3D assets such as the number of fixations (as well as average and total fixation time), pupil diameter, time stamps, gaze intersect visualization and position data, 3D gaze path visualizations, and more! For more information on this or any WorldViz products, contact sales@worldviz.com

 

For a video guide of this tutorial click here

 

 

Getting Started and Setting Up Your Scene

 

When you first install the VR Eye Tracking Analytics Lab, you will have a folder with some template scripts you can use to build an eye tracking experiment, as well as folders to store your resources, saved recordings, and data files. We will start by loading your environment and fixation objects into the configuration tool, Vizard’s “Inspector”.

 

First, open up Inspector (in Vizard- under “Tools- Inspector”) and load your environment model by going to “File- Open”.

 

You can find 3D models on websites like Sketchfab, create them in 3D modeling software, or use various other sources. For more information on getting models out of Sketchfab see our blog post on this topic.

 

If your environment model already includes your fixation objects, you can skip this step; otherwise, go to “File- Add” and add the objects that you would like to measure eye tracking data on. You can then use the transform tools to position, rotate, and scale the object(s).

 

 

Now you will need to save this model into the “resources” folder in the VR Eye Tracking Analytics Lab folder by choosing “File-Save As”, navigating to that folder, and then giving it a name.

 

 

Next, copy the name of your first object by right clicking on its name in the Scene Graph. You can also rename the object if you like by choosing “rename” from the context menu.

 

 

The next step is to open up the “eye_tracker_complete.py” script by either right clicking and choosing “edit” (if you have chosen Vizard as your Python editing tool), or opening Vizard and going to “File-Open” to load this script.

 

 

Since you copied the name of your object of interest in the last step, all you need to do now is paste that name into the script where it says “gazeObject1” after “getChild”, overwriting the existing objects (the default template comes with a basketball, soccer ball, and volleyball already set up), and then also in the location where the gaze objects are labeled for the data files (line 30). If you wish to use more or fewer than the three included objects, just remove lines from, or add lines to, the gazeObject and gazeObjectsDict sections.
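For illustration, the edited lines might look roughly like the hypothetical sketch below (the node names and the exact layout of the template will differ in your copy):

    # Hypothetical illustration of the edits described above.
    # 'painting' and 'sculpture' are placeholder node names copied from Inspector,
    # and the environment file is the one you saved into the "resources" folder.
    import vizfx   # already imported in the template

    env = vizfx.addChild('resources/my_environment.osgb')

    gazeObject1 = env.getChild('painting')
    gazeObject2 = env.getChild('sculpture')

    # Labels used for each gaze object in the saved data files
    # (the exact structure of gazeObjectsDict in your template may differ)
    gazeObjectsDict = {
        gazeObject1: 'painting',
        gazeObject2: 'sculpture',
    }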

 

 

 

Repeat this step for all the objects you wish to collect and visualize eye tracking data with (going back to Inspector, right clicking to copy the name of the object and pasting the name into the gazeObject list and “gazeObjectsDict”).

 

 

Now change the file name used for the “env” model to the name of your saved environment.

 

If you wish to change the fixation threshold you can change this on Line 35 next to “threshold”. The default is set to 500 milliseconds (meaning if you stare at an object for more than 500 milliseconds it will register as a fixation).

 

 

And that’s it! Your eye tracking experiment is now set up to run with your eye tracker of choice so you can collect and visualize data.

 

Choosing your Hardware and Preferences

 

To choose your hardware, open the “config” file in Vizard and set the device name to either the Vive Pro Eye, Pupil Labs, Tobii VR (for the original Tobii Vive integration), or a desktop mode that simulates an eye tracker (if you don’t have access to one or just wish to prototype on a desktop). You can also choose in the config file whether you want to save a video recording of your experiment.

 

 

Running the Experiment

 

To run the experiment, just click the green arrow in the Vizard interface, press the “F5” key, or go to “Script- Run”. When it first opens, the user will be prompted to enter their name and ID number and click “submit” (this is so that you can more easily identify each user’s data; if you wish to skip this, you can just click “submit”).

 

 

Put on the headset and press “spacebar” to begin recording

 

 

As your experiment is running you will see a gaze point intersect on the mirrored desktop view. If you wish to have the participant see this, you can toggle it on with the “P” key. It is set to only be on the mirrored view by default. The mirrored view will also display the elapsed time of the experiment as well as the number of fixations. You can easily add more information to this viewpoint.

 

 

Analyzing the Data

 

When the experiment has finished you can now view and analyze the data. Here are a few of the ways:

 

3D gaze path: This can both be viewed on the desktop, or also in a headset (so you can walk around and see it in 3D).

 

 

Bar chart of fixations per object, total view time, and average fixation time:

 

Experiment data text file showing a timeline of when each object was fixated on, as well as a summary of fixations.

 

 

Tracking data showing a timestamp along with the x,y,z position of the gaze intersect point and pupil diameter. This can also be changed to a .csv file so you can use a spreadsheet editor to calculate things like average pupil diameter, etc.
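For example, once the tracking data has been saved as a .csv, a few lines of Python can compute the average pupil diameter; the file name and the 'pupil_diameter' column header below are assumptions to match against the header row the Analytics Lab actually writes.

    # Sketch: compute the average pupil diameter from an exported tracking .csv.
    # The file name and the 'pupil_diameter' column header are assumptions.
    import csv

    diameters = []
    with open('data/tracking_data.csv') as f:
        for row in csv.DictReader(f):
            value = row.get('pupil_diameter')
            if value:
                diameters.append(float(value))

    if diameters:
        print('Average pupil diameter: %.3f' % (sum(diameters) / len(diameters)))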

 

 

 

Video Recording Playback

 

After you’ve run your experiment there will be a video recording saved in the “recordings” folder (if you’ve selected this option in the config file). This will allow you to review the experiment while seeing the gaze intersect point and a time stamp. Using this, you can do such things as match what a user was looking at in the scene with a metric such as pupil diameter.

 

In addition, you can add your own custom events (such as gaze-based interactions) by placing custom code inside the following functions, as in the sketch below:

  • onGazeBegin: triggered as soon as a gaze object is seen.

  • onGazeEnd: triggered when the user’s gaze point is no longer intersecting with that object.

  • onGazeTime: triggered only once an object is in fixation.
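As a hypothetical example, the snippet below highlights an object while it is being looked at and logs a fixation; the single gaze-object argument in each callback is an assumption, so match the signatures to the functions already defined in the template.

    # Hypothetical custom code placed inside the template's gaze callbacks.
    # The single "node" argument is an assumption; match the template's signatures.
    import viz

    def onGazeBegin(node):
        # Runs as soon as the gaze intersect first hits a gaze object
        node.color(viz.YELLOW)        # highlight the object being looked at

    def onGazeEnd(node):
        # Runs when the gaze point leaves the object
        node.color(viz.WHITE)         # restore the original color

    def onGazeTime(node):
        # Runs once the fixation threshold (e.g. 500 ms) has been exceeded
        print('Fixation registered on %s' % node)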

 

Spherical video and images

 

You can also use 360 videos and images for your experiment (both monoscopic and stereoscopic) instead of 3D models.

 

 

Heatmaps

 

For visualizing heatmaps you need to use the “heatmap” script found in the folder of the same name. From here you can swap out the environment or use a 360 video or image to view a heatmap overlay.

 

So that’s a quick overview of setting up an eye tracking experiment using the VR Eye Tracking Analytics Lab. In addition to the main script, there are also smaller scripts that highlight specific features, such as 3D data visualizations, as well as a demo tailored to using a library of 360 videos and images.

 

For more information on our Eye Tracking Analytics Lab or any WorldViz products, please contact sales@worldviz.com

How to set up the Oculus Rift S

Here are some quick instructions on how to get your Rift S set up and going with Vizard and Vizible.

 

  1. Download and install the Oculus app from www.oculus.com/setup

 

  2. Plug in the USB cable to a USB 3.0 port

  3. Plug the DisplayPort cable into a DisplayPort output on your video card (you can also use a Mini DisplayPort adapter if you only have a Mini DisplayPort output)

 

  4. In the Oculus App, click on “Devices”

 

  5. Select your Rift S (it should automatically show up once connected) and select “Device Setup”

 

  6. Put a battery in each controller and follow the instructions in the Oculus app

 

  7. Hold the menu and “Y” buttons to pair the left controller until the light stops blinking. For the right controller, hold the Oculus and “B” buttons

 

  8. Put on the HMD. You can adjust the fit with the fit wheel on the back of the HMD and the velcro strap on top. You can also adjust the depth with the depth adjustment button on the bottom right of the headset.

  9. Determine which direction you will be facing in VR

 

  10. Place one of the controllers on the ground to confirm the ground plane

 

  11. Define your play area to calculate the Guardian boundary. Point a controller at the ground and trace an area on the ground around your center point. This can be skipped if you are not going to be walking around.

 

  12. IPD (interpupillary distance) is adjusted via software on the Rift S. To do this, click on Device settings- IPD adjustment

 

  13. Connecting to Vizard using vizconnect:

    1. Vizard includes an Oculus preset that you can then add to any scene. To do this open up Vizard and go to Tools- vizconnect

    2. Choose to save the file to a folder where you have your main Vizard script (you can name it anything you want, but we suggest “vizconnect_config”)

    3. Click on “Select from some common preset configurations”

    4. Choose the “Oculus” preset

    5. Your settings are automatically saved, so you can now close the vizconnect tool

    6. To add to your Vizard scene, add the following code to your Vizard script:
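For example, assuming the configuration was saved as vizconnect_config.py in the same folder as your script:

    # Load the saved vizconnect configuration; the file name matches step 2 above
    import viz
    import vizconnect

    vizconnect.go('vizconnect_config.py')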

 

  14. Connecting to Oculus directly in code in Vizard:
    If you wish to connect to the Oculus HMD directly in your code, add the following lines of code to your Vizard script:
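A minimal version of those lines, following the oculus module pattern from the Vizard documentation, is:

    # Connect the Rift S directly using Vizard's oculus module
    import viz
    import oculus

    viz.setMultiSample(8)
    viz.go()

    # Create the HMD and drive the main viewpoint with its tracking sensor
    hmd = oculus.Rift()
    viz.link(hmd.getSensor(), viz.MainView)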

 

  15. Connecting to Vizible

    1. To connect to Vizible, simply choose the “Oculus Touch” preset from the list of dropdown presets in the client interface (note: the “Oculus Touch” preset will still recognize the Rift S controllers).

 

Care of your Oculus Rift HMD

  • Keep away from direct sunlight

  • Clean lenses with a dry, microfiber cloth

Vivetracker Setup in Vizard

With HTC Vive Trackers you can easily set up full-body tracking in Vizard; they are now fully integrated in Vizard 6.

To attach the trackers, you will also need some straps to hold them on, such as these.

Read more

Manus VR Prime 2 Gloves Setup

For more information on using the Manus VR Gloves with Vizard, see the following Vizard documentation page

For more information on the Manus VR data gloves visit their website at https://www.manus-vr.com/

To access the latest version of Manus Core (the software needed for running the Manus gloves), sign up and log in to the Manus resource center here

More information available in the Vizard Documentation here

Vizconnect files:

Manus gloves with Vivetrackers and articulated glove models

(this is a default setup that is most commonly used)

Note that at times the offset may need to be changed for the hand. This is usually the “pre pitch” and might need to be set to “90” or “0”. It can be accessed by opening the vizconnect file, clicking on “go directly to advanced options”, and then, on the “trackers” tab, clicking on the “offsets” button.

Read more

Orientation Drift Correction with PPT’s Optical Heading Plug-in

The optical heading plug-in is used in the PPT Studio software to correct the yaw (and roll) of an InertiaCube (IC) using two markers. The two markers have to be attached to the same structure as the IC. Once initialized, the markers act as an optical compass.
Read more

Sensory Latency and glFinish

This article discusses the factors that contribute to overall sensor latency and the importance of turning on glFinish in the Vizard script to reduce render latency.
Read more
