Arduino Support in Vizard

By Jeff Loomis and Sado Rabaudi

Arduino boards are normally programmed in the Arduino language, which is similar to C++. Alternatively, Python can be used, and the easiest way to do that is to have the board and Python communicate over the Firmata protocol. The pyfirmata package installs directly from the package manager. There is a lot of information on working with Arduino in this article:

https://realpython.com/arduino-python/

Use the following steps to use Arduino with Vizard:

  1. Install the Arduino IDE.

  2. Open the Firmata sketch in the Arduino IDE (File- Examples- Firmata- StandardFirmata). See the section called Uploading the Firmata Sketch.

  3. The sketch will be loaded into a new IDE window. To upload it to the Arduino board:

  • Plug the USB cable into the PC.

  • Select the appropriate board (Tools- Board; for an Arduino Uno, select Arduino/Genuino Uno).

  • Select the port (under Tools- Port).

  • Press Upload.

  4. Next, make sure that pyfirmata is installed from the package manager by going to Vizard and choosing Tools- Package Manager- Search and then searching for “pyfirmata”.

  5. After that you should be able to run pyfirmata/Python code in Vizard to communicate with the device. There are examples in the realpython article for communication over both digital and analog pins, and a minimal Vizard example is shown below.

Vizard Sample Code:
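The original listing is not reproduced here, so the following is a minimal sketch of using pyfirmata inside a Vizard script: it blinks an LED on digital pin 13 from a Vizard timer. It assumes StandardFirmata has been uploaded to the board; the COM port name is an assumption, so replace it with the port shown in the Arduino IDE.

import viz
import vizact
from pyfirmata import Arduino, util

viz.go()

# Assumption: the board shows up on COM3; replace with your port.
board = Arduino('COM3')

# Start an iterator thread so the serial buffer does not overflow.
it = util.Iterator(board)
it.start()

led = board.get_pin('d:13:o')  # digital pin 13 as output
state = [False]

def toggleLED():
    # Flip the LED each time the Vizard timer fires.
    state[0] = not state[0]
    led.write(state[0])

vizact.ontimer(0.5, toggleLED)  # toggle every half second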

How to use the Arc Teleport in Vizard

Vizard 7 now includes the ability to move through VR using the “arc teleport” method, as well as “snap rotation” for turning. This is an effective way to navigate large spaces while avoiding the motion sickness that smooth, controller-based movement can cause. If a target is valid, you will see a green “arc” project out, with a platform showing where you will teleport to. If a target is not valid (such as the side of a wall), the arc will be red and no platform will be shown.

Presets

There are presets that you can quickly choose from to allow this to be used with either the Oculus line of headsets, Vive/Steam VR line (which includes the Vive Pro, Vive Cosmos and more), as well as the WinMR line of headsets.

 

For Vive and SteamVR headsets, use the RH trackpad to teleport (click once to place, release to teleport) and the LH trackpad for smooth locomotion. For Oculus headsets, use the RH 'B' button to teleport and the LH stick for smooth locomotion.

 

Setting Up Manually

To add an arc teleport manually, follow these steps:

 

Choose the “Advanced Configuration” menu


From the “Transports” tab, choose “Virtual- Arc Teleport”


Click “Add a group” under “arc source node”, give it a name if you like, and then select that group to be the arc source node. You can also adjust the range of the arc and, if you’re using multiple transports, choose a transportation group.


In the vizconnect “Scene Graph”, drag the group so that it is under the hand or object you want the arc to project from. In this case, we’re using the right hand.


Then, drag the avatar so that it is under the arc teleport transport you added.


Lastly, click on “Mappings” and choose which button you want to initiate the arc teleport. To have the same button both place the arc and teleport, use the “while” and “while not” options.


You can now use the arc teleport in any scene you apply this vizconnect file to.


Snap Rotation

 

For a vizconnect with added “snap” rotations, download a premade vizconnect file here (for the Vive/SteamVR preset).


 

You can modify which buttons are used to rotate under the “Events” and “Mappings” tabs.

 

To set up snap rotation manually, follow these steps:

 

Add two custom events under the “Events” tab, naming one “Snap_Left” and the other “Snap_Right”.

 

Under “Mappings”, choose which buttons you want to use to rotate.

 

Add this code to the postinit section of your vizconnect file by opening the vizconnect file directly in Vizard and editing it:

 

def turnLeft(e):
    # Rotate the transport one snap increment to the left.
    vizconnect.getRawTransport('main_transport').snapLeft()
viz.callback(viz.getEventID('SNAP_LEFT'), turnLeft)

def turnRight(e):
    # Rotate the transport one snap increment to the right.
    vizconnect.getRawTransport('main_transport').snapRight()
viz.callback(viz.getEventID('SNAP_RIGHT'), turnRight)

 

Undoing a Teleport

 

Using a method similar to adding the events for snap rotation, create an event for the undo teleport method, set your mappings, and add this code to your postinit section:

 

def UndoTeleport(e):
    # Return the user to their position before the last teleport.
    vizconnect.getRawTransport('main_transport').undoTeleport()
viz.callback(viz.getEventID('UNDO_TELEPORT'), UndoTeleport)

 

For additional modifications and questions, see the Vizard documentation or contact support@worldviz.com.

 

For more information on any WorldViz products, contact sales@worldviz.com.

 

Connecting Vizard to BIOPAC AcqKnowledge

By Jeff Loomis

BIOPAC provides a large range of monitoring devices for detecting physical or physiological changes over time. The BIOPAC virtual reality platform provides:

  • Controlled and replicative experimental setups

  • Manipulation of the environment (and avatars) that would be impossible or prohibitively expensive in the real world

  • Synchronization of the events from the virtual world with physiological data

  • Biofeedback data transfer protocol for real-time digital communication, which allows the participant’s physiological feedback to control the VR world

Using physical data with Vizard

Vizard supports interaction with the data provided from these devices using the biopacndt.py module, a Python API for connecting to BIOPAC’s AcqKnowledge software with Network Data Transfer (NDT).

Note: biopacndt.py is included with the NDT license; contact support@biopac.com if needed.

The connection to AcqKnowledge can be made locally or over the network. As long as the Respond to auto-discovery requests option is enabled in AcqKnowledge there is no need to specify a remote machine’s IP address in Vizard.

If you have AcqKnowledge installed, the biopacndt.py file is found in the following location: C:\Program Files\BIOPAC Systems, Inc\AcqKnowledge\Network Data Transfer Examples\Python 2\source. Copy the file to the Vizard Python folder (C:\Program Files\WorldViz\Vizard\python). Placing files in the Vizard Python folder ensures that every Vizard project can load them.

Software setup in AcqKnowledge

Enable the NDT Network Data Transfer protocol in AcqKnowledge in order to stream data to and from Vizard. (NDT is a licensed feature add-on.)

  1. In AcqKnowledge, go to Display > Preferences > Networking.

  2. Select the checkboxes for Enable network data transfer and Respond to auto-discovery requests.

  3. Restart the AcqKnowledge software to activate the Network Data Transfer.

This configuration only needs to be done once.

To see event markers in AcqKnowledge, make sure the events bar is visible.

You can also toggle to show the “Event Palette” to see a timeline of events.

Vizard to AcqKnowledge

In the code below, Vizard sends a signal to drop an event marker in AcqKnowledge:
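The original listing is not reproduced here, so the following is a minimal sketch modeled on BIOPAC's NDT Python examples. The quickConnect() helper and the insertGlobalEvent() method name and arguments are assumptions; verify them against the biopacndt.py shipped with your NDT license.

import viz
import vizact
import biopacndt

viz.go()

# Connect to the first AcqKnowledge server that answers auto-discovery.
acqServer = biopacndt.quickConnect()

def dropMarker():
    # Assumption: insertGlobalEvent(label, type, channel) matches your biopacndt.py.
    acqServer.insertGlobalEvent('stimulus onset', 'Flag', 'None')

# Drop an event marker in AcqKnowledge whenever the spacebar is pressed.
vizact.onkeydown(' ', dropMarker)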

AcqKnowledge to Vizard

In the following code, Vizard prints out the data streaming from AcqKnowledge:
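Again as a hedged sketch based on BIOPAC's NDT Python examples: the AcqNdtDataServer class, its RegisterCallback/Start methods, and the callback signature below are assumptions to verify against your copy of biopacndt.py.

import biopacndt

def printData(index, frame, channelsInSlice):
    # 'frame' holds one sample per enabled channel.
    print(index, frame)

acqServer = biopacndt.quickConnect()

# Ask AcqKnowledge to deliver every enabled channel over a single connection.
enabledChannels = acqServer.DeliverAllEnabledChannels()
dataServer = biopacndt.AcqNdtDataServer(acqServer.getSingleConnectionModePort(), enabledChannels)
dataServer.RegisterCallback('printData', printData)
dataServer.Start()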

How to Set Up an Eye Tracking Experiment Using the WorldViz VR Eye Tracking Analytics Lab

 

This article will introduce you to the most important features of the WorldViz VR Eye Tracking Analytics Lab and show you how to modify the included template to use your own 3D assets. Once set up, you can collect eye tracking data for your 3D assets, such as the number of fixations (along with average and total fixation time), pupil diameter, time stamps, gaze intersect visualization and position data, 3D gaze path visualizations, and more. For more information on this or any WorldViz products, contact sales@worldviz.com.

 

For a video guide of this tutorial, click here.


Getting Started and Setting Up Your Scene

 

When you first install the VR Eye Tracking Analytics Lab, you will have a folder with template scripts you can use to build an eye tracking experiment, as well as folders to store your resources, saved recordings, and data files. We will start by loading your environment and fixation objects into the configuration tool, Vizard’s “Inspector”.

 

First, open Inspector (in Vizard, under “Tools- Inspector”) and load your environment model by going to “File- Open”.

 

You can find 3D models on websites such as Sketchfab, create them in 3D modeling software, or use various other sources. For more information on getting models out of Sketchfab, see our blog post on this topic.

 

If your environment model already includes your fixation objects, you can skip this step. Otherwise, go to “File- Add” and add the objects you would like to measure eye tracking data on. You can then use the transform tools to position, rotate, and scale the object(s).


Now save this model into the “resources” folder in the VR Eye Tracking Analytics Lab folder by choosing “File- Save As”, navigating to that folder, and giving it a name.


Next, copy the name of your first object by right-clicking its name in the Scene Graph. You can also rename the object by choosing “rename” from the context menu.


The next step is to open the “eye_tracker_complete.py” script, either by right-clicking it and choosing “edit” (if you have chosen Vizard as your Python editing tool) or by opening Vizard and going to “File- Open” to load the script.


Since you copied the name of your object of interest in the last step, all you need to do now is paste that name into the script where it says “gazeObject1” after “getChild”, overwriting the existing objects. (The default template comes with a few objects already set up: a basketball, a soccer ball, and a volleyball.) Then paste it in the location where the gaze objects are labeled for the data files (line 30). If you wish to use more or fewer than the included three objects, simply remove or add lines in the gazeObject and gazeObjectsDict sections.


Repeat this step for all the objects you wish to collect and visualize eye tracking data for (going back to Inspector, right-clicking to copy the name of the object, and pasting the name into the gazeObject list and “gazeObjectsDict”), as sketched below.
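Purely as an illustration, the edited section of “eye_tracker_complete.py” might look something like the following. The gazeObject variables and gazeObjectsDict come from the template as described above, while “env” and the object names “Painting”, “Statue”, and “Vase” are hypothetical stand-ins for your own environment and Scene Graph names; the real template may structure the dictionary differently.

# Hypothetical example: replace the names with those copied from Inspector.
gazeObject1 = env.getChild('Painting')
gazeObject2 = env.getChild('Statue')
gazeObject3 = env.getChild('Vase')

# Labels used for the gaze objects in the data files (around line 30).
gazeObjectsDict = {
    'Painting': gazeObject1,
    'Statue': gazeObject2,
    'Vase': gazeObject3,
}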


Now change the “env” model name to the name of your saved environment.

 

If you wish to change the fixation threshold, you can do so on line 35 next to “threshold”. The default is 500 milliseconds (meaning that if you stare at an object for more than 500 milliseconds, it registers as a fixation).
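For instance, the relevant template line (around line 35) might simply read as follows; the exact variable name and units are assumptions, so check the template itself.

threshold = 500  # hypothetical: fixation threshold in milliseconds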


And that’s it! Your eye tracking experiment is now set up to run with your eye tracker of choice, collect data, and visualize it.

 

Choosing your Hardware and Preferences

 

To choose your hardware, open the “config” file in Vizard and set the device name to the Vive Pro Eye, Pupil Labs, Tobii VR (for the original Tobii Vive integration), or a desktop mode that simulates an eye tracker (if you don’t have access to one or just wish to prototype on a desktop). You can also choose in the config file whether to save a video recording of your experiment.
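As a purely hypothetical illustration of the idea (the real config file’s option names will likely differ; use the ones it ships with):

# Hypothetical config values; use the option names in the actual config file.
eye_tracker = 'Vive Pro Eye'   # or 'Pupil Labs', 'Tobii VR', or desktop simulation
save_video = True              # save a video recording of the experiment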


Running the Experiment

 

To run the experiment, just click the green arrow in the Vizard interface, press the “F5” key, or go to “Script- Run”. When the experiment first opens, the user will be prompted to enter their name and ID number and click “submit” (this is so you can more easily identify each user’s data; if you wish to skip this step, just click “submit”).


Put on the headset and press “spacebar” to begin recording.


As your experiment is running, you will see a gaze intersect point on the mirrored desktop view. It is shown only on the mirrored view by default; if you wish the participant to see it as well, you can toggle it on with the “P” key. The mirrored view also displays the elapsed time of the experiment and the number of fixations, and you can easily add more information to this view.


Analyzing the Data

 

When the experiment has finished, you can view and analyze the data. Here are a few of the ways:

 

3D gaze path: this can be viewed either on the desktop or in a headset (so you can walk around and see it in 3D).


Bar chart of fixation views per object, total view time, and average fixation time:

 

Experiment data text file showing a timeline of when each object was fixated on, as well as a summary of fixations.


Tracking data showing a timestamp along with the x, y, z position of the gaze intersect point and the pupil diameter. This can also be saved as a .csv file so you can use a spreadsheet editor, or a short script like the one below, to calculate things like average pupil diameter.
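As a small illustration, here is one way to compute the average pupil diameter with Python’s standard csv module. The file name “tracking_data.csv” and the column header “pupil_diameter” are assumptions; substitute the names your setup actually produces.

import csv

# Assumption: the export is a .csv file with a 'pupil_diameter' column.
with open('tracking_data.csv', newline='') as f:
    diameters = [float(row['pupil_diameter']) for row in csv.DictReader(f)]

print('average pupil diameter:', sum(diameters) / len(diameters))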


Video Recording Playback

 

After you’ve run your experiment, there will be a video recording saved in the “recordings” folder (if you’ve selected this option in the config file). This will allow you to review the experiment while seeing the gaze intersect point and a time stamp. Using this, you can match what a user was looking at in the scene with a metric such as pupil diameter.

 

In addition, you can add your own custom events (such as gaze-based interactions) by placing custom code inside the following functions, as sketched below:

  • onGazeBegin: triggered as soon as a gaze object is seen.

  • onGazeEnd: triggered when the user’s gaze point is no longer intersecting that object.

  • onGazeTime: triggered only when an object is in fixation.
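As an illustration, the bodies of these callbacks might be filled in as follows. The “obj” parameter and exact signatures are assumptions (the template’s own definitions may differ), and “fixation.wav” is a hypothetical sound file.

def onGazeBegin(obj):
    # Assumption: the callback receives the gazed-at node.
    obj.color(viz.YELLOW)  # highlight the object while it is looked at

def onGazeEnd(obj):
    # Restore the object's appearance when gaze leaves it.
    obj.color(viz.WHITE)

def onGazeTime(obj):
    # Runs once the fixation threshold is exceeded.
    viz.playSound('fixation.wav')  # hypothetical sound file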

 

Spherical video and images

 

You can also use 360 videos and images (both monoscopic and stereoscopic) for your experiment instead of 3D models.


Heatmaps

 

For visualizing heatmaps, use the “heatmap” script found in the folder of the same name. From there you can swap out the environment, or use a 360 video or image, to view a heatmap overlay.

 

That’s a quick overview of setting up an eye tracking experiment using the VR Eye Tracking Analytics Lab. In addition to the main script, there are smaller scripts that highlight specific features, such as 3D data visualizations, as well as a demo tailored to using a library of 360 videos and images.

 

For more information on our Eye Tracking Analytics Lab or any WorldViz products, please contact sales@worldviz.com.

Controlling Imported Animations

The following types of animations can be exported with OSG models from 3ds Max and played in Vizard:

  • Transform: Animations based on translation, rotation, and scale
  • UV: Animations based on changing texture coordinates
  • OSG sequence: Mesh animations that cycle through a series of saved meshes, one for each frame

Read more

Underwater Effect

The example script in this article simulates the type of visual distortion that occurs in an underwater scene. It uses a custom post-process effect created using Vizard's vizfx.postprocess library and GLSL (OpenGL shading language).
Read more

Launching scripts from the command line

This article describes how to launch Vizard scripts from the command line. Running scripts from the command line allows you to automate their execution using third-party apps. It also enables you to pass arguments to your script, allowing you to change its behavior without modifying the script.
Read more

Introduction to the ARToolKit plug-in

This plug-in provides support for the ARToolKit Professional library by ARToolworks. It allows for the tracking of image and matrix based markers using a standard webcam. Vizard can then render models and combine them with video captured from the webcam. Vizard uses the position and orientation of the markers for placing these models in the AR scene. The video tutorial in this article shows a short demo of this in action and then steps through the basic code for tracking a marker and linking a model to it.
Read more

Lens Correction

Vizard can perform lens correction on the output image to account for distorted displays, such as HMDs. The video below shows a scene with and without lens correction.
Read more

Embedding resources in scripts

This article describes how to embed certain texture and model resources within your script. Using this technique allows you to distribute or move your script to different locations without needing to worry about copying any resources along with it.

Read more
