Arduino Support in Vizard

By Jeff Loomis and Sado Rabaudi

Arduino boards are normally programmed in the Arduino language, which is similar to C++. Python can be used instead, and the easiest way to do that is to have the board and Python communicate over the Firmata protocol. The pyfirmata library installs directly from the package manager. There is a lot of information on working with Arduino from Python in this article:

https://realpython.com/arduino-python/

Follow these steps to use Arduino with Vizard:

  1. Install the Arduino IDE.

  2. Upload the Firmata sketch to the Arduino board (File > Examples > Firmata > StandardFirmata), as described in the next step.

  3. The sketch will open in a new IDE window. To upload it to the Arduino:

  • Plug the USB cable into the PC.

  • Select the appropriate board (Tools > Board; for an Arduino Uno, select Arduino/Genuino Uno).

  • Select the port (Tools > Port).

  • Press Upload.

  4. Next, make sure pyFirmata is installed from the package manager: in Vizard, go to Tools > Package Manager > Search and search for “pyfirmata”.

  5. You should now be able to run pyfirmata/Python code in Vizard to communicate with the device. There are examples in the Real Python article covering communication over both digital and analog pins.

Vizard Sample Code:
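Below is a minimal sketch of controlling a digital pin from a Vizard script with pyfirmata. The COM port ('COM3') is a placeholder; use the port you selected in the Arduino IDE under Tools > Port.

import viz
import vizact
from pyfirmata import Arduino

# 'COM3' is a placeholder; use the port shown in the Arduino IDE under Tools > Port
board = Arduino('COM3')

viz.go()

def ledOn():
    # Drive digital pin 13 (the built-in LED on an Uno) high
    board.digital[13].write(1)

def ledOff():
    board.digital[13].write(0)

# Hold the spacebar to light the LED
vizact.onkeydown(' ', ledOn)
vizact.onkeyup(' ', ledOff)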

Converting videos with FFMPEG

FFMPEG is a free and open-source project for handling video and other media types. It lets you modify attributes such as resolution, file format, bit rate, and more.

FFMPEG Conversion Steps:

  1. Download FFmpeg from here: http://ffmpeg.zeranoe.com/builds

  2. Once downloaded, unzip the archive and navigate to the “bin” folder.

  3. Make sure the video file you want to convert is in the same folder as the ffmpeg.exe file (the “bin” folder).

  4. Open a command line window and change directory to the location of the FFMPEG “bin” folder by typing “cd” (without quotes), followed by a space and then the path to the folder (you may have to place it on the “C” drive).

  5. In the command window, type:

“ffmpeg -i InputFileName.mp4 -f avi -c:v wmv2 -s 4096x1536 -b:v 15000k outputFileName.avi”

where “InputFileName” is the name of the file you are converting, “outputFileName” is the name of your output file, “15000k” is the bit rate, and “4096x1536” is the resolution.
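If you prefer to run the conversion from a script (for example, to batch-convert several files), the same command can be issued from Python. This is a minimal sketch; it assumes ffmpeg.exe is on your PATH or in the current working directory, and the file names are placeholders.

import subprocess

# Placeholders: replace the file names with your own; assumes ffmpeg.exe is
# on the PATH or in the current working directory
subprocess.run([
    'ffmpeg', '-i', 'InputFileName.mp4',
    '-f', 'avi', '-c:v', 'wmv2',
    '-s', '4096x1536', '-b:v', '15000k',
    'outputFileName.avi',
], check=True)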

Connecting Vizard to BIOPAC AcqKnowledge

By Jeff Loomis

BIOPAC provides a large range of monitoring devices for detecting physical or physiological changes over time. The BIOPAC virtual reality platform provides:

  • Controlled and replicable experimental setups

  • Manipulation of the environment (and avatars) that would be impossible or prohibitively expensive in the real world

  • Synchronization of the events from the virtual world with physiological data

  • Biofeedback data transfer protocol for real-time digital communication, which allows the participant’s physiological feedback to control the VR world

Using physical data with Vizard

Vizard supports interaction with the data provided from these devices using the biopacndt.py module, a Python API for connecting to BIOPAC’s AcqKnowledge software with Network Data Transfer (NDT).

Note: biopacndt.py is included with the NDT license; contact support@biopac.com if needed.

The connection to AcqKnowledge can be made locally or over the network. As long as the Respond to auto-discovery requests option is enabled in AcqKnowledge there is no need to specify a remote machine’s IP address in Vizard.

If you have AcqKnowledge installed, the biopacndt.py file is found in the following location: C:\Program Files\BIOPAC Systems, Inc\AcqKnowledge\Network Data Transfer Examples\Python 2\source. Copy and paste the file into the Vizard Python folder (C:\Program Files\WorldViz\Vizard\python). Placing files in the Vizard Python folder ensures that every Vizard project can load them.

Software setup in AcqKnowledge

Enable the NDT Network Data Transfer protocol in AcqKnowledge in order to stream data to and from Vizard. (NDT is a licensed feature add-on.)

  1. In AcqKnowledge, go to Display > Preferences > Networking.

  2. Select the checkboxes for Enable network data transfer and Respond to auto-discovery requests.

  3. Restart the AcqKnowledge software to activate the Network Data Transfer.

This configuration only needs to be done once.

To see event markers in AcqKnowledge, make sure the events bar is visible.

You can also toggle the “Event Palette” to see a timeline of events.

Vizard to AcqKnowledge

In the code below, Vizard sends a signal to drop an event marker in AcqKnowledge:
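This is a minimal sketch, assuming biopacndt.py has been copied into the Vizard Python folder as described above. The connection call follows BIOPAC's NDT Python examples; the event-insertion method shown here (insertGlobalEvent) is an assumption, so check the NDT API reference included with your AcqKnowledge license for the exact name and signature.

import viz
import vizact
import biopacndt

# Auto-discover a running AcqKnowledge server (local or on the network)
acqServer = biopacndt.AcqNdtQuickConnect()

viz.go()

def dropMarker():
    # Assumed method name; consult the NDT API reference for the exact
    # call used to insert an event marker in your AcqKnowledge version
    acqServer.insertGlobalEvent('Stimulus shown', 'Flag', 'global')

# Drop an event marker in AcqKnowledge whenever the spacebar is pressed
vizact.onkeydown(' ', dropMarker)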

AcqKnowledge to Vizard

In the following code, Vizard prints out the data streaming from AcqKnowledge:
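Again, this is a minimal sketch following the pattern of BIOPAC's NDT Python examples; verify method names such as DeliverAllEnabledChannels and AcqNdtDataServer against the example files shipped with your license.

import viz
import biopacndt

# Connect and ask AcqKnowledge to deliver every enabled channel
acqServer = biopacndt.AcqNdtQuickConnect()
enabledChannels = acqServer.DeliverAllEnabledChannels()
singleConnectPort = acqServer.getSingleConnectionModePort()

def onData(index, frame, channelsInSlice):
    # Called for each incoming slice of samples; print the raw values
    print(index, frame)

# Receive the stream on the data server and register the print callback
dataServer = biopacndt.AcqNdtDataServer(singleConnectPort, enabledChannels)
dataServer.RegisterCallback('PrintData', onData)
dataServer.Start()

viz.go()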

How to Set Up an Eye Tracking Experiment using the WorldViz VR Eye Tracking Analytics Lab

 

This article will introduce you to the most important features of the WorldViz VR Eye Tracking Analytics Lab and show you how to modify the included template to use your own 3D assets. Once set up, you can collect eye tracking data for your 3D assets, such as the number of fixations (as well as average and total fixation time), pupil diameter, time stamps, gaze intersect visualization and position data, 3D gaze path visualizations, and more. For more information on this or any WorldViz product, contact sales@worldviz.com.

 

For a video guide of this tutorial, click here.

 

 

Getting Started and Setting Up Your Scene

 

When you first install the VR Eye Tracking Analytics Lab, you will have a folder with some template scripts you can use to build an eye tracking experiment, as well as folders to store your resources, saved recordings, and data files. We will start by loading your environment and fixation objects into Vizard’s configuration tool, “Inspector”.

 

First, open Inspector (in Vizard, under “Tools > Inspector”) and load your environment model by going to “File > Open”.

 

You can find 3D models on websites like Sketchfab, create them in 3D modeling software, or use various other sources. For more information on getting models out of Sketchfab, see our blog post on this topic.

 

If your environment model already includes your fixation objects, you can skip this step. Otherwise, go to “File > Add” and add the objects that you would like to measure eye tracking data on. You can then use the transform tools to position, rotate, and scale the object(s).

 

 

Now save this model into the “resources” folder in the VR Eyetracking Analytics Lab folder by choosing “File > Save As”, navigating to that folder, and giving it a name.

 

 

Next, copy the name of your first object by right-clicking on its name in the Scene Graph. You can also rename the object if you like by choosing “rename” from the context menu.

 

 

The next step is to open the “eye_tracker_complete.py” script, either by right-clicking it and choosing “edit” (if you have chosen Vizard as your Python editing tool), or by opening Vizard and going to “File > Open” to load the script.

 

 

Since you copied the name of your object of interest in the last step, all you need to do now is paste that name into the script where it says “gazeObject1” after “getChild”, overwriting the existing objects (the default template comes with a few objects already set up: a basketball, a soccer ball, and a volleyball). Then paste the name again where the gaze objects are labeled for the data files (line 30). If you wish to use more or fewer than the included three objects, just remove lines from, or add lines to, the gazeObject and gazeObjectsDict sections, as sketched below.
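For illustration, the edited lines might look something like this; the environment file name and object names below are hypothetical placeholders for the names you copied out of Inspector, and the template’s actual variable names may differ slightly.

import viz

# Hypothetical names for illustration; paste the names copied from Inspector
env = viz.addChild('myEnvironment.osgb')

gazeObject1 = env.getChild('paintingLeft')
gazeObject2 = env.getChild('paintingRight')

# Labels used for the gaze objects in the data files (around line 30)
gazeObjectsDict = {
    gazeObject1: 'paintingLeft',
    gazeObject2: 'paintingRight',
}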

 

 

 

Repeat this step for all the objects you wish to collect and visualize eye tracking data for (go back to Inspector, right-click to copy the name of the object, and paste the name into the gazeObject list and “gazeObjectsDict”).

 

 

Now change the “env” model name to the name of your saved environment.

 

If you wish to change the fixation threshold, you can do so on line 35, next to “threshold”. The default is set to 500 milliseconds (meaning that if you stare at an object for more than 500 milliseconds, it will register as a fixation).

 

 

And that’s it! Your eye tracking experiment is now set up to run on your eye tracker of choice, collect data, and visualize it.

 

Choosing your Hardware and Preferences

 

To choose your hardware, open the “config” file in Vizard and set the device name to the Vive Pro Eye, Pupil Labs, Tobii VR (for the original Tobii Vive integration), or a desktop mode that simulates an eye tracker (if you don’t have access to one or wish to prototype on a desktop). You can also choose in the config file whether you want to save a video recording of your experiment, as sketched below.
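As an illustration, the hardware and recording settings in the config file look something like the following; the variable names and accepted values here are placeholders, so match them to the config file shipped with the lab.

# Hypothetical setting names for illustration; use the names that appear
# in the config file shipped with the Eye Tracking Analytics Lab
eye_tracker = 'VIVE_PRO_EYE'    # or 'PUPIL_LABS', 'TOBII_VR', 'DESKTOP'
save_video = True               # save a video recording of each session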

 

 

Running the Experiment

 

To run the experiment, just click the green arrow in the Vizard interface, press the “F5” key, or go to “Script > Run”. When the experiment first opens, the user is prompted to enter their name and ID number and click “submit” (this makes it easier to identify each user’s data; if you wish to skip this step, just click “submit”).

 

 

Put on the headset and press the spacebar to begin recording.

 

 

As your experiment runs, you will see a gaze intersect point on the mirrored desktop view. If you wish the participant to see it as well, you can toggle it on with the “P” key; by default it is shown only on the mirrored view. The mirrored view also displays the elapsed time of the experiment and the number of fixations, and you can easily add more information to this view.

 

 

Analyzing the Data

 

When the experiment has finished, you can view and analyze the data. Here are a few of the ways:

 

3D gaze path: This can be viewed on the desktop or in a headset (so you can walk around and see it in 3D).

 

 

Bar chart of fixation views per object, total view time, and average fixation time.

 

Experiment data text file showing a timeline of when each object was fixated on, as well as a summary of fixations.

 

 

Tracking data showing a timestamp along with the x, y, z position of the gaze intersect point and the pupil diameter. This can also be changed to a .csv file so you can use a spreadsheet editor to calculate things like average pupil diameter.

 

 

 

Video Recording Playback

 

After you’ve run your experiment there will be a video recording saved in the “recordings” folder (if you’ve selected this option in the config file). This will allow you to review the experiment while seeing the gaze intersect point and a time stamp. Using this, you can do such things as match what a user was looking at in the scene with a metric such as pupil diameter.

 

In addition, you can add your own custom events (such as gaze-based interactions) by placing custom code inside the following functions, as sketched below:

  • onGazeBegin, for your event to be triggered as soon as a gazeObject is seen.

  • onGazeEnd, to trigger something to happen when a user’s gaze point is no longer intersecting with that object.

  • onGazeTime, to trigger something that happens only when an object is in fixation.
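For example, here is a sketch of what those callbacks could contain. The function names come from the template, but the exact arguments they receive are an assumption here, so check their signatures in eye_tracker_complete.py.

import viz

def onGazeBegin(gazeObject):
    # Runs as soon as the gaze intersect first hits a gaze object
    print('Gaze began')

def onGazeEnd(gazeObject):
    # Runs when the gaze point is no longer intersecting the object
    print('Gaze ended')

def onGazeTime(gazeObject):
    # Runs once the object has been fixated past the threshold;
    # an example gaze-based interaction: tint the object red
    gazeObject.color(viz.RED)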

 

Spherical video and images

 

You can also use 360 videos and images for your experiment (both monoscopic and stereoscopic) instead of 3D models.

 

 

Heatmaps

 

For visualizing heatmaps, you need to use the “heatmap” script found in the folder of the same name. From here you can swap out the environment, or use a 360 video or image, to view a heatmap overlay.

 

So that’s a quick overview of setting up an eye tracking experiment using the VR Eye Tracking Analytics Lab. In addition to the main script, there are smaller scripts that highlight specific features, such as 3D data visualizations, as well as a demo tailored to using a library of 360 videos and images.

 

For more information on our Eye Tracking Analytics Lab or any WorldViz products, please contact sales@worldviz.com.

PyQt

PyQt is a Python wrapper around the cross-platform Qt application framework. This post describes how to install PyQt with Vizard. It also includes a sample script showing how to embed the Vizard graphics window inside a PyQt application.

Using Excel Files in Vizard

With the xlrd and xlwt Python add-on libraries you can easily read and write Excel files (.xls) directly from Vizard. For complete documentation and examples of these libraries, go to http://www.python-excel.org/.
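For example, here is a minimal sketch of writing an .xls file and reading it back (the file, sheet, and column names are arbitrary):

import xlrd
import xlwt

# Write a small spreadsheet with one header row and one data row
workbook = xlwt.Workbook()
sheet = workbook.add_sheet('Results')
sheet.write(0, 0, 'trial')
sheet.write(0, 1, 'response time')
sheet.write(1, 0, 1)
sheet.write(1, 1, 0.42)
workbook.save('results.xls')

# Read the value back
book = xlrd.open_workbook('results.xls')
sheet = book.sheet_by_index(0)
print(sheet.cell_value(1, 1))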

Voice recognition

This article describes how to perform voice recognition in Python using the 3rd party pywin32 library along with the Microsoft Speech SDK.

NetBeans IDE + Vizard

NetBeans IDE is a free cross-platform IDE that supports many programming languages, including Python. This article describes how to configure NetBeans IDE to run Vizard scripts.

Wing IDE + Vizard

Wing IDE is a commercial cross-platform Python IDE. This article describes how to configure Wing IDE to run Vizard scripts.

Komodo IDE + Vizard

Komodo is a cross-platform IDE that supports Python. It comes in a free and commercial version. This article describes how to configure Komodo to run Vizard scripts.
