Engine

Unreal Engine 4

Project Type

School/Personal Project

Duration

10 months

Team Size

4 people

Current status

No longer in development

Hello VR

*Note: I currently have very few visuals from this project, as I worked on a computer provided by the University of Waterloo Games Institute and cannot open the project on my home computer at this time.

This project is a Virtual Reality (VR) communication system built for my capstone design project during my final year at Waterloo.

This project is similar to Facebook Spaces, albeit with more sensors, so more of the user's body language is captured. We started work on this project about 5 months before Spaces was announced (much to our pleasure, we no longer had to persuade people this was a promising idea!).

My contributions to the project included:

  • Driving the character model animation from the sensor data (including cleaning up the data), as well as the animations played in the absence of sensor data (e.g. when the hands stopped tracking, eye movement, etc.); a sketch of this follows the list
  • Adding interaction with objects in the world, such as picking things up
  • Part of the networking code
  • Game logic required for setting up the animation with a new user
  • Creating the avatars, environment, etc.
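
To illustrate the tracking-driven animation with a fallback when tracking drops out, here is a minimal sketch written against UE4's UAnimInstance API. The class and property names (UHelloVRAnimInstance, TrackedHandTransform, IdleHandTransform, bHandTracked) are hypothetical and the blend is a simplification, not the project's actual code.

```cpp
// Hypothetical sketch: drive a hand from sensor data and blend toward an
// idle pose whenever tracking is lost, so the avatar never snaps or freezes.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "HelloVRAnimInstance.generated.h"

UCLASS()
class UHelloVRAnimInstance : public UAnimInstance
{
	GENERATED_BODY()

public:
	// Latest hand pose reported by the tracking hardware (component space).
	UPROPERTY(BlueprintReadWrite, Category = "Tracking")
	FTransform TrackedHandTransform;

	// True while the sensor reports a valid hand pose.
	UPROPERTY(BlueprintReadWrite, Category = "Tracking")
	bool bHandTracked = false;

	// Pose the hand drifts toward when tracking is lost (e.g. a relaxed idle pose).
	UPROPERTY(EditAnywhere, Category = "Tracking")
	FTransform IdleHandTransform;

	// Pose actually consumed by the AnimGraph each frame.
	UPROPERTY(BlueprintReadOnly, Category = "Tracking")
	FTransform EffectiveHandTransform;

protected:
	virtual void NativeUpdateAnimation(float DeltaSeconds) override
	{
		Super::NativeUpdateAnimation(DeltaSeconds);

		// Follow the sensor pose while it is valid; otherwise ease toward idle.
		const FTransform Target = bHandTracked ? TrackedHandTransform : IdleHandTransform;
		const FTransform Current = EffectiveHandTransform;
		EffectiveHandTransform.Blend(Current, Target,
		                             FMath::Clamp(DeltaSeconds * 10.f, 0.f, 1.f));
	}
};
```

In the real project the same idea would apply per tracked body part (hands, head, eyes), with the cleaned-up sensor data feeding the tracked pose.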

The majority of the design, research, and other ‘academic’ work was completed during the first 4 months of the project, while the actual creation of the software was completed during the final 2-3 months.

Experiment on measuring one’s sense of self

Recently, I worked with a professor at the Games Institute at uWaterloo to retrofit Hello VR for use as a tool in an experiment testing ‘one’s sense of self’. While my whole capstone design team contributed to the original project, I performed all of the changes required to adapt it for this experiment.

The setup for the experiment was fairly simple. First, the subject created an avatar using Adobe Fuse. The experimenters then created a gender-swapped version of the subject’s avatar, rigged both models using Mixamo, and imported them into the project. After a brief calibration phase, which the software ran automatically, the subject was presented with two models and had to specify which model they ‘most associated with’. All models fully tracked the user’s motion, including finger motion. There were two possible models (normal and gender-swapped) and two possible animations (fully tracked, and fully tracked with an idle animation added to the subject’s motion), and each trial used one permutation of these possibilities.
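
For clarity, the 2 x 2 condition space could be written out roughly as follows; the type and field names are assumptions for illustration, not taken from the experiment's code.

```cpp
// Hypothetical sketch of the four trial conditions (model x animation).
enum class EAvatarModel  { Normal, GenderSwapped };
enum class EAvatarMotion { FullyTracked, TrackedPlusIdle };

struct FTrialCondition
{
	EAvatarModel  Model;
	EAvatarMotion Motion;
};

// Each trial presents one of these permutations.
static const FTrialCondition AllConditions[] =
{
	{ EAvatarModel::Normal,        EAvatarMotion::FullyTracked    },
	{ EAvatarModel::Normal,        EAvatarMotion::TrackedPlusIdle },
	{ EAvatarModel::GenderSwapped, EAvatarMotion::FullyTracked    },
	{ EAvatarModel::GenderSwapped, EAvatarMotion::TrackedPlusIdle },
};
```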

I performed the following work to transition Hello VR for the experiment:

  • Created experiment game logic that lets the experiment run itself and export results with minimal input from the experimenters (a sketch of the export and localisation follows this list)
  • Simplified the character model import process
  • Performed backend work to allow Korean localisation of all text
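
As a rough illustration of the self-running experiment logic and the localisation work, the sketch below shows one way to append trial results to a CSV file and route user-facing text through UE4's FText localisation; the file path, field names, and string-table namespace are assumptions, not the project's actual implementation.

```cpp
// Hypothetical sketch: record results without experimenter bookkeeping and
// keep all on-screen text localisable (e.g. for the Korean translation).
#include "CoreMinimal.h"
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"
#include "HAL/FileManager.h"

// User-facing strings go through FText/NSLOCTEXT so a Korean translation can
// be supplied via UE4's localisation pipeline instead of hard-coded literals.
static const FText PromptText =
	NSLOCTEXT("HelloVRExperiment", "WhichAvatarPrompt",
	          "Which avatar do you most associate with?");

// Append one trial's outcome to a CSV file in the project's Saved directory.
void AppendTrialResult(const FString& SubjectId, int32 TrialIndex,
                       const FString& Condition, const FString& Choice)
{
	const FString Line = FString::Printf(TEXT("%s,%d,%s,%s\n"),
		*SubjectId, TrialIndex, *Condition, *Choice);
	const FString Path = FPaths::Combine(FPaths::ProjectSavedDir(),
		TEXT("ExperimentResults.csv"));
	FFileHelper::SaveStringToFile(Line, *Path,
		FFileHelper::EEncodingOptions::AutoDetect,
		&IFileManager::Get(), FILEWRITE_Append);
}
```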

The initial experiment was run in August 2017, and the results are expected to be published in a scientific journal down the road.