
Paper accepted to ICAT-EGVE!

Fei Wu will present the paper Comparison of Audio and Visual Cues to Support Remote Guidance in Immersive Environments at ICAT-EGVE 2020! Special congratulations to co-author Shreyas Chinnola, who was a high school student when this research was conducted. The conference will be held virtually on December 2-4.

Two papers accepted to ACM VRST!

The lab has two full papers accepted at ACM VRST 2020! Jerald Thomas will present the paper Towards Physically Interactive Virtual Environments: Reactive Alignment with Redirected Walking. Evan Suma Rosenberg will present Capture to Rendering Pipeline for Generating Dynamically Relightable Virtual Objects with Handheld RGB-D Cameras. The conference will be held virtually.

NSF grant awarded!

We have been awarded a $1.1 million grant from the National Science Foundation for the prediction, early detection, and mitigation of virtual reality motion sickness! This project is an interdisciplinary collaboration with Victoria Interrante (Computer Science & Engineering) and Thomas Stoffregen (Kinesiology).

Five submissions accepted to IEEE VR!

The Illusioneering Lab has five submissions accepted at IEEE VR 2019, the premier academic conference on virtual reality and 3D user interfaces! Jerald Thomas will present the paper A General Reactive Algorithm for Redirected Walking using Artificial Potential Functions. Courtney Hutton will present Augmented Reality Interfaces for Semi-Autonomous Drones.