With the help of Kinect sensors, an Oculus Rift, and a great deal of experimenting, two individuals found themselves having a strange out-of-body virtual reality experience.
We chatted with creator Tobias Gremmler, a visiting Associate Professor in the School of Creative Media at City University of Hong Kong, about his latest experiment. Alongside student Adam Zeke (pictured), he set out to explore new interaction scenarios that go beyond the typical game experience.
The latest video uploaded by Gremmler shows one of many sessions in which a real-world environment is translated into a virtual point cloud (using two Kinect sensors) and viewed through an Oculus Rift VR headset. The red and white point clouds in the video distinguish the two Kinect sensors.
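To get a feel for what that setup involves, here is a minimal sketch of merging two sensors' point clouds into one scene, tagging each point with a color so its source sensor stays visible. The function names and data layout are our own assumptions for illustration, not the team's actual code:

```python
# Hypothetical sketch: combining point clouds from two depth sensors
# into a single colored cloud. Each point is (x, y, z); the color tag
# shows which sensor it came from, as in the red/white clouds in the video.

RED = (255, 0, 0)
WHITE = (255, 255, 255)

def merge_point_clouds(points_sensor_a, points_sensor_b):
    """Combine two lists of (x, y, z) points into one color-tagged cloud."""
    cloud = [(x, y, z, RED) for (x, y, z) in points_sensor_a]
    cloud += [(x, y, z, WHITE) for (x, y, z) in points_sensor_b]
    return cloud

# Example: one point from each sensor
cloud = merge_point_clouds([(0.0, 1.0, 2.0)], [(0.5, 1.5, 2.5)])
```

In a real pipeline the two sensors would also need to be calibrated into a shared coordinate frame before merging; this sketch skips that step.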
Observing Your Own Body Walking Away
The most interesting part of the experiment occurs around the 1:00 minute mark, when Zeke moves outside the range of the Oculus headset's positional tracker and his out-of-body experience begins. The virtual camera that feeds visuals to the Oculus remains fixed at his last tracked position (while rotation tracking continues to work). As Zeke keeps moving forward in real space, his VR point of view stays put, and he observes his own body walking away from himself.
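The behavior described above can be modeled very simply: when positional tracking drops out, the camera holds its last known position while orientation keeps updating. This is a rough illustrative sketch, not the Oculus SDK's actual logic:

```python
# Simplified model of the out-of-body effect: out of tracker range,
# the virtual camera's position freezes at its last known value, but
# rotation (driven by the headset's inertial sensors) keeps updating.
# All names here are illustrative assumptions.

def update_camera(camera, tracked_position, tracked_rotation, in_range):
    """Return the new (position, rotation_degrees) for the VR camera."""
    position, _ = camera
    if in_range:
        position = tracked_position  # normal case: camera follows the user
    # Orientation tracking still works outside positional range,
    # so rotation always updates.
    return (position, tracked_rotation)

cam = ((0.0, 0.0, 0.0), 0.0)
# In range: camera follows the user's position.
cam = update_camera(cam, (1.0, 0.0, 0.0), 15.0, in_range=True)
# User walks out of range: position stays frozen, rotation still updates.
cam = update_camera(cam, (2.0, 0.0, 0.0), 30.0, in_range=False)
```

After the second call the camera sits at the last tracked position while the user's real body has moved on, which is exactly the mismatch that produces the out-of-body view.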
Gremmler explains that they have reconstructed this situation several times and that “it is quite an intense experience, especially if one was wearing the headset for a longer time and gets used to the virtual representation.”
Perceiving Visual Information
Gremmler explains to us that they “wanted to explore how visual perception and cognition adapts to such situations, especially if the geometrical and spatial features of the virtual environment and the self-representation reach a certain degree of abstraction.”
Ordinarily we perceive visual information in more detail the closer we get to it, but in this experiment the environment is first captured by digital sensors and then delivered to the headset. The perceived detail and digital resolution therefore depend on the point of view of the environmental sensors rather than on our eyes.
Gremmler is working on a number of additional experiments that include translating bodies into abstract graphics and virtual representations of real-world space.
We will continue to follow his experimentation in VR, and you can also track his progress at http://www.syncon-d.com