Tracking in VR: what is it, why is it needed, and what variants are there?
Keeping track of motion in the physical world is a crucial part of any VR system. Tracking was one of the largest obstacles to bringing VR headsets into consumer electronics, and it will remain a major challenge due to our desire to expand and improve VR experiences. Highly accurate tracking methods have been mostly enabled by commodity hardware components, such as inertial measurement units (IMUs) and cameras, that have plummeted in size and cost due to the smartphone industry. Three categories of tracking may appear in VR systems, based on what is being tracked:
1. The user’s sense organs:
If a display is attached to a sense organ, and it should be perceived in VR as being attached to the surrounding world, then the position and orientation of the organ need to be tracked. The inverse of the tracked transformation is applied to the stimulus to correctly “undo” these DOFs. Most of the focus is on head tracking, which is sufficient for the visual and aural components of VR; however, the visual system may further require eye tracking if the rendering and display technology must compensate for eye movements.
DOF = Degrees of Freedom
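To make the "undoing" concrete, here is a minimal sketch (in Python with NumPy; the function names and the yaw-only rotation are illustrative simplifications) of how inverting the tracked head pose yields the view transform that keeps virtual objects fixed in the world:

```python
import numpy as np

def head_pose(yaw_rad, position):
    """4x4 homogeneous transform of the head in the tracked (world) frame.
    Only yaw is modeled here for brevity; a real system tracks all 6 DOFs."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    T = np.eye(4)
    T[:3, :3] = [[c, 0, s], [0, 1, 0], [-s, 0, c]]  # rotation about the up axis
    T[:3, 3] = position
    return T

def view_matrix(pose):
    """Inverse of the tracked head pose: applied to world-frame geometry,
    it cancels the head motion so virtual objects stay world-fixed."""
    return np.linalg.inv(pose)

# A point 2 m in front of the user's starting position...
p_world = np.array([0.0, 0.0, -2.0, 1.0])
# ...seen after the head turns 90 degrees and steps 0.5 m to the right:
pose = head_pose(np.deg2rad(90), position=[0.5, 0.0, 0.0])
p_eye = view_matrix(pose) @ p_world  # point expressed relative to the eye
```

Composing the view matrix with the pose gives the identity, which is exactly the "undo" property described above.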
2. The user’s other body parts:
If users would like to see a compelling representation of their body in the virtual world, then its motion should be tracked so that it can be reproduced in the matched zone. Perhaps facial expressions or hand gestures are needed for interaction. Although perfect matching is ideal for tracking sense organs, it is not required for tracking other body parts. Small movements in the real world could be converted into larger virtual-world motions so that the user exerts less energy. In the limiting case, the user could simply press a button to change the body configuration: for example, grasping an object in the virtual hand with a single click.
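The motion-scaling idea can be sketched in a few lines; the function name and the `gain` and `anchor` parameters are hypothetical illustrations, not part of any particular VR API:

```python
def amplify_hand_motion(real_delta, gain=2.5, anchor=(0.0, 0.0, 0.0)):
    """Map a small real-world hand displacement (metres, as an (x, y, z)
    tuple) to a larger virtual displacement around an anchor point.
    gain > 1 means the virtual hand moves farther than the real one,
    so the user exerts less physical effort."""
    return tuple(a + gain * d for a, d in zip(anchor, real_delta))

# A 10 cm real movement becomes a 25 cm virtual movement at the default gain.
virtual_pos = amplify_hand_motion((0.1, 0.0, 0.0))
```

A gain of 1 reproduces motion exactly, as needed for sense organs; the button-press case is the limit of infinite gain, where no tracked motion is required at all.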
3. The rest of the environment:
In the real world that surrounds the user, physical objects may be tracked. For objects that exist in the physical world but not the virtual world, the system might alert the user to their presence for safety reasons. Imagine that the user is about to hit a wall, or trip over a toddler. In some VR applications, the tracked physical objects may be matched in VR so that the user receives touch feedback while interacting with them. In other applications, such as telepresence, a large part of the physical world could be “brought into” the virtual world through live capture.
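A safety alert of this kind reduces to a proximity check between the tracked user and known obstacles. Here is a minimal 2D floor-plane sketch, loosely in the spirit of chaperone/guardian boundary systems; the function names and the 0.4 m margin are illustrative assumptions:

```python
import math

def distance_to_segment(p, a, b):
    """Shortest distance from point p to the wall segment a-b
    (all points are 2D floor-plane coordinates in metres)."""
    ax, ay = a; bx, by = b; px, py = p
    abx, aby = bx - ax, by - ay
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / (abx**2 + aby**2)))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def should_warn(user_pos, walls, margin=0.4):
    """Trigger a warning when the tracked user is within 'margin' metres
    of any wall segment in the physical room."""
    return any(distance_to_segment(user_pos, a, b) < margin for a, b in walls)
```

Real systems add prediction (warning based on velocity, not just position) and render the boundary in the headset, but the core test is this distance check.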
There are also two tracking variants for determining the user's position in space:
External VR tracking:
With external tracking (also called outside-in tracking), additional devices are set up, alongside the VR headset and a PC, that “observe” movements in the tracking area. The base stations of the Valve Index, for example, can be mounted on the wall or on stands. This system is known as SteamVR tracking (formerly Lighthouse) and was developed by Steam operator Valve.
The base stations sweep infrared laser beams across the room at millisecond intervals. These class 1 lasers are safe for humans. The beams hit photodiode sensors on the VR headset and the motion controllers. From the timing of when the sweeping beams reach each sensor, the computer determines the exact position of the headset and controllers, as well as their movements.
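The timing-to-angle step can be illustrated with a toy model. The rotation rate and function below are assumptions for illustration only, not Valve's actual specification:

```python
import math

# Toy model of one lighthouse sweep: a rotor spinning at roughly 60 Hz
# sweeps a laser plane across the room once per rotation.
ROTATION_PERIOD = 1 / 60.0  # seconds per rotation (illustrative value)

def sweep_angle(t_sync, t_hit):
    """Angle of a sensor as seen from the base station, recovered from the
    time between the sync pulse (start of rotation) and the moment the
    sweeping laser hits that sensor."""
    return 2 * math.pi * (t_hit - t_sync) / ROTATION_PERIOD

# A sensor hit a quarter-rotation after the sync pulse sits at 90 degrees.
angle = sweep_angle(0.0, ROTATION_PERIOD / 4)
```

With horizontal and vertical sweeps, each sensor yields two angles, i.e. a ray from the base station; given several sensors at known positions on the headset, the full 6-DOF pose can then be solved for.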
The biggest disadvantage of this tracking system is the relatively involved installation: the base stations must be mounted at suitable points in the room and each supplied with power individually.
Internal VR tracking:
With this variant, also known as inside-out tracking, the tracking is built directly into the VR headset; external devices are not necessary. With the Oculus Quest 2, for example, tracking takes place via cameras in the headset, which track the user's position and movements relative to the environment.
The greatest advantages of internal tracking are ease of use and high mobility: the Oculus Quest 2, for example, can be used in any room and, if the lighting conditions are right, even outdoors.
On the other hand, depending on the system, tracking interruptions can occur if a controller is moved outside the cameras' field of view (for example, behind the back), or if the environment is too bright or too dark.