Using Algorithms to Achieve Frame Alignment in AR/VR
September 04, 2019
Blog
Augmented reality and virtual reality (AR/VR) systems are tasked with creating virtual yet realistic environments to be interacted with, whether that’s through a smartphone screen or a 3D headset. How the user experiences movement within these environments also needs to mirror their movements in the physical world as closely as possible.
Degrees of Freedom
Before we get into how to mirror real life in VR, let's quickly review a couple of core concepts. Degrees of Freedom (DoF) refers to the number of independent ways an object can move in 3D space. An IMU (inertial measurement unit) can provide between 2 and 6 DoF. Three DoF, commonly realized as a 360-degree video viewed from a fixed point, sits midway between a flat 2D image and a fully immersive 3D experience (6 DoF). In 3 DoF, the viewer can rotate to look around the space but cannot move through it.
For a true-to-life user experience, AR/VR systems need to achieve 6 DoF, which layers linear positioning on top of rotation and frees viewers from a fixed location. Users can then move forwards and backwards, up and down, and left and right in 3D space, providing the more natural and realistic experience that equipment manufacturers are aiming for.
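To make the distinction concrete, here is a minimal Python sketch (the class names and quaternion convention are our own, purely for illustration) of how a 3 DoF pose differs from a 6 DoF pose:

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class Pose3DoF:
    """Orientation only: the viewer can look around but not move."""
    # Unit quaternion (w, x, y, z); identity means "looking straight ahead."
    orientation: np.ndarray = field(
        default_factory=lambda: np.array([1.0, 0.0, 0.0, 0.0]))


@dataclass
class Pose6DoF(Pose3DoF):
    """Adds linear position: the viewer can also translate through space."""
    # Position (x, y, z) in meters relative to a world origin.
    position: np.ndarray = field(default_factory=lambda: np.zeros(3))
```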
Integrating Handheld Controllers in AR/VR
Handheld remote controls are often used in AR/VR games or systems as a way to control a virtual element, such as a sword or a steering wheel. They are also used to represent the user’s hand within the virtual setting.
Achieving 6 DoF with a handheld motion controller requires capturing the remote's linear and angular position in real time. The 3D angular position is derived from the remote's orientation and angular movement about the yaw, pitch, and roll axes. The 3D linear position refers to the remote's specific location in 3D space.
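To illustrate one common convention (an assumption on our part, not a requirement of any particular system), the three rotation angles can be packed into a unit quaternion representing the remote's 3D angular position:

```python
import numpy as np


def euler_to_quaternion(yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Convert yaw (about Z), pitch (about Y), and roll (about X) angles
    in radians to a unit quaternion (w, x, y, z), Z-Y-X convention."""
    cy, sy = np.cos(yaw / 2), np.sin(yaw / 2)
    cp, sp = np.cos(pitch / 2), np.sin(pitch / 2)
    cr, sr = np.cos(roll / 2), np.sin(roll / 2)
    return np.array([
        cr * cp * cy + sr * sp * sy,  # w
        sr * cp * cy - cr * sp * sy,  # x
        cr * sp * cy + sr * cp * sy,  # y
        cr * cp * sy - sr * sp * cy,  # z
    ])
```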
To track 3D angular position, an inertial measurement unit (IMU) is commonly used. The IMU can be paired with a fixed camera that determines the linear position of the remote. Because two separate components are being used to track position, each frame of measurement produced by the IMU needs to be aligned with the corresponding frame of measurement produced by the camera in real time. If the frames are misaligned, if the data arrive with high latency, or if the sensors' outputs are not delivered at a high enough rate, the result is visual lag that degrades the user experience.
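As a rough sketch of what temporal alignment can look like (a generic interpolation approach, not any specific product's pipeline), the high-rate IMU stream can be resampled at each camera frame's timestamp:

```python
import numpy as np


def align_imu_to_camera(imu_t: np.ndarray, imu_gyro: np.ndarray,
                        cam_t: np.ndarray) -> np.ndarray:
    """Linearly interpolate IMU gyro samples (shape N x 3, with increasing
    timestamps imu_t) onto camera frame timestamps cam_t, so every camera
    frame has a matching angular-rate measurement."""
    return np.stack(
        [np.interp(cam_t, imu_t, imu_gyro[:, axis]) for axis in range(3)],
        axis=1)
```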
Maintaining Alignment
Maintaining alignment can be an expensive proposition. One common method is point detection, which relies on identifying key markers to use as reference points: an external camera identifies points in a predetermined array to track an object and maintain alignment. This approach generally requires multiple high-quality IR (infrared) cameras to track the array of points on the headset and controllers. Using external cameras to determine the position and orientation of the devices is referred to as “outside-in” tracking; using a camera on the headset to look outward at the controllers and the world is known as “inside-out” tracking.
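For a sense of the math behind point detection, the sketch below uses OpenCV's solvePnP to recover a device's pose from a known marker array; the marker layout, pixel detections, and camera intrinsics are made-up values for illustration only:

```python
import cv2
import numpy as np

# Known 3D positions of the markers on the device, in the device's own
# coordinate frame (meters). Illustrative values only.
marker_points_3d = np.array([[0.00, 0.00, 0.00],
                             [0.05, 0.00, 0.00],
                             [0.00, 0.05, 0.00],
                             [0.05, 0.05, 0.01]], dtype=np.float64)

# 2D pixel locations where the external camera detected those markers.
detected_points_2d = np.array([[320.0, 240.0],
                               [380.0, 242.0],
                               [322.0, 180.0],
                               [381.0, 178.0]], dtype=np.float64)

# Assumed pinhole intrinsics: focal length 600 px, principal point (320, 240).
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Solve the Perspective-n-Point problem: rvec and tvec give the device's
# orientation (as a rotation vector) and position in the camera's frame.
ok, rvec, tvec = cv2.solvePnP(marker_points_3d, detected_points_2d, K, None)
```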
There is a cheaper alternative for aligning controllers in the virtual space, one that uses a visible-spectrum camera to measure linear position and deliver a 6 DoF experience. You don't need expensive cameras or complex arrays of points for linear positioning; a single, simple sphere is enough to determine linear position (you may have seen the ping-pong-ball-like orbs on certain VR handheld motion controllers that rely on this inside-out method).
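To see why a single sphere is enough for linear position: under a pinhole camera model, a sphere of known physical radius projects to a circle whose apparent radius encodes depth. A simplified sketch (assumed intrinsics, ignoring the slight perspective distortion of the sphere's outline):

```python
import numpy as np


def sphere_position(u: float, v: float, r_px: float,
                    f: float, cx: float, cy: float,
                    sphere_radius_m: float = 0.02) -> np.ndarray:
    """Estimate a tracking sphere's 3D position from its detected image
    circle: center (u, v) and radius r_px in pixels, given focal length f
    (pixels), principal point (cx, cy), and the sphere's known physical
    radius in meters. Depth follows from similar triangles: Z = f * R / r."""
    z = f * sphere_radius_m / r_px  # depth from apparent size
    x = (u - cx) * z / f            # back-project the pixel to 3D
    y = (v - cy) * z / f
    return np.array([x, y, z])
```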
From there, you can use sensor fusion algorithms to get the frame alignment right. Using a simple visible-spectrum camera and some creative math, a full 6 DoF experience can be achieved at a lower overall cost. Read more about our methodology in our technical paper, “Camera and IMU Frame Alignment in AR/VR Systems for Different Gyroscope Specifications.”
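As one illustration of the kind of fusion involved (a generic complementary filter, not necessarily the algorithm described in the paper), high-rate IMU dead reckoning can be blended with the camera's lower-rate but drift-free position fixes:

```python
from typing import Optional, Tuple

import numpy as np


def fuse_position(pos: np.ndarray, vel: np.ndarray, imu_accel: np.ndarray,
                  dt: float, cam_pos: Optional[np.ndarray] = None,
                  alpha: float = 0.98) -> Tuple[np.ndarray, np.ndarray]:
    """One fusion step: dead-reckon position from IMU acceleration at the
    IMU's high rate, then, whenever a camera frame arrives, pull the
    estimate toward the camera's absolute position. alpha sets how much
    the IMU is trusted relative to the camera."""
    vel = vel + imu_accel * dt       # integrate acceleration -> velocity
    pos = pos + vel * dt             # integrate velocity -> position
    if cam_pos is not None:          # camera fix available this step
        pos = alpha * pos + (1.0 - alpha) * cam_pos
    return pos, vel
```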