Snapdragon Spaces Brings Lightweight Headworn Augmented Reality to Mobile Apps
September 19, 2023
Sponsored Story
Smartphones evolve on a regular cadence, with new features arriving in every generation. Few of those features, however, can genuinely be called revolutionary.
Lightweight headworn AR glasses powered by smartphones definitely fit the revolutionary bill, and they are on the way. It’s no secret that they offer an extremely immersive user experience; if you’ve ever tried on a pair, you’d certainly agree. A key distinction between AR glasses and VR headsets is that AR glasses let users see their actual surroundings overlaid with a digital (augmented) view, rather than immersing them in a completely virtual experience.
Start With a Proven Platform
To assist anyone developing apps for lightweight AR glasses, Qualcomm Technologies, Inc. offers the Snapdragon Spaces™ XR Developer Platform. According to the company, Snapdragon Spaces is an end-to-end platform leveraging OpenXR with a rich SDK and a range of supported XR devices for building Android-based XR experiences.
Snapdragon Spaces offers a devkit consisting of Lenovo ThinkReality A3 smartglasses paired with a Motorola edge+ smartphone.
The Snapdragon Spaces SDK is free and available for the popular Unity and Unreal game engines. Those engines, along with the developer platform, work hand in hand with the Snapdragon Spaces Services APK, which provides the AR runtime on Android devices.
These tools, found in the developer platform, combine a content-driven workflow with developer-oriented programming capabilities. They are also based on OpenXR, which ensures that API calls work across compatible devices with minimal porting. Using the Unity and Unreal OpenXR plugins, developers can build experiences that target OpenXR-compliant devices, while hardware manufacturers can build compatible devices whose capabilities fulfill the OpenXR spec.
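Because everything routes through OpenXR, an app can ask the installed runtime what it supports at startup. Here is a minimal C sketch using the core OpenXR call xrEnumerateInstanceExtensionProperties; it assumes only the OpenXR loader and headers, not any Snapdragon Spaces-specific API:

```c
#include <openxr/openxr.h>
#include <stdio.h>
#include <stdlib.h>

// List the OpenXR extensions the installed runtime supports. This is
// how a portable app decides at startup which optional features
// (hand tracking, plane detection, etc.) it can enable on this device.
int main(void)
{
    uint32_t count = 0;
    // First call: ask only for the number of extensions.
    if (XR_FAILED(xrEnumerateInstanceExtensionProperties(NULL, 0, &count, NULL)))
        return 1;

    XrExtensionProperties *props = calloc(count, sizeof(*props));
    for (uint32_t i = 0; i < count; i++)
        props[i].type = XR_TYPE_EXTENSION_PROPERTIES;

    // Second call: fill in the extension names and versions.
    if (XR_SUCCEEDED(xrEnumerateInstanceExtensionProperties(NULL, count,
                                                            &count, props))) {
        for (uint32_t i = 0; i < count; i++)
            printf("%s (v%u)\n", props[i].extensionName,
                   props[i].extensionVersion);
    }
    free(props);
    return 0;
}
```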
Of course, the hardware and platform are only part of the equation. To build truly great, immersive AR experiences, developers need to be able to implement common AR features and functionality, and that’s where Snapdragon Spaces shines.
Positional Tracking with 6DoF
Most developers will agree that positional tracking is the most important feature of an AR application: it estimates the position and orientation of the user’s viewing device within 3D space and plays a key role in mapping the user’s environment. That information is needed to track the user’s position relative to the world and to render AR content in the scene relative to the user’s head position and orientation. While this sounds like a complex feature, and it is, Snapdragon Spaces does most of the work for you by delivering the device pose as 6DoF (six degrees of freedom: position plus orientation) tracking data from the head-worn AR device.
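Since Snapdragon Spaces builds on OpenXR, a head pose like this surfaces through the standard OpenXR spaces API. A minimal sketch, assuming an XrSession is already running with a VIEW (head) space and a LOCAL (world-origin) space created during setup:

```c
#include <openxr/openxr.h>
#include <stdio.h>

// Minimal sketch: query the headset's 6DoF pose (position + orientation)
// once per frame. Assumes viewSpace (the head) and localSpace (the world
// origin) were created earlier with xrCreateReferenceSpace.
void poll_head_pose(XrSpace viewSpace, XrSpace localSpace, XrTime frameTime)
{
    XrSpaceLocation location = { .type = XR_TYPE_SPACE_LOCATION };

    // Locate the head (VIEW) space relative to the world (LOCAL) space.
    if (XR_FAILED(xrLocateSpace(viewSpace, localSpace, frameTime, &location)))
        return;

    // 6DoF = 3 positional + 3 rotational degrees of freedom; check that
    // the runtime is actively tracking both before using the pose.
    XrSpaceLocationFlags required =
        XR_SPACE_LOCATION_POSITION_TRACKED_BIT |
        XR_SPACE_LOCATION_ORIENTATION_TRACKED_BIT;

    if ((location.locationFlags & required) == required) {
        XrPosef pose = location.pose;
        // Render AR content relative to this head pose.
        printf("head at (%.2f, %.2f, %.2f)\n",
               pose.position.x, pose.position.y, pose.position.z);
    }
}
```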
Cast a Ray
Scene understanding and spatial mapping involve several techniques. Plane Detection lets the app identify flat surfaces such as walls and tabletops. Developers can also employ image recognition and tracking to identify visual features or markers captured by a camera mounted on the glasses. And ray casting (often called hit testing), which determines where a ray intersects objects in the mapped scene, can identify interactions with objects within the scanned area.
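To make the ray-casting step concrete: once a plane has been detected, a cast against it reduces to a ray-plane intersection test. The following self-contained sketch shows that math, independent of any particular SDK:

```c
#include <math.h>
#include <stdbool.h>

typedef struct { float x, y, z; } Vec3;

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Cast a ray (origin + normalized direction) against a plane described
// by a point on the plane and its unit normal, e.g. a plane returned by
// plane detection. Returns true and writes the hit point if the ray
// intersects the plane in front of the origin.
bool raycast_plane(Vec3 origin, Vec3 dir,
                   Vec3 planePoint, Vec3 planeNormal,
                   Vec3 *hit)
{
    float denom = dot(planeNormal, dir);
    if (fabsf(denom) < 1e-6f)          // ray parallel to plane: no hit
        return false;

    Vec3 toPlane = { planePoint.x - origin.x,
                     planePoint.y - origin.y,
                     planePoint.z - origin.z };
    float t = dot(planeNormal, toPlane) / denom;
    if (t < 0.0f)                      // intersection behind the origin
        return false;

    hit->x = origin.x + t * dir.x;
    hit->y = origin.y + t * dir.y;
    hit->z = origin.z + t * dir.z;
    return true;
}
```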
Lending a Hand
Next on the developer’s AR design flowchart is Hand Tracking, which lets users manipulate and interact with digital objects directly. Computer vision models process data from the headset’s tracking cameras so that users see virtual representations of their arms and hands that update in real time.
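In OpenXR terms, one common route to hand data is the XR_EXT_hand_tracking extension. The sketch below assumes that extension is enabled and that its entry points have been loaded via xrGetInstanceProcAddr (extension functions are not exported directly by the loader); whether a given device exposes this extension or a vendor-specific path is something to confirm in the Snapdragon Spaces documentation.

```c
#include <openxr/openxr.h>

// Sketch of per-frame hand-joint queries using the standard OpenXR
// XR_EXT_hand_tracking extension. These function pointers are assumed
// to have been loaded at init time with xrGetInstanceProcAddr.
static PFN_xrCreateHandTrackerEXT pfnCreateHandTracker;
static PFN_xrLocateHandJointsEXT  pfnLocateHandJoints;

XrHandTrackerEXT create_left_hand_tracker(XrSession session)
{
    XrHandTrackerCreateInfoEXT info = {
        .type = XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT,
        .hand = XR_HAND_LEFT_EXT,
        .handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT,
    };
    XrHandTrackerEXT tracker = XR_NULL_HANDLE;
    pfnCreateHandTracker(session, &info, &tracker);
    return tracker;
}

void locate_hand_joints(XrHandTrackerEXT tracker, XrSpace baseSpace,
                        XrTime frameTime)
{
    // The default joint set has 26 joints per hand (wrist, palm,
    // and four joints plus tip for each finger).
    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = {
        .type = XR_TYPE_HAND_JOINT_LOCATIONS_EXT,
        .jointCount = XR_HAND_JOINT_COUNT_EXT,
        .jointLocations = joints,
    };
    XrHandJointsLocateInfoEXT locateInfo = {
        .type = XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT,
        .baseSpace = baseSpace,
        .time = frameTime,
    };

    if (XR_FAILED(pfnLocateHandJoints(tracker, &locateInfo, &locations)))
        return;
    if (!locations.isActive)
        return;  // hand not currently tracked

    // e.g. use the index fingertip pose to drive interaction.
    XrPosef indexTip = joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose;
    (void)indexTip;
}
```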
Dropping Anchor
Local Anchors offer the ability to lock, or pin, digital assets in space by associating them with real-world clusters of recognized geometry. Stored as metadata, the local version of an anchor is restricted to on-device storage within a single app instance and, by default, is cleared as soon as the app closes. This is done both to save valuable memory resources and to maintain the real-time responsiveness that AR requires.
Developers can also save anchor information, allowing users to pick up where they left off. This creates the illusion that objects remain in the environment even after the user leaves. Such anchors depend on accurate positional tracking to maintain each digital object’s position and orientation.
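The Snapdragon Spaces SDK provides its own API for saving and loading anchors; conceptually, though, persistence boils down to serializing an identifier plus the anchor’s pose to local storage. The SavedAnchor struct and file format below are hypothetical, shown for illustration only:

```c
#include <openxr/openxr.h>
#include <stdio.h>
#include <stdbool.h>

// Purely illustrative: persist an anchor's pose so content can be
// restored where the user left it. This struct and file format are
// hypothetical; a real app should use the SDK's anchor save/load API.
typedef struct {
    char    name[32];  // app-chosen identifier for the anchor
    XrPosef pose;      // pose relative to the app's world origin
} SavedAnchor;

bool save_anchor(const char *path, const SavedAnchor *anchor)
{
    FILE *f = fopen(path, "wb");
    if (!f) return false;
    size_t written = fwrite(anchor, sizeof(*anchor), 1, f);
    fclose(f);
    return written == 1;
}

bool load_anchor(const char *path, SavedAnchor *anchor)
{
    FILE *f = fopen(path, "rb");
    if (!f) return false;
    size_t read = fread(anchor, sizeof(*anchor), 1, f);
    fclose(f);
    return read == 1;
}
```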
Conclusion
Headworn AR glasses powered by smartphones are poised to revolutionize the smartphone experience. Now, developers can layer these immersive experiences onto their mobile apps using the Snapdragon Spaces XR Developer Platform.
For more information and examples of the functionality discussed here, check out the Snapdragon Spaces developer documentation.