Google has released Android XR SDK Developer Preview 3, introducing a wide range of tools aimed at helping developers build both immersive headset experiences and a new generation of lightweight AI glasses. The update comes as manufacturers prepare to roll out additional Android XR–powered devices, including Samsung’s Galaxy XR and upcoming models from XREAL, Gentle Monster, and Warby Parker.
For the first time, developers can build augmented experiences specifically for AI glasses running Android XR. The devices are designed for all-day wear and include onboard speakers, cameras, microphones, and small display surfaces intended for private, glanceable information.
To support these new use cases, Google is introducing two Jetpack libraries built for glasses-based interaction:
- Jetpack Projected – provides access to sensors, speakers, cameras, and displays on AI glasses, enabling mobile apps to extend their functionality to wearable form factors.
- Jetpack Compose Glimmer – a new UI design system optimized for optical see-through displays, emphasizing clarity, legibility, and minimal visual distraction.
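Google has not yet published Glimmer's full component set, so the sketch below is a hypothetical illustration built on plain Jetpack Compose rather than the actual Glimmer API; `GlanceableCard` and its styling are placeholders. It shows the design constraint the library targets: on an optical see-through display, black pixels render as transparent, so a glanceable UI favors light, high-contrast text with no opaque background.

```kotlin
import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.unit.dp
import androidx.compose.ui.unit.sp

// Hypothetical glanceable card for an optical see-through display.
// Black renders as transparent on these panels, so light text with
// no background keeps the real world visible behind the UI.
@Composable
fun GlanceableCard(title: String, detail: String) {
    Column(modifier = Modifier.padding(8.dp)) {
        Text(title, color = Color.White, fontSize = 18.sp)
        Text(detail, color = Color(0xFFB0BEC5), fontSize = 14.sp)
    }
}
```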
A dedicated AI Glasses emulator in Android Studio is also launching, allowing developers to test touchpad input, voice interactions, and glasses-specific UI layouts.
Developer Preview 3 expands ARCore for Jetpack XR to support motion tracking and geospatial capabilities on AI glasses. These features enable hands-free navigation and environment-aware experiences tailored for devices meant to be worn throughout the day.
Alongside glasses support, Google is updating the existing XR stack for headsets such as the Galaxy XR and wired glasses like XREAL’s Project Aura.
Key additions include:
- SceneCore updates, including dynamic glTF model loading via URIs, improved materials, and Widevine DRM support for 360° and 180° immersive video playback.
- Compose for XR enhancements, such as persistent follow behavior through the new UserSubspace component, spatial animations for smoother movement, and layout sizing based on a user’s comfortable field of view.
- Material Design for XR updates, including spatially adaptive dialogs, navigation bars that expand into orbiting elements, and a SpaceToggleButton for switching between full-space and constrained modes.
- ARCore perception improvements, introducing face tracking with 68 blendshape values, eye-tracking for avatar control, and depth maps for more realistic interactions.
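The Compose for XR additions above layer onto the library's existing `Subspace` and `SpatialPanel` building blocks. As a rough sketch of that model (package paths follow the published `androidx.xr.compose` developer-preview API; exact signatures may shift between previews), the following places a movable, resizable panel of ordinary 2D Compose content into 3D space:

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun FloatingPanel() {
    // Subspace switches Compose into 3D layout; SpatialPanel hosts
    // regular 2D Compose content on a panel in the user's space.
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1024.dp)
                .height(640.dp)
                .movable()    // user can reposition the panel
                .resizable()  // user can rescale it
        ) {
            Text("Hello, Android XR")
        }
    }
}
```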
Google is also rolling out an XR Glasses emulator tailored to XREAL’s Project Aura, matching field of view, resolution, and DPI to help developers visualize content more accurately during development.
For Unity developers, the Android XR SDK adds new experimental tracking options—including QR codes, ArUco markers, planar image tracking, and body tracking—and a long-requested scene meshing capability that allows digital objects to collide with real-world surfaces.
Android XR SDK Developer Preview 3 is available now through the latest Android Studio Canary release, along with updated emulators and samples. Google says it will continue working with the developer community as more XR headsets and AI glasses arrive on the market.