AWE 2025 Recap — Is XR Going Mainstream Yet?

Posted by Anshel Sag, Contributor


Augmented World Expo 2025 had a much more pointed theme than previous years, with a strong focus on XR going mainstream and a message that the industry is finally coming out of its “startup” phase. As always, the world’s premier XR conference was kicked off by its organizer, Ori Inbar, who gave his usual summary of the past year and his expectations for the year to come. Inbar started his presentation by playing pool using the Spectacles developer AR glasses from Snap, a company that had a huge presence at AWE this year.

While attendance numbers seemed to be similar to last year’s, I would say that this year the venue felt fuller, with fewer empty spaces. That said, others I talked with felt like this year’s event was smaller. This might have been due in part to Snap’s presence being outside the expo hall. If anything, I think that this year’s more spread-out show indicated more maturity and growth.

Qualcomm’s XR Stewardship And New Chip

As the leading silicon platform provider in XR, Qualcomm has long been involved in AWE. This year Qualcomm was a Diamond sponsor and gave the opening vendor keynote on day one of the show. Ziad Asghar, general manager of Qualcomm’s XR division, kicked off his “Accelerating the Spatial Computing Revolution” keynote with a summary of the market as Qualcomm sees it. He also went through the company’s partner launches in the year since AWE 2024, showing the company’s depth and breadth in XR.

Asghar took over Qualcomm’s XR division about a year ago after its previous leader, Hugo Swart, went to Google. Asghar’s experience prior to leading XR was helping Qualcomm with its AI strategy; with AI being inextricably linked with XR, I think he’s a great person to lead the division. He spoke highly of the company’s customers — the XR headset OEMs — and gave special attention to Meta Ray Bans and how the partnership with Meta has enabled such a successful product. This included a walk-on from Bruno Cedon, senior director and head of AR architecture at Meta.

Asghar also spoke to the many different applications of XR, including a Capcom and KDDI partnership for the Osaka Expo using a Monster Hunter experience that I got to try and was quite impressed with. This leveraged a custom headset for entertainment purposes and teed up the second cohort of XR Sports Alliance members, which now include T-Mobile, Google, Lenovo, Red Bull and more. Following that, surprisingly, Google’s Swart joined Asghar on stage to talk about the companies’ partnership with Samsung on Android XR and Project Moohan. Asghar then talked about Qualcomm’s Project Aura headset in partnership with XREAL; this allowed him to discuss the new Snapdragon Spaces Compatibility Plugin for Android XR, which is designed to smooth out the transition to Android XR by making it much easier to port Snapdragon Spaces apps and experiences into it.

Building on its success in smart glasses for Meta and others, Qualcomm announced the new AR1+ Gen 1 processor for AI glasses. As a physically smaller chip, it reduces the temple height necessary for glasses by 20% while also reducing power consumption by 7% compared to the AR1 Gen 1. It also enhances image quality and stabilization while running on-device AI models of up to 1 billion parameters, which Asghar demonstrated live on stage.

To complement these capabilities, Qualcomm also announced a reference design for a wearable smart ring controller developed with Kiwear. I believe that smart rings and other wearables will become complementary technologies for AR or AI smart glasses because they offer an alternative to voice input and can be used for biometric authentication and fine motor controls. I got an opportunity to try out the Kiwear ring with TCL RayNeo X3 AR glasses and was quite impressed. The ring can double as a controller for a tablet game, so there is more than one application for it. The Kiwear reference design also includes a capacitive touchpad that allows the user to perform more discreet inputs without having to wave their hands around.
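To illustrate why a ring plus touchpad is such a natural companion input, here is a minimal sketch of the kind of event mapping such a device could feed to a pair of glasses. Every type and event name below is hypothetical; Kiwear has not published an SDK that I am drawing from.

```typescript
// Hypothetical sketch: translating ring touchpad gestures into glasses UI
// commands. None of these names come from Kiwear's actual SDK; they are
// illustrative stand-ins for how such an input pipeline is typically wired up.

type RingGesture = "tap" | "doubleTap" | "swipeForward" | "swipeBackward";

interface RingEvent {
  gesture: RingGesture;
  timestampMs: number;
}

// A simple dispatcher that turns discreet touchpad gestures into UI commands,
// so the wearer never has to raise their hands or speak out loud.
const gestureToCommand: Record<RingGesture, string> = {
  tap: "select",
  doubleTap: "back",
  swipeForward: "nextItem",
  swipeBackward: "previousItem",
};

function handleRingEvent(event: RingEvent, sendToGlasses: (cmd: string) => void): void {
  sendToGlasses(gestureToCommand[event.gesture]);
}

// Example: a forward swipe on the ring advances the glasses' UI focus.
handleRingEvent(
  { gesture: "swipeForward", timestampMs: Date.now() },
  (cmd) => console.log(`glasses <- ${cmd}`),
);
```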

Snap Leads The Charge In XR

Snap’s presence at AWE 2025 was hard to miss, and the company came out swinging with a series of major announcements. The first and arguably most important was that Snap will ship its new consumer Specs in 2026, which CEO Evan Spiegel said will be lighter, more compact and more powerful than the current Spectacles developer glasses. While Snap didn’t give any further details on timing or pricing, this launch could potentially put Snap back at the forefront of the AR race along with Apple and Meta. I believe that Snap has long been ahead of the curve in terms of AR design and understanding — after all, Snap paved the way for the entire smart glasses market with the first three generations of Spectacles.

Besides making the Spectacles hardware and operating system more capable for consumer applications, Snap has already created deep integrations with OpenAI and Gemini to enable AI-powered lenses. Lenses are Snap’s way of delivering applications to users through its glasses or smartphone apps. Snap also announced a Depth Module API, Automated Speech Recognition API and Snap3D API, all for enhancing developer lenses with AI. Snap is also addressing larger deployments of AR glasses for entertainment and educational purposes with fleet management, guided mode and guided navigation.
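To make that pipeline concrete, here is a rough sketch of how an AI-powered lens might chain speech recognition into 3-D generation. The interfaces and method names below are placeholders of my own, not Snap’s published APIs, which will differ in naming and shape.

```typescript
// Hypothetical flow for an AI-powered lens: speech in, generated 3-D content
// out. These interfaces are placeholders -- Snap's actual Automated Speech
// Recognition and Snap3D APIs will look different.

interface SpeechRecognizer {
  transcribe(onResult: (text: string) => void): void;
}

interface Object3DGenerator {
  generateFromPrompt(prompt: string): Promise<string>; // returns an asset ID
}

function runVoiceDrivenLens(
  asr: SpeechRecognizer,
  generator: Object3DGenerator,
  placeInScene: (assetId: string) => void,
): void {
  asr.transcribe(async (text) => {
    // Feed the recognized utterance straight into 3-D generation,
    // then anchor the resulting asset in the user's view.
    const assetId = await generator.generateFromPrompt(text);
    placeInScene(assetId);
  });
}
```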

Snap also announced that it would be working with Niantic Spatial to bring its Visual Positioning System to Lens Studio, enabling developers to get more accurate location information and build better world-scale AR applications. Niantic Spatial owns the core positioning technology for AI and XR applications that stayed with Niantic after it sold off its gaming division. (More on Niantic below.)

XREAL Shows Real Momentum

Ralph Jodice, newly minted general manager of XREAL’s North American unit, talked about XREAL’s momentum with its current product lineup. Jodice detailed the company’s efforts in delivering wider-field-of-view displays and transitioning from “birdbath” optics towards its new X-prism design in the One Pro AR glasses; these have 44% thinner optics than the XREAL One while delivering a 15% wider FoV with no reflections. He noted the company’s 10,000-plus pre-orders for the One Pro that have already shipped, with many more on the way. Keep in mind that when the company unveiled the One and One Pro at CES 2025, the One Pro was supposed to launch in March 2025. The company has been very apologetic and has expressed its appreciation for customers’ patience, though it has not explained the delays. The One Pro became available in retail stores like Amazon and Best Buy starting July 1, 2025 at a retail price of $649, which is a bit on the steep side, but I believe it is worth it considering its improvements over the $499 XREAL One.

In addition to the glasses, XREAL discussed the new Eye accessory, a very small optional camera for photos, videos and 6-DoF tracking to enable better spatial tracking. XREAL also announced the XREAL Neo power bank, which will let people use XREAL glasses with handheld gaming devices without draining their batteries, and may be a workaround for letting the glasses work with the Switch 2.

There was also plenty of attention to XREAL’s new Project Aura headset, co-developed with Google using Android XR. Google and XREAL didn’t give many details about Project Aura at Google I/O, which I wrote about last month, so many people expected more details at AWE 2025. We already know that Project Aura’s optics will enable a much wider 70-degree FoV, which is a significant upgrade from the XREAL One (already shipping with a 50-degree FoV) and even the brand-new XREAL One Pro (57 degrees). Project Aura will also upgrade the onboard X1 chip — designed by XREAL itself — to an X1S, which stands to be 25% faster and enable better multitasking. This is not to be confused with the Snapdragon-based compute puck that will drive the AR experiences and render all the graphics.
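For a sense of what those numbers mean in practice, the linear size of the virtual image grows with the tangent of half the field of view, so the jump to 70 degrees is bigger than the raw degree count suggests. A quick back-of-envelope comparison, assuming the quoted figures are directly comparable (same aspect ratio and measurement convention, which the companies have not confirmed):

```typescript
// Back-of-envelope comparison of apparent image size across XREAL's quoted
// FoV figures. Treat this as rough intuition, not a spec.

function apparentWidth(fovDegrees: number): number {
  // The linear size of the virtual image scales with tan(FoV / 2).
  return Math.tan((fovDegrees * Math.PI) / 360);
}

const devices = [
  { name: "XREAL One", fov: 50 },
  { name: "XREAL One Pro", fov: 57 },
  { name: "Project Aura", fov: 70 },
];

const baseline = apparentWidth(50);
for (const d of devices) {
  const ratio = apparentWidth(d.fov) / baseline;
  console.log(`${d.name}: ~${ratio.toFixed(2)}x the One's linear image size`);
}
// Prints roughly 1.00x, 1.16x and 1.50x -- so Project Aura's image would be
// about half again as wide as the XREAL One's, not merely 40% wider.
```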

Project Aura is slated to launch in 2026, with some developers getting early access to devices at the end of this year. As far as I know, there were no demos or even prototypes of the XREAL Project Aura anywhere at the show. XREAL CEO Chi Xu did say that the company will have “more to share later this year,” but didn’t give details about what the company would disclose.

XREAL is also expanding its efforts in the enterprise space with headsets across the range of its offerings, including the XREAL Air 2, Air 2 Pro, One and One Pro as well as the XREAL Eye, Beam Pro and Air 2 Ultra. The company recently hired Amir Khorram, who had led enterprise XR at ByteDance and, before that, HTC. XREAL’s enterprise push makes sense because it is already working with companies like Sightful to enable enterprise applications for spatial computing to improve productivity. I got to try Sightful’s Spacetop AR-powered workspace with an Intel Lunar Lake-powered laptop from HP paired with a set of XREAL’s older Air 2 Ultra glasses, all running on Windows, and it was very similar to the experience I had last year on Sightful’s own hardware. Even with the limited time I had, I could definitely see something like this working well for business travelers to enhance productivity and security, not to mention entertainment applications. XREAL Enterprise is also collaborating with Qualcomm, Nvidia and AWS to enable enterprise customers.

Niantic Spatial Launches As A New(ish) Company

As mentioned earlier, Niantic Spatial is the geospatial intelligence part of Niantic that didn’t get sold off, and I would argue it is the more valuable part of the company long-term. Its CTO, Brian McClendon, spoke on stage about its efforts to enable people and machines to understand and interact with the real world. The company’s platform incorporates many different components brought in via acquisitions such as Scaniverse and 8th Wall to enable some of the highest-fidelity AR experiences.

McClendon backed that up on stage with a demo of the Ferry Building in San Francisco and a scan of Fisherman’s Wharf that produced one of the most photorealistic digital twins I’ve ever seen. The company’s Visual Positioning Service can be used for both public and private applications, empowering more developers for more end uses. I believe that emphasizing the breadth of possibilities is why we saw Niantic Spatial announce a partnership to enhance the immersion at Meow Wolf installations and enable more virtual art and experiences. In addition to that partnership, I was pleasantly surprised to see Niantic promoting support for WebXR in the Snap browser, which is something we also saw from Apple at WWDC. This is a net positive for cross-platform experiences and for enabling XR at scale for mainstream users. I believe that the deeper partnership between Snap and Niantic will benefit both companies greatly as they compete with the likes of Apple, Google and Meta.
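For developers, the practical upshot of WebXR support is that the same session-bootstrapping code can target any compliant browser. Here is a minimal sketch using the standard WebXR Device API; only the optional feature strings are choices a real app would tailor to its needs.

```typescript
// Minimal WebXR bootstrap. The navigator.xr calls are the published
// WebXR Device API, which any compliant browser must expose.

async function startArSession(canvas: HTMLCanvasElement): Promise<void> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-ar"))) {
    console.log("Immersive AR is not available in this browser.");
    return;
  }

  // Request an AR session; hit-test and anchors are common AR extensions.
  const session = await navigator.xr.requestSession("immersive-ar", {
    optionalFeatures: ["hit-test", "anchors"],
  });

  // Bind a WebGL context to the session so we can draw into the device view.
  const gl = canvas.getContext("webgl", { xrCompatible: true }) as WebGLRenderingContext;
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  const refSpace = await session.requestReferenceSpace("local");
  session.requestAnimationFrame(function onFrame(_time, frame) {
    const pose = frame.getViewerPose(refSpace);
    // Render one view per display here using pose?.views.
    session.requestAnimationFrame(onFrame);
  });
}
```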

Niantic also talked about some of its cutting-edge research in Gaussian splats with the use of MVSAnywhere. (MVS stands for “multi-view stereo.”) This enhances the depth estimation in a Gaussian splat, which creates a more realistic scene, while also pairing with VPS to improve location accuracy and speed up 3-D content creation. Gaussian splats are quickly becoming the default for 3-D content creation and capture and are likely the core technology behind Apple’s new Spatial Scenes.
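For the technically curious, the heart of Gaussian splatting is a simple compositing rule, shown below in its standard form from the research literature (not necessarily Niantic’s exact pipeline): each depth-sorted splat contributes its color weighted by its opacity and by the transmittance of everything in front of it, which is why better depth estimates translate directly into more realistic scenes.

```latex
% Front-to-back compositing over depth-sorted Gaussians: splat i contributes
% its color c_i, weighted by its opacity alpha_i and by the transmittance of
% all splats j that sit in front of it.
C = \sum_{i \in \mathcal{N}} c_i \, \alpha_i \prod_{j=1}^{i-1} \left( 1 - \alpha_j \right)
```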

Niantic flexed its spatial capabilities by showing off its Large Geospatial Model, which it says spans 50 million trained neural networks, 150 trillion network parameters and 1.4 million VPS-activated locations. According to Niantic, this model combines LLMs with 2-D vision models and 3-D vision models to create 3-D maps that incorporate all three. I believe that Niantic’s model will only get bigger and more capable as more people use it, including companies like Snap — which, as discussed earlier, Niantic has already partnered with. I expect that we’ll see even more companies partner with Niantic Spatial on its Large Geospatial Model, especially since, outside of Google and Apple, many companies lack the positioning data necessary to make AR viable. Long term, AR needs to be highly location-aware to be useful to developers and users; the more accurate the locations are, the better the context for augmented experiences. Indeed, I have long believed — and will continue to believe — that spatial awareness is the bedrock of AR. Note, for example, that I published “Context is Everything When It Comes to AR” more than seven years ago.

Google’s Android XR Needs More Developers

When it came time for Google to speak directly to developers about Android XR, Juston Payne and Hugo Swart each took their time to talk about Project Moohan and the partnership with Qualcomm and Samsung. They also brought on former Qualcomm product executive Said Bakadir (now at Google) to talk about transitioning Snapdragon Spaces to Android XR. Google also referred to Project Aura with XREAL and implied that it would be a reference device of sorts and only the first of many AR devices for Android XR.

Google also talked about its embrace of the Unity game engine, the OpenXR standard and the existing Android ecosystem to empower developers to more quickly build or port their apps to Android XR. I think we’re still quite far from seeing Android XR in wide release, given that Moohan is the first headset that will use it and, based on current rumors, may not arrive until late Q3 or early Q4 this year. I fear that if Google does not get things moving quickly, it may miss the opportunity it currently has to attract disenchanted Apple or Meta developers.

Palmer Luckey Talks More About Anduril’s Eagle Eye

Last year, XR pioneer Palmer Luckey shared that he was developing a headset for the military. Since then, his company, Anduril, has taken over Microsoft’s IVAS contract, which totaled $22 billion over 10 years. (I wrote about that earlier this year.) Anduril has since renamed it the Soldier-Borne Mission Command project and is building a platform called Eagle Eye to address SBMC in conjunction with its Lattice software solution. Luckey went into more detail about Eagle Eye supporting different types of headsets that military users need based on their roles.

According to Luckey, Eagle Eye is a partnership among Anduril, Meta, Oakley and Qualcomm. He also said that there’s a considerable amount of local compute powered by Qualcomm, which might imply a processor more powerful than the current XR2+ Gen 2, because these headsets need to run a lot of AI locally. Luckey added that at least one of the Eagle Eye headsets will have a resolution several times higher than the Vision Pro’s. This seems a bit hyperbolic to me but could be possible if the headset ships a year or two from now. I believe that Anduril has the right resources and people working on this project, especially with Meta’s engineering and IP involved.

More Developments At AWE 2025

At AWE I got to experience the next generation of location-based entertainment with Play for Dream’s own first-party mixed reality headset. The mixed reality experiences were simple but quite good, and I think the company has something there. TCL’s RayNeo X3 was all over the show floor, with demos at Qualcomm’s booth and others, using Applied Materials’ etched waveguide technology. Speaking of waveguides, I also saw the updated Z-30 and Z-50 optical engines from Lumus, which launched at the show and are rumored to be in some smart glasses coming later this year. I also got to see Maradin’s updated waveguide technology, and it has improved significantly since I last saw it at MIT RealityHack in January. Best I can tell, the team at Maradin really took my feedback to heart. Another waveguide supplier, LetinAR, showed me its improved solutions, which look clearer than before and are better optimized for manufacturability and yield. LetinAR’s biggest implementation currently is the MiRZA from NTT DoCoMo’s QONOQ hardware division, which I tried last year at AWE.

While Meta’s presence at AWE 2025 was lighter than in previous years, there were still plenty of people and products using Meta hardware, and smart glasses, for which Meta deserves much of the credit, were still a major focus. It seemed like almost every other person was wearing a pair of Meta Ray Bans, which have been among the most mainstream products in XR to date. The company also just launched the new Meta Oakley glasses with improved 3K video and longer battery life.

Not Quite ‘Mainstream’ Yet — But Closer Than Ever

While I don’t think the XR market has quite reached the mainstream, I do think we are probably closer than we’ve ever been in my career, and that 2026 will likely be the year when we can accurately start claiming that XR is mainstream. Most people I’ve talked to this year have noticed a considerable uptick in activity across the industry beyond AWE itself, after a year or two when things simply didn’t feel as busy. Lots of companies hit delays and pushed back launches during that period, and I believe we may see a “big” year in 2026 considering that the likes of Snap, Meta and XREAL are launching headsets and glasses for consumers.

Even after these companies launch products in 2026, there is still a long road ahead to ensure that developers can build content for these new platforms, an effort that should be helped by companies such as Google enabling multiple hardware vendors with a single OS. Meta has already moved in that direction with Horizon OS, but it hasn’t launched any third-party Horizon OS products yet. Things are getting very interesting in the XR space, especially with the growth of AI, and I think Meta’s success with the Meta Ray Bans has shown that this product category has legs.


