Meta Connect 2024: Orion — the Smart Glasses We’ve Been (And Still Are) Waiting For

At its annual developer conference, Meta launched an inexpensive Quest headset, provided updates on the Ray-Ban Meta Smart Glasses, and promised to make life better for developers with new tools and APIs. All anyone will remember from this event, though, is Orion, Meta’s smart glasses of the future. That’s fine – Orion is wildly impressive – but the technology still needs software, further hardware refinement, and a manufacturing process that Meta admits cannot yet hit acceptable consumer price points. Nonetheless, Meta’s Orion is basically what technologists imagine when they predict we’ll be wearing smart glasses in the future.

Quest 3S

Meta’s Quest 2 and 3 are the best-selling VR headsets in the world. The Quest is arguably the only mainstream VR headset platform, especially with Sony seemingly losing faith in the PSVR2 and Apple iterating slowly on its $3,500+ Apple Vision Pro. Apple will undoubtedly improve the Vision Pro’s value proposition over time, and it sets a high bar for an AR-via-passthrough experience today. However, in addition to its high price, it suffers from challenging ergonomics (it literally hurts to use for long periods of time) and a dearth of platform-specific software. You would not want to use an Apple Vision Pro for vigorous exercise in Supernatural or Beat Saber even if you could. (You can’t. Meta owns both and will not be porting them to visionOS any time soon.)

Meta Quest 3 (rear), Meta Quest 3S (foreground)

Apple’s entry has spurred Meta to improve the Quest’s passthrough AR, multi-window environment, and hand tracking. At Connect, Meta announced that Dolby Atmos is being added to the platform for immersive audio and that Microsoft Remote Desktop is coming to improve PC productivity. Photorealistic room scans are another tool that should prove useful for interior designers and content creators. Admitting that the platform has been difficult to develop for, Meta is expanding its developer tools (“if you can build an Android app you can write for Quest”) and enabling camera passthrough access for developers early next year, which should open the door to scene understanding and AI applications on Quest.
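Meta has not yet published the passthrough camera API, but given the “if you can build an Android app” pitch, one plausible shape for it is Android’s standard Camera2 interface. The Kotlin sketch below is purely illustrative and rests on that assumption; the function name and the idea that the passthrough feed shows up as an ordinary camera ID are mine, not Meta’s.

    // Hypothetical sketch: assumes the Quest passthrough feed will be exposed
    // through Android's standard Camera2 API once developer access ships.
    // Requires the CAMERA permission in the app manifest.
    import android.content.Context
    import android.hardware.camera2.CameraCharacteristics
    import android.hardware.camera2.CameraDevice
    import android.hardware.camera2.CameraManager
    import android.os.Handler
    import android.os.Looper
    import android.util.Log

    fun openPassthroughCamera(context: Context) {
        val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager

        // Enumerate whatever cameras the device exposes; on a phone these are the
        // front/back sensors, and the assumption here is that the headset's
        // passthrough feed would appear the same way.
        for (id in manager.cameraIdList) {
            val characteristics = manager.getCameraCharacteristics(id)
            val facing = characteristics.get(CameraCharacteristics.LENS_FACING)
            Log.d("Passthrough", "camera id=$id facing=$facing")
        }

        // Open the first camera; a real app would pick the passthrough feed,
        // create a capture session, and hand frames to a scene-understanding
        // or AI pipeline.
        manager.openCamera(
            manager.cameraIdList.first(),
            object : CameraDevice.StateCallback() {
                override fun onOpened(camera: CameraDevice) { /* start a capture session here */ }
                override fun onDisconnected(camera: CameraDevice) = camera.close()
                override fun onError(camera: CameraDevice, error: Int) = camera.close()
            },
            Handler(Looper.getMainLooper())
        )
    }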

Meta is also moving to further entrench Quest as a platform for mixed reality exercise and gaming with a lower-priced $300 Quest 3S model that has the same Qualcomm processor and controllers as the Quest 3 and runs the same software, only with lower-resolution displays, lesser optics, and smaller storage options. The Quest 3 will remain in the line at $500, but for gaming and exercise, the Quest 3S experience should be good enough for many.

Strategically, prioritizing the Quest 3S over a Quest 4 is probably a mistake, though not an unrecoverable one. Apple will be bringing the price of the Vision Pro down over time, and new entrants from Google, Samsung, and Lenovo – also powered by Qualcomm – will be chasing that market as well. It would be better for Meta to focus on improving the Quest hardware for richer entertainment experiences and productivity at the $500 - $600 price point than to double down on virtual game consoles at $300. The good news for Meta is that the 3S should grow the Quest installed base, which will make developers happy. Meta also has time: being too early to AR and VR has burned investors badly enough in the past that the competition isn’t building up spatial computing with anything like the urgency it is showing in the scramble to establish footholds in generative AI.

Ray-Ban Meta Smart Glasses

While Meta isn’t providing sales numbers, the Ray-Ban Meta Smart Glasses have sold better than either Essilor-Luxottica (Ray-Ban’s parent) or Meta anticipated, and the companies have had trouble keeping them in stock. Meta now says that the supply issues have been addressed. It is not making any changes to the Qualcomm-powered hardware, but improved Transitions lenses are now an option. My review unit was customized by Essilor-Luxottica with prescription lenses and an older (slower) Transitions formulation, so this is a welcome improvement.

The Ray-Ban Meta Smart Glasses are so successful because they found the perfect balance of “smart” and “glasses,” with an emphasis on the latter. First and foremost, they are genuine Ray-Ban Wayfarers, an iconic design with no compromise in style or comfort. As Ray-Bans, they can be purchased as sunglasses or customized with prescription lenses at any of Essilor-Luxottica’s extensive distribution points (e.g., LensCrafters, Sunglass Hut).

The technology included is relatively simple and works extremely well. Qualcomm provides the silicon, which powers cameras for capturing first-person perspective images and video, in-line speakers, and exceptionally good microphones helped by their location in the nose bridge (i.e., fairly close to your mouth). They can be used as open-ear Bluetooth headphones, and with Meta AI, they act as a voice-driven AI assistant that’s always with you.

At Connect, Meta announced a new transparent frame option (I borrowed fellow analyst Anshel Sag’s pair for a quick photoshoot), along with new software features. Live translation is coming soon, and you’ll be able to use the camera as an input to generative AI for things like creating recipes from the ingredients on hand, dialing phone numbers, reading QR codes, and remembering where you parked. Meta didn’t just add practical software features, either: the conversational voice UI will soon be able to talk to you using celebrity voices, including John Cena and Awkwafina.

These features aren’t entirely new or exclusive to Meta. We’ve had celebrity voice options for GPS navigation systems for years, Amazon demoed using your camera to capture information back in the Fire Phone days, and live translation is available on phones from Samsung, Google, and others. There’s still a delay in Meta’s live translation, and for a two-way conversation, both parties need to be wearing the glasses. Even so, this can be more natural than using your phone as an intermediary, and it is going to be fantastic for accessibility: it won’t be long before AI can guide the vision impaired without another human in the loop.

Orion: The Smart Glasses We’ve Been (And Still Are) Waiting For

While Ray-Ban Meta Smart Glasses are a little bit smart, they don’t have displays and aren’t the smart glasses technologists dream of. We’re edging closer to the AI assistant that lives in Tony Stark’s glasses, but the active display in his field of view has remained out of reach.

Context

Microsoft and Magic Leap shipped the first commercial AR headsets, though neither was able to shrink the technology down to fit into actual glasses. After a couple of generations of failed leadership and an inability to significantly improve the field of view, Microsoft has largely abandoned HoloLens to become a software and services supplier to every other mixed and virtual reality platform. After wildly overpromising and underdelivering on a consumer product, Magic Leap pivoted to the enterprise and then turned to licensing its IP.

After building its own prototypes in its labs, Apple determined that the technology for see-through glasses with holographic overlays just wasn’t going to be ready any time soon. Apple instead focused on reaching augmented reality via high-resolution VR displays and sophisticated camera passthrough with the Apple Vision Pro. This is a viable approach – especially as a way to establish a development platform that can carry over to future lighter, cheaper headsets or glasses – but the Apple Vision Pro is not something you could wear all day and use as an interface to the real world.

Xreal’s Air 2 Pro – which I took with me on this west coast trip – is an actual pair of glasses, though one that needs to be physically tethered to an Xreal Beam Pro or other processing unit. It has a bright, full-color display, but the field of view, processing power, user interface, apps, and battery life are all lacking. That said, it is a good way to catch up on content in private on transcontinental flights, and Xreal has shown me future products under NDA that are impressive.

Orion: Wow

At Connect, Meta showed off Orion, its long-gestating attempt at creating fully capable holographic smart glasses (glasses with displays). Mark Zuckerberg called them “the most advanced glasses in the world,” and he’s probably not wrong. The amount of technical innovation in Orion is staggering. Orion is a thick pair of glasses; the magnesium frame weighs just 98g and contains micro projectors and waveguides that can display full-color holograms across a fairly wide field of view (72 degrees). Cameras around the unit map your environment and provide hand tracking. In addition to hand tracking, Orion can be controlled by voice or by a wrist-based neural interface: gestures picked up via electromyography on a wristband that comes with the system. Orion has no tether, and Meta claims that its internal batteries will last a whole day’s use because most of the processing is offloaded to a smooth, rectangular puck that fits in your pocket and connects wirelessly to the glasses.

What can you do with Orion? Meta allowed select journalists and analysts to spend up to an hour with the glasses; the experiences included watching videos, scrolling through Instagram, making video calls, creating recipes, and playing a game of virtual pong. Meta’s Llama AI provides a virtual assistant on your face. The promise of smart glasses like these, though, goes much further, including overlaying contextual information on the world around you. This could include operating instructions “on” your appliances, details about the person you are talking to and a history of your interactions, or even the glasses knowing that you are talking to your spouse and should not be interrupted by social media notifications.

Crucially, Orion is a prototype. I was told by more than one source that the cost to manufacture Orion exceeds $10,000, and only 1,000 have been built for use inside Meta, media demos, and distribution to a handful of early software development partners.

Initial thoughts:

  • Meta acknowledges that it needs to improve the manufacturing process to get costs down. It also needs to work with Essilor-Luxottica to style the frames, and there is a long list of details for Meta’s hardware team to finalize before Orion is consumer-ready. Zuckerberg says, “we have a line on each of these items.” In other words, Meta believes that it can get there in time, but it is not providing a launch timeline or price range for a commercial product. If Meta thought there was even a chance of making holiday 2025 sales, it almost certainly would have said that Orion is “coming next year.” That means it may be 2027 or even 2029 before this is a consumer product, and even then the likely price point is going to be high.

  • Hologram resolution, contrast, and frame rate are all reported limitations in the current version. However, Orion is see-through – it doesn’t need to recreate the real world with perfect fidelity, just to augment it. You won’t want to watch a movie in Orion, but it should still be a fantastic way to get notifications in your field of view, enjoy light gaming experiences, or follow the mapped-to-the-real-world walkthroughs that get enterprise AR developers excited.

  • A 72-degree field of view doesn’t cover your entire range of vision, but it is much larger than the 40 - 50 degrees on most existing smart display glasses (see the rough math after this list). The limited field of view on Microsoft’s HoloLens ruined the illusion of seeing digital objects in the real world because only part of any digital object was viewable.

  • Does Orion work outdoors? All of the demos I’ve seen discussed were in controlled, indoor environments. Cellular connectivity and visibility in sunlight could be additional areas for development.

  • The Orion glasses weigh just 98g, which is a technical achievement. However, that is still nearly twice as heavy as the 50g Ray-Ban Meta Smart Glasses. Just as a point of reference, Google Glass Enterprise Edition 2 weighs 51g with a frame.

  • Orion’s neural gesture system reportedly works well. I’ve been testing a neural control system for iOS from Wearable Devices, though it uses Surface Nerve Conductance (SNC), which detects intent directly, rather than electromyography (EMG), which detects muscle-generated electrical signals. Practically speaking, Meta’s system works, provided you actually make the motions with your hand. Wearable Devices’ approach is almost certainly the better technology for accessibility.

  • Meta does not seem to have solved vision correction at this stage. Orion demos required good vision or contact lenses (CNET’s Scott Stein had to wear Mark Zuckerberg’s contacts!). I suspect that Meta’s choice to go with silicon carbide lenses to widen the field of view is making it harder to build vision correction into those lenses. Of course, it is also possible that Meta has a solution for prescription usage but chose not to implement it on its limited production run of prototype glasses.

  • Finally, it appears that Meta is designing the silicon for Orion itself, rather than using its usual XR supplier, Qualcomm. Of course, since this is just a prototype, the eventual silicon provider could change.
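One rough way to quantify the field-of-view point above (a back-of-envelope illustration that assumes the quoted figures are measured the same way, which vendors don’t always guarantee): the apparent area of the holographic window scales roughly with the square of the angular extent, so (72 / 50)^2 ≈ 2.1 and (72 / 40)^2 ≈ 3.2. In other words, Orion’s window covers roughly two to three times the area of today’s 40 - 50 degree displays.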

Conclusion

Realizing the full potential of Orion will require technical, manufacturing, and software improvements. It may also require a cultural and business model overhaul at Meta. Meta’s developer community is not nearly as robust as Microsoft’s, Apple’s, or Google’s. Consumers may be reluctant to trust the company as well. Is Meta – a company that makes money on contextual advertising and has had notable privacy failures – the best company to bring Orion to market? While I personally would love to see people’s names and a summary of my last conversation with them overlaid in my field of view at social functions, if Meta actually showed off this use case today it would raise huge privacy red flags.

Given how far Orion still is from shipping as a consumer product, why is Meta announcing it now? It’s too early for developers to see a return on investment. It might help get investors excited about the stock, though Facebook’s transformation into Meta has always been a long-term project, with platform ambitions aimed at keeping Meta’s ad business unfettered by Apple or Google.

In the end, Meta is showing Orion off now because it can (it has reached a stage of development where it can be demoed) and because Zuckerberg wants to. Google also likes to show previews of its technology, sometimes entirely too early. Apple keeps these types of projects secret until they are ready for developers. Snap just showed off its latest Spectacles, which might make it to market sooner than Orion, but without many of Orion’s bleeding-edge capabilities. Orion definitely positions Meta at the front of a short list of companies that can master the optics, materials, design, manufacturing, compute, input, interface, and software elements that all must come together to make something like this work. But until Meta provides a timeframe and price range, this is just a proof of concept.

To discuss the implications of this report on your business, product, or investment strategies, contact Techsponential at avi@techsponential.com.


Updated 9/29/24 with additional content in “Initial Thoughts” and “Conclusion” that didn’t make it into the original report due to impossibly slow Wi-Fi on the United flight I was on.