Meta Orion Through the Optics Pictures

Meeting at CES 2025

CES is just a few weeks away (January 7-10, 2025). If you or your company want to schedule a meeting with me at CES 2025, please email meet@kgontech.com.

Introduction

Brad Lynch of the SadlyItsBradley YouTube channel let me know about Fast Company’s article, Inside Meta’s long-term vision to make its Orion glasses the AirPods of augmented reality, which includes through-the-waveguide pictures of Meta Orion.

In addition to Brad’s YouTube channel, Brad and I have made several videos together, including our roundtable discussion of Meta Orion and Snap Spectacles (AR Roundtable Video Part 3, AR Roundtable Video Part 2, Snap Spectacles 5, and Meta Orion Roundtable Video Part 1).

Meta Orion Through the Optics Pictures

As I wrote in Meta Orion AR Glasses (Pt. 1 Waveguides), I was skeptical as to the image quality of Orion’s waveguide:

Diffraction gratings have a line spacing based on the wavelengths of light they are meant to diffract. Supporting full color with such a wide FOV in a single waveguide would typically cause issues with image quality, including light fall-off in some colors and contrast losses. Unfortunately, there are no “through the optics” pictures or even subjective evaluations by an independent expert as to the image quality of Orion.
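To make the wavelength dependence concrete, here is a rough sketch of the first-order grating equation inside a high-index slab. The pitch and index below are illustrative values of my choosing, not Orion’s actual design parameters:

```python
import math

# First-order grating equation inside a slab of index n (normal incidence):
#   n * sin(theta) = wavelength / pitch
# A single grating pitch sends red, green, and blue into different internal
# angles, which is a root cause of color non-uniformity across a wide FOV
# in a single full-color diffractive waveguide.

PITCH_NM = 380.0   # illustrative grating pitch, not Orion's actual value
N_SLAB = 2.6       # roughly the refractive index of silicon carbide

for name, wavelength_nm in [("blue", 460), ("green", 530), ("red", 620)]:
    sin_theta = wavelength_nm / (N_SLAB * PITCH_NM)
    theta = math.degrees(math.asin(sin_theta))
    print(f"{name:5s} ({wavelength_nm} nm) -> internal angle {theta:.1f} deg")
```

With these assumed numbers, the red and blue ends of the spectrum are steered more than ten degrees apart by the same grating, so any correction has to vary across both wavelength and field angle.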

Fast Company has the first and only pictures I have seen with a view through Meta Orion’s waveguides. The pictures, taken by Meta and provided to Fast Company, are not of high quality (they look like they were taken hand-held by a smartphone), but they do show the poor color uniformity provided by Orion’s waveguides. Below are the pictures as published by Fast Company:

For the next set of pictures, I have adjusted the contrast to show the color variation better.

Quoting from the Fast Company article (with my bold emphasis):

Instead, its lenses encase the thinnest film of silicon carbide. Twice as refractive as glass, light doesn’t just bounce off the silicon, but actually flows through micro etched channels in the material to ultimately be viewable only to the wearer. It also means that wearing Orion gives the outside world the faintest iridescent glow.

This writing seems to suggest that there is also some amount of colorization of the real-world view as well.
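The quote’s “twice as refractive as glass” point is worth unpacking: the appeal of a high-index substrate like silicon carbide is that it shrinks the critical angle for total internal reflection, leaving a wider range of guided angles and hence a wider supportable FOV. A minimal sketch, using typical textbook indices rather than Meta’s specifications:

```python
import math

# Critical angle for total internal reflection: theta_c = asin(1/n).
# Rays steeper than theta_c stay trapped in the slab, so a higher-index
# substrate leaves a wider range of usable internal angles, which is
# what makes a wide-FOV waveguide feasible. Indices are typical
# textbook values, not Meta's specifications.

def critical_angle_deg(n: float) -> float:
    return math.degrees(math.asin(1.0 / n))

for material, n in [("glass", 1.5), ("high-index glass", 1.9), ("silicon carbide", 2.6)]:
    print(f"{material:16s} n={n:.1f}  critical angle ~ {critical_angle_deg(n):.1f} deg")
```

The trade-off, as the article hints, is that the same strong gratings and high index that guide the image also interact with outside light, producing the “faintest iridescent glow” seen by onlookers.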

Meta embracing color issues – Hiding it in plain sight

The article cites Meta as trying to design the user interface to hide the color issues in plain sight by designing icons with colors similar to the variation caused by the waveguides.

Aerogel

This glow isn’t enough to transform someone’s face (IRL or on a call) into a confetti cake, but it is an aesthetic that the UX team leaned into across the entire interface. Or as Pujals puts it, “We embraced the boundaries.” The app icons themselves use jewel-like color gradients. And its “Aero” UI was inspired by aerogel—the world’s lightest, semi-opaque material—with panels that shimmer with color. Think of Orion as the Miller Extra Light of hyper reality.

There are also the remarks (below) that Meta is experimenting with fonts to reduce readability issues caused by the waveguide.

In AR, subtle is trickier than overt. While Orion’s rainbow world is something of a punchdrunk buzz, a year ago, it was more like drunk goggles. Much of Pujals’s UX work has been technical in nature. Simply to make the screen legible, she’s worked alongside hardware and software engineers to render pixels properly in silicon carbide, smoothing out rough edges, eliminating strange aberrations. (The company is also working on a new version of its sans serif typeface, Optimistic, with fewer curves and deeper ink pools to be more legible in the product.)

The article also discusses the design and human factors trade-offs between resolution and FOV.

One of the biggest decisions Meta needs to make before shipping the product is around a technical tradeoff of its own screen: Will they prioritize field of view or resolution? 

Meta already has a version of Orion running with twice the resolution of the demo I tested. But much like a projector works in your home, the bigger the image, the fuzzier it gets. And Meta is still mulling just how to tune their technology to consumers, to balance image expansion with clarity. (Technically, they call this measurement “pixels per degree.”)
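The “pixels per degree” measurement mentioned in the quote is simply the display’s horizontal pixel count divided by the horizontal FOV in degrees. A quick sketch with illustrative numbers (mine, not Orion’s actual specifications) shows why widening the FOV without adding pixels makes the image fuzzier:

```python
# Pixels per degree (PPD) ties display resolution to field of view:
#   PPD = horizontal pixels / horizontal FOV in degrees
# For reference, 20/20 visual acuity corresponds to roughly 60 PPD.
# The pixel counts and FOVs below are illustrative, not Orion's specs.

def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    return h_pixels / h_fov_deg

# The same panel spread over a wider FOV drops sharply in sharpness:
print(f"{pixels_per_degree(1280, 30):.1f} PPD over 30 deg")
print(f"{pixels_per_degree(1280, 70):.1f} PPD over 70 deg")
# Doubling the horizontal resolution recovers much of it at wide FOV:
print(f"{pixels_per_degree(2560, 70):.1f} PPD over 70 deg")
```

This is the projector analogy in the quote: the same number of pixels stretched over a bigger angular area means fewer pixels per degree, and the eye sees that as blur.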

Comparison to Jade Bird Displays Compensation Demo with Diffractive Waveguides

The color variation with Orion appears to be worse, but the brightness variation appears to be less than in my recent study of the Jade Bird Display (JBD) compensation demo. The JBD demo had only a 30-degree FOV versus Orion’s reported 70 degrees. With diffractive waveguides, supporting uniformity becomes harder as the FOV widens.

Below are high-resolution pictures I took through JBD’s diffractive waveguide correction demo, which I covered in Jade Bird Display’s MicroLED Compensation. I took these pictures against a black background to give high contrast.

Conclusion and Memories of Hololens 2

The pictures of the view through Orion’s waveguides are about what I would have expected from a single (for all colors) diffractive waveguide with a 70-degree FOV; in other words, they’re not very good. This also explains why I was skeptical of the reports from people given access to Orion when they didn’t mention the image quality (a problem I have had with reports on other AR/MR products, including the Apple Vision Pro). Often, the people given access, either due to a lack of understanding or a desire to keep their access to the big companies, don’t report on image quality issues.

The images through the Meta Orion optics make one wonder what Meta was trying to prove with the silicon carbide waveguides. In the end, it proved that with a lot of money, they could produce a low-quality image. There seems to be a lot of “not-invented-here” thinking and a desire to do interesting research. I’m all for big companies doing interesting research, but when it comes to making something for demonstration, I think they would be better off using the best available technology.

It reminds me of Microsoft’s Hololens 2 program, which spent huge amounts of money to produce a terrible image with a laser-scanning display (see my series on Hololens 2 image quality problems) and their “butterfly” diffractive waveguide. Below is a comparison of the Hololens 2 with laser scanning and diffractive waveguides to Lumus’s Maximus with reflective waveguides using LCOS microdisplays (from Exclusive: Lumus Maximus 2K x 2K Per Eye, >3000 Nits, 50° FOV with Through-the-Optics Pictures). The image on the right likely cost hundreds of millions more to develop.

Karl Guttag