Meta Orion AR Glasses (Pt. 1 Waveguides)

Introduction

While Meta’s announcement of the Orion prototype AR glasses at Meta Connect made big news, there were few technical details beyond it having a 70-degree field of view (FOV) and using Silicon Carbide waveguides. While they demoed to the more general technical press and “influencers,” they didn’t seem to invite the more AR- and VR-centric people who might be more analytical. Via some Meta patents, a Reddit post, and studying videos and articles, I was able to tease out some information.

This first article will concentrate on Orion’s Silicon Carbide diffractive waveguide. I have a lot of other thoughts on the mismatch of features and human factors that I will discuss in upcoming articles.

Wild Enthusiasm Stage and Lack of Technical Reviews

In the words of Yogi Berra, “It’s like deja vu all over again.” We went through this with the Apple Vision Pro, which went from being the second coming of the smartphone to almost disappearing earlier this year. This time, a more limited group of media people has been given access. There is virtually no critical analysis of the display’s image quality or the effect on the real world. I may be skeptical, but I have seen dozens of different diffractive waveguide designs, and there must be some issues, yet nothing has been reported. I expect there are problems with color uniformity and diffraction artifacts, but nothing was mentioned in any article or video. Heck, I have yet to see anyone mention the obvious eye glow problem (more on this in a bit).

The Vergecast podcast discusses some of the utility issues, and their related video, Exclusive: We tried Meta’s AR glasses with Mark Zuckerberg, gives some more information about the experience. Thankfully, unlike Meta and others showing (simulated) through-the-optics videos, The Verge clearly marked the videos as “Simulated” (screen capture on the right).

As far as I can tell, there are no true “through-the-optics” videos or pictures (likely at Meta’s request). All the images and videos I found that may look like they could have been taken through the optics have been “simulated.”

Another informative video was by Norm Chan of Adam Savage’s Tested, particularly the last two-thirds of the video, after his interview with Meta CTO Andrew Bosworth. Norm discussed that the demo was “on rails,” with limited demos in a controlled room environment. I’m going to quote Bosworth a few times in this article because he added information. While he may have been giving some level of marketing spin, he seems to be generally truthful, unlike former Hololens 2 leader Alex Kipman, who was repeatedly dishonest in his Hololens 2 presentation. I documented this in several articles, including Hololens 2 and why the resolution math fails, Alex Kipman Fibbing about the field of view, Alex Kipman’s problems at Microsoft (with references to other places where Kipman was “fibbing”), and Hololens 2 Display Evaluation (Part 2: Comparison to Hololens 1); or enter “Kipman” in this blog’s search feature.

I’m not against companies making technology demos in general. However, making a big deal about a “prototype” rather than a “product” at Meta Connect, instead of at a technical conference like Siggraph, indicates AR’s importance to Meta. It also invites comparisons to the Apple Vision Pro, which Meta probably intended.

It is also a little disappointing that they shared the demos only with selected “invited media” that, for the most part, lack deep expertise in display technology and are easily manipulated by a “good” demo (see the Appendix sections “Escaped From the Lab” and “Magic Show”). They will naturally tend to pull punches to keep access to new product announcements from Meta and other major companies. As a result, there is no information about the image quality of the virtual display or any reported issues looking through the waveguides (of which there must be some).

Eye Glow

I’ve watched hours of videos and read multiple articles, and I have yet to hear anyone mention the obvious issue of “eye glow” (front projection). Reviewers talk about the social acceptance of Orion looking like glasses and being able to see the person’s eyes, but then they don’t mention the glaring problem of the person’s eyes glowing. It stuck out to me because the eye glow is evident in all the videos and many photos, yet nobody mentioned it.

Eye glow is an issue that diffractive waveguide designers have been trying to reduce or eliminate for years; Lumus reflective waveguides, by contrast, have inherently little eye glow. Vuzix, Digilens, and Dispelix make big points about how they have reduced the problem with diffractive waveguides (see Front Projection (“Eye Glow”) and Pantoscopic Tilt to Eliminate “Eye Glow”). However, these diffractive waveguide designs with greatly reduced eye glow have relatively small (25-35 degree) FOVs. The Orion design supports a very wide 70-degree FOV while trying to fit the size of a “typical” (if bulky) glasses frame; I suspect that the design methods needed to meet the size and FOV requirements meant that the issue of “eye glow” could not be addressed.

Light Transmission (Dimming?)

The transmissivity seems to vary in the many images and videos of people wearing Orion. It’s hard to tell, but it seems to change. On the right, two frames switch back and forth, and the glasses darken as the person puts them on (from the video Orion AR Glasses: Apple’s Last Days).

Because I’m judging from videos and pictures with uncontrolled lighting, it’s impossible to know the transmissivity, but I can compare it to other AR glasses. Below are the highly transmissive Lumus Maximus glasses, with greater than 80% transmissivity, and the Hololens 2, with ~40%, compared to the two dimming levels of the Orion glasses.

Below is a still frame from a Meta video showing some of the individual parts of the Orion glasses. They appear to show unusually dark cover glass, a dimming shutter (possibly liquid crystal) with a drive circuit attached, and a stack of flat optics including the waveguide with electronics connected to it. In his video, Norm Chan stated, “My understanding is the frontmost layer can be like a polarized layer.” This is consistent with the cover “glass” (which could be plastic) looking so dark compared to the dimming shutter (an LC cell is nearly transparent, as it only changes the polarization of light).

If it does use a polarization-based dimming structure, this will cause problems when viewing polarization-based displays (such as LCD-based computer monitors and smartphones).
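To illustrate the interaction, below is a minimal sketch using Malus’s law. The head-tilt angles and the assumption of a simple linear polarizer in the dimming stack are mine for illustration; Meta has not published the design.

```python
import math

def malus_transmission(theta_deg: float) -> float:
    """Malus's law: the fraction of already-polarized light that passes
    a linear polarizer rotated theta degrees from the light's axis."""
    return math.cos(math.radians(theta_deg)) ** 2

# An LCD monitor emits polarized light, so a polarizer in the glasses'
# dimming stack passes a different fraction depending on head tilt.
for tilt in (0, 30, 45, 60, 90):
    print(f"head tilt {tilt:2d} deg -> monitor transmission {malus_transmission(tilt):.0%}")
```

In the worst case (a 90-degree relative rotation), a polarized monitor would appear nearly black through such a dimmer.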

Orion’s Unusual Diffractive Waveguides

Axel Wong‘s analysis of Meta Orion’s waveguide, which was translated and published on Reddit as Meta Orion AR Glasses: The first DEEP DIVE into the optical architecture, served as a starting point for my study of the Meta Orion optics, and I largely agree with his findings. Based on the figures he showed, his analysis was based on Meta Platforms’ (a patent holding company of Meta) US patent application 2024/0179284. Three figures from that application are shown below.

[10-08-2024 – Corrected the order of the Red, Green, and Blue inputs in Fig 10 below]

Overlapping Diffraction Gratings

It appears that Orion uses waveguides with diffraction gratings on both sides of the substrate (see FIG. 12A above). In Figure 10, the first and second “output gratings” overlap, which suggests that these gratings are on different surfaces. Based on FIGs 12A and 7C above, the gratings are on opposite sides of the same substrate. I have not seen this before with other waveguides and suspect it is a complicated/expensive process.

Hololens 1

As Axel Wong pointed out in his analysis, supporting such a wide FOV in a glasses form factor necessitated that the two large gratings overlap. Below (upper left) is the Hololens 1 waveguide, which is typical of most other diffractive waveguides. It consists of a small input grating, an (often) trapezoidal expansion grating, and a more rectangular second expansion and output/exit grating. In the Orion (upper right), the two larger gratings effectively overlap so that the waveguide fits in the eyeglasses form factor. I have roughly positioned the Hololens 1 and Orion waveguides at the same vertical location relative to the eye.

Also shown in the figure above (lower left) is Orion’s waveguide wafer, which I used to generate the outlines of the gratings, and a picture (lower right) showing the two diffraction gratings in the eye glow from Orion.

It should be noted that while the Hololens 1 has only about half the FOV of the Orion, the size of the exit gratings is similar. The size of the Hololens 1 exit grating is due to the Hololens 1 having enough eye relief to support most people wearing glasses. The farther away the eye is from the grating, the bigger the grating needs to be for a given FOV.
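The geometry behind this is simple: the exit grating has to cover the eyebox plus the cone of rays for the full FOV over the eye relief distance. Below is a back-of-envelope sketch; the FOV, eye relief, and eyebox numbers are illustrative assumptions, not measured values for either device.

```python
import math

def exit_grating_extent_mm(fov_deg: float, eye_relief_mm: float, eyebox_mm: float) -> float:
    """Minimum one-axis extent of an exit grating so the full field of
    view reaches the eye anywhere in the eyebox:
        eyebox + 2 * eye_relief * tan(FOV / 2)"""
    return eyebox_mm + 2 * eye_relief_mm * math.tan(math.radians(fov_deg / 2))

# Illustrative-only numbers: a ~30-degree horizontal FOV with generous
# ~25 mm eye relief (Hololens 1-like) vs. a ~56-degree horizontal FOV
# with shorter ~15 mm eye relief (Orion-like), both with a 10 mm eyebox.
print(f"narrow FOV, long eye relief: {exit_grating_extent_mm(30, 25, 10):.1f} mm")
print(f"wide FOV, short eye relief:  {exit_grating_extent_mm(56, 15, 10):.1f} mm")
```

With these assumptions, the two gratings come out within a few millimeters of each other, consistent with the similar exit grating sizes seen in the pictures.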

Light Entering From the “wrong side” of the waveguide

The patent application figures 12A and 7C are curious because the projector is on the opposite side of the waveguide from the eye/output. This would suggest that the projectors are outside the glasses rather than hidden in the temples on the same side of the waveguide as the eye.

Meta’s Bosworth in The WILDEST Tech I’ve Ever Tried – Meta Orion at 9:55 stated, “And so, this stack right here [pointing to the corner of the glasses of the clear plastic prototype] gets much thinner, actually, about half as thick. ‘Cause the projector comes in from the back at that point.”

Based on Bosworth’s statement, some optics route the light from the projectors in the temples to the front of the waveguides, necessitating thicker frames. Bosworth said that the next generation’s waveguides will accept light from the rear side of the waveguide. I assume that making the waveguides work this way is more difficult, or they would have already done it rather than having thicker frames on Orion.

However, Bosworth said, “There’s no bubbles. Like you throw this thing in a fish tank, you’re not gonna see anything.” This implies that everything is densely packed into the glasses, so other than saving the volume of the extra optics, there may not be a major size reduction possible. (Bosworth referenced the story of Steve Jobs dropping an iPod prototype in water, where the escaping air bubbles proved it could be made smaller.)

Disparity Correction (Shown in Patent Application but not in Orion)

Meta’s application 2024/0179284, while showing many other details of the waveguide, is directed to “disparity correction.” Bosworth discusses in several interviews (including here) that Orion does not have disparity correction but that they intend to put it in future designs. As Bosworth describes it, the disparity correction is intended to correct for any flexing of the frames (or other alignment issues) that would cause the waveguides (and their images relative to the eyes) to move. He seems to suggest that this would allow Meta to use frames that would be thinner and that might have some flex to them.

Half Circular Entrance Gratings

Wong, in the Reddit article, also noticed that the small input/entrance gratings visible on the wafer looked to be cut-off circles and commented:

However, if the coupling grating is indeed half-moon shaped, the light spot output by the light engine is also likely to be this shape. I personally guess that this design is mainly to reduce a common problem with SRG at the coupling point, that is, the secondary diffraction of the coupled light by the coupling grating.

Before the light spot of the light engine embarks on the great journey of total reflection and then entering the human eye after entering the coupling grating, a considerable part of the light will unfortunately be diffracted directly out by hitting the coupling grating again. This part of the light will cause a great energy loss, and it is also possible to hit the glass surface of the screen and then return to the grating to form ghost images.

Single Waveguide for all three colors?

Magic Leap Application Showing Three Stacked Waveguides

The patent application seems to suggest that there is a single (double-sided) waveguide for all three colors (red, green, and blue). Most larger-FOV, full-color diffractive AR glasses stack either three waveguides (red, green, and blue; examples: Hololens 1 and Magic Leap 1&2) or two waveguides (red+green and green+blue; example: Hololens 2). Dispelix has single-layer, full-color diffractive waveguides that go up to 50 degrees FOV.

Diffraction gratings have a line spacing based on the wavelengths of light they are meant to diffract. Supporting full color with such a wide FOV in a single waveguide would typically cause issues with image quality, including light fall-off in some colors and contrast losses. Unfortunately, there are no “through the optics” pictures or even subjective evaluations by an independent expert as to the image quality of Orion.
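For a concrete sense of the constraint, here is a sketch of the first-order grating equation for an input coupler at normal incidence; the grating pitch and indices are illustrative assumptions, not values from the patent. A single pitch sends red, green, and blue down the guide at very different angles, and a higher substrate index compresses that spread, leaving more angular room for the image itself.

```python
import math

def inguide_angle_deg(wavelength_nm: float, pitch_nm: float, n_sub: float) -> float:
    """First-order grating equation for normally incident light coupled
    into a substrate of index n_sub:  n_sub * sin(theta) = wavelength / pitch."""
    s = wavelength_nm / (pitch_nm * n_sub)
    if s > 1:
        raise ValueError("evanescent: not coupled at this pitch/index")
    return math.degrees(math.asin(s))

PITCH_NM = 390  # illustrative pitch, not Meta's actual design value
for n_sub in (1.8, 2.6):  # roughly high-index glass vs. silicon carbide
    angles = {c: inguide_angle_deg(wl, PITCH_NM, n_sub)
              for c, wl in (("blue", 460), ("green", 530), ("red", 620))}
    spread = angles["red"] - angles["blue"]
    print(f"n={n_sub}: " + ", ".join(f"{c} {a:.1f} deg" for c, a in angles.items())
          + f" (red-blue spread {spread:.1f} deg)")
```

With these illustrative numbers, the red-to-blue angular spread inside the guide roughly halves going from an index of 1.8 to 2.6, which is part of why a single-layer, full-color, wide-FOV waveguide is even conceivable with silicon carbide.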

Silicon Carbide Waveguide Substrate

The idea of using silicon carbide for waveguides is not unique to Meta. Below is an image from GETTING THE BIG PICTURE IN AR/VR, which discusses the advantages of using high-index materials like Lithium Niobate and Silicon Carbide to make waveguides. It is well known that going to a higher index of refraction substrate supports wider FOVs, as shown in the figure below. The problem, as Bosworth points out, is that growing silicon carbide wafers is very expensive. The wafers are also much smaller, yielding fewer waveguides per wafer. From the pictures of Meta’s wafers, they only get four waveguides per wafer, whereas a dozen or more diffractive waveguides can be made on larger and much less expensive glass wafers.
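The FOV advantage follows from total internal reflection: light inside the guide must stay between the TIR critical angle and a practical grazing limit, and a higher index lowers the critical angle, widening the usable band. Below is a rule-of-thumb sketch; the 75-degree grazing limit is my assumption, and real designs depend on the grating details.

```python
import math

def usable_band_deg(n_sub: float, grazing_limit_deg: float = 75.0) -> float:
    """Angular band inside a waveguide between the TIR critical angle,
    asin(1/n), and a practical grazing limit; a wider band supports a wider FOV."""
    critical_deg = math.degrees(math.asin(1.0 / n_sub))
    return grazing_limit_deg - critical_deg

for n_sub in (1.5, 1.8, 2.0, 2.6):  # ordinary glass ... high-index glass ... ~SiC
    crit = math.degrees(math.asin(1.0 / n_sub))
    print(f"n={n_sub}: critical angle {crit:.1f} deg, usable band {usable_band_deg(n_sub):.1f} deg")
```

By this crude measure, SiC at n≈2.6 offers roughly 1.6 times the in-guide angular band of n=1.5 glass.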

Bosworth Says “Nearly Artifact Free” with Low “Rainbow” Capture

Examples of “Rainbow Artifacts” from Diffractive Waveguides

A common issue with diffractive waveguides is that the diffraction gratings will capture light in the real world and then spread it out by wavelength like a prism, which creates a rainbow-like effect.
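The effect falls directly out of the grating equation for stray world light hitting the gratings in air. Below is a sketch with an illustrative pitch and incidence angle (both assumptions): each wavelength exits at a different angle, fanning white light into a rainbow.

```python
import math

def stray_light_exit_deg(wavelength_nm: float, incidence_deg: float,
                         pitch_nm: float, order: int = 1):
    """Diffraction of world light by a surface grating in air:
    sin(theta_out) = sin(theta_in) + order * wavelength / pitch."""
    s = math.sin(math.radians(incidence_deg)) + order * wavelength_nm / pitch_nm
    return math.degrees(math.asin(s)) if abs(s) <= 1 else None  # None: evanescent

PITCH_NM, INCIDENCE_DEG = 390, -45  # illustrative values only
for color, wl in (("blue", 460), ("green", 530), ("red", 620)):
    out = stray_light_exit_deg(wl, INCIDENCE_DEG, PITCH_NM)
    print(f"{color} ({wl} nm) exits at {out:.1f} deg" if out is not None
          else f"{color} ({wl} nm): evanescent")
```

With these numbers, blue and red leave more than 30 degrees apart, which is why a single bright light source (e.g., an overhead lamp) smears into a colored streak.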

In Adam Savage’s Tested interview (@~5:10), Bosworth said, “The waveguide itself is nano-etched into silicon carbide, which is a novel material with a super high index of refraction, which allows us to minimize the lost photons and minimize the number of photons we capture from the world, so it minimizes things like ghosting and haze and rainbow, all these artifacts, while giving you that field of view that you want. Well, it’s not artifact free; it’s very close to artifact-free.” I appreciate that while Bosworth tried to give the advantages of their waveguide technology, he immediately corrected himself when he had overstated his case (unlike Hololens’ Kipman, as cited in the Introduction). I would feel even better if they let some independent experts study it and give their opinions.

What Bosworth says about rainbows and other diffractive artifacts may be true, but I would like to see it evaluated by independent experts. Norm said in the same video, “It was a very on-rails demo with many guard rails. They walked me through this very evenly diffused lit room, so no bright lights.” I appreciate that Norm recognized he was getting at least a bit of a “magic show” demo (see appendix).


Strange Mix of a Wide FOV and Low Resolution

There was also little to no discussion in the reviews of Orion’s very low angular resolution of only 13 pixels per degree (PPD) spread over a 70-degree FOV (a topic for my next article on Orion). This works out to about a 720- by 540-pixel display resolution.
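The arithmetic behind that estimate, assuming a 4:3 aspect ratio (an assumption discussed in the comments below, not a published spec):

```python
import math

ppd, fov_diag_deg = 13, 70
diag_pixels = ppd * fov_diag_deg          # 910 pixels along the diagonal

aspect_w, aspect_h = 4, 3                 # assumed 4:3 aspect ratio
scale = diag_pixels / math.hypot(aspect_w, aspect_h)
print(f"{round(aspect_w * scale)} x {round(aspect_h * scale)}")  # -> 728 x 546, roughly 720x540
```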

Several people reported seeing a 26-PPD demo, but it was unclear whether this was in the Orion form factor or a lab-bench demo. Even 26 PPD is a fairly low angular resolution.

Optical versus Passthrough AR – Orion vs Vision Pro

Meta’s Orion demonstration is a declaration that optical AR (e.g., Orion), and not camera passthrough AR such as the Apple Vision Pro, is the long-term prize device. It makes the point that no passthrough camera and display combination can come close to competing with the real-world view in terms of dynamic range, resolution, binocular stereo, and an infinite number of focus depths.

As I have repeatedly pointed out in writing and presentations, optical AR prioritizes the view of the real world, while camera passthrough AR prioritizes the virtual image view. I think there is very little overlap in their applications. I can’t imagine anyone allowing someone out on a factory floor or onto the streets of a city in a future Apple Vision Pro-type device, but one could imagine it with something like the Meta Orion. And I think this is the point that Meta wanted to make.

Conclusions

I understand that Meta was demonstrating, in a way, “If money was not an obstacle, what could we do?” I think they were too fixated on the very wide FOV issue. I am concerned that the diffractive Silicon Carbide waveguides are not the right solution in the near or long term. They certainly can’t have a volume/consumer product with a significant “eye glow” problem.

This is a subject I have discussed many times, including in Small FOV Optical AR Discussion with Thad Starner and FOV Obsession. They have the worst of all worlds in some ways: with a very large FOV and a relatively low-resolution display, they block most of the real world for a given amount of content. With the same money, I think they could have made a more impressive demo, one that didn’t seem so far off in the future, without exotic waveguide materials. I intend to get more into the human factors and display utility in this series on Meta Orion.

Appendix: “Demos Are a Magic Show”

Seeing the way Meta introduced Orion and hearing about the crafted demos they gave reminded me of one of my earliest blog articles, from 2012, called Cynics Guide to CES – Glossary of Terms, which gave warnings about seeing demos.

Escaped From the Lab

Orion seems to fit the definition of an “escape from the lab.” Quoting from the 2012 article:

“Escaped from the lab” – This is the demonstration of a product concept that is highly impractical for any of a number of reasons including cost, lifetime/reliability, size, unrealistic setting (for example requires a special room that few could afford), and dangerous without skilled supervision.  Sometimes demos “escape from the lab” because a company’s management has sunk a lot of money into a project and a public demo is an attempt to prove to management that the concepts will at least one day appeal to consumers.

I have used this phrase a few times over the years, including for the Hololens 2 (Hololens 2 Video with Microvision “Easter Egg” Plus Some Hololens and Magic Leap Rumors), which was officially discontinued this month, although it has long been seen as a failed product. I also commented (in Magic Leap Review Part 1 – The Terrible View Through Diffraction Gratings – see my Sept. 27, 2019 comment) that the Magic Leap One was “even more of a lab project.”

Why make such a big deal about Orion, a prototype with a strange mix of features and impractically expensive components? Someone(s) is trying to prove that the product concept was worth continued investment.

Magic Show

I also warned that demos are “a magic show.”

A Wizard of Oz (visual) – Carefully controlling the lighting, image size, viewing location and/or visual content in order to hide what would be obvious defects.   Sometimes you are seeing a “magic show” that has little relationship to real world use.

I went into further detail on this subject in my early coverage of the Hololens 2, in the section “Demos are a Magic Show and why are there no other reports of problems?”:

I constantly try to remind people that “demos are a magic show.” Most people get wowed by the show or by being one of the special people to try on a new device. Many in the media may be great at writing, but they are not experts at evaluating displays. The imperfections and problems go unnoticed in a well-crafted demo by someone who is not trained to “look behind the curtain.”

The demo content is often picked to best show off a device and avoid content that might show flaws. For example, content that is busy with lots of visual “noise” will hide problems like image uniformity and dead pixels. Usually, the toughest test patterns are the simplest, as one will immediately be able to tell if something is wrong. I typically like patterns with a mostly white screen to check for uniformity and a mostly black screen to check for contrast, with some details in the patterns to show resolution and some large spots to check for unwanted reflections. For example, see my test patterns, which are free to download. When trying on a headset that supports a web browser, I will navigate to my test pattern page and select one of the test patterns.

Most of the companies that are getting early devices will have a special relationship with the manufacturer. They have a vested interest in seeing that the product succeeds either for their internal program or because they hope to develop software for the device. They certainly won’t want to be seen as causing Microsoft problems. They tend to direct their negative opinions to the manufacturer, not public forums.

Only with independent testing by people with display experience using their own test content will we understand the image quality of the Hololens 2.

Comments

  1. Why do you think they went with silicon carbide? Magic Leap 2 has achieved the same 70-degree diagonal field of view with high-index glass (with admittedly a lot more visual artifacts and only around 10% light transmissivity), but still…

    I’ve heard Norm and many others say that they used SiC to achieve a 70-degree FOV in a “glasses form factor,” but exactly what benefits did it afford them that allowed them to achieve this? Did it let them make the actual waveguides themselves smaller? Are SiC waveguides generally more efficient than ones made from high-index glass, letting the device draw less power, which allowed for smaller batteries inside the arms of the glasses? I guess I’m just not really sure how using SiC correlates to a wide FOV in a smaller form factor vs. very high-index 2.0 glass like the kind the ML2 uses.

    My other question is, why do you think they went with diffractive vs. reflective waveguides? Couldn’t they have made reflective waveguides from SiC as well, or is that not feasible? It’s hard to think of any pros for using diffractive waveguides as opposed to reflective ones. They have worse efficiency, much worse eye glow, significantly worse image quality, more visual artifacts, etc.

    Also, I spoke with a couple of people who used Orion a few days ago and asked many of the same questions that you did. I myself was also curious about the transmissivity and whether looking through the glasses significantly dimmed your view of the real world (as it appears to from the outside) and was told that

    “the lenses were very clear, very much like wearing normal glasses”

    When I asked about color uniformity when displaying white, I was also told by someone

    “I don’t recall white standing out in any particular way, the color uniformity looked really good! There was the rainbow effect around the edges but it wasn’t particularly distracting or problematic.”

    So take from both of those comments what you will..

    I have to say though that I do disagree with your comments on the whole “FOV obsession” thing

    “They have the worst of all worlds in some ways, with a very large FOV and a relatively low-resolution display; they block most of the real world for a given amount of content.”

    I’ve seen you make this analogy in the past where you claim that the most optimum seat in the movie theatre only takes up about “35-40 degrees of your horizontal fov” and due to that fact you seemingly infer/insinuate that anything beyond that is excessive for AR glasses or has a bad cost to benefit ratio. That seems like a pretty poor analogy because if you turn your head in a movie theatre the screen doesn’t start to get cut out of your field of vision.

    If the long term goal is full AR Glasses with SLAM that have the ability to anchor virtual objects in space (which is clearly the stated objective of Meta and most other tech companies I’d assume) then you want as wide an FOV as possible. The kinds of use cases that Full AR glasses would cater to (large virtual screens that can be resized and placed anywhere in your environment, 3D telepresence, visual instructions that teach you how to do things (play the piano or other instruments, teach you how to cook something etc) would necessitate a very large FOV to make those use cases viable. Nobody wants to be watching a YouTube video on a large virtual screen and have it start to get cut out of their field of vision the second they turn their head a couple inches.

    Obviously resolution and brightness are very important factors as well, but there really is a minimum viable FOV for these kinds of “full AR” use cases, and it seems to be around 70-80 degrees. And considering it’s one if not the hardest problem to solve, it makes sense that they wanted to tackle it first before focusing on other things.

    As Rahul Prasad, the product lead for Orion said “It was a science problem to get to a large field of view, the next phase is an engineering problem to get to higher resolution, higher brightness, and lower cost.”

    • The reasons given by Meta for using SiC were to lower artifacts, both with respect to the virtual image and light capture (rainbow) artifacts. As the article discusses, Meta also used overlapping expansion and exit-to-the-eye diffraction gratings. This seems to be the key to keeping it in the glasses-like form factor, but I don’t know if it would be possible with high-index glass waveguides. I think SiC was, in part, an R&D experiment to see what kind of benefits SiC would give in the future. Usually, glass with a very high index starts to yellow (I don’t know about the ML2, as they are so dark).

      Regarding reflective waveguides, I know from information provided by Schott (see: https://kguttag.com/wp-content/uploads/2024/10/Schott-Index-Glass-copy.jpg) that reflective waveguides (e.g., Lumus) require a lower index of refraction for the same FOV. Axel Wong (who works for a Chinese optics company) wrote in his Reddit article that Meta is rumored to be working on a reflective waveguide model. At wide FOVs, reflective waveguides so far have much better image quality. Lumus claims they are 5 to 10 times more efficient.

      Thanks for the added information from people who have used it. You always have to take it with a grain of salt if people are not experienced and seriously looking for things. I’m pretty sure Orion is at least 50% light blocking. That is just about the level where you have to take them on and off to see a difference; you won’t notice it if you just leave them on. Did they notice them dimming based on lighting?

      As I wrote, “demos are a magic show,” so typically they only let you see demos of things that look good. I’m wondering what rainbow effect they were seeing. From Norm Chan’s comments, the room was set up to minimize “rainbow problems.”

      As far as FOV goes, a lot depends on the application/use model. I appreciate that you have read my articles on the subject and have an “informed” argument. I may be over-compensating for the “FOV obsession” that I think stems from it being comparatively easy to support wide FOVs. Perhaps ironically, I was called out by Thad Starner at AWE for wanting too big a FOV (https://kguttag.com/2024/06/29/awe-2024-panel-the-current-state-and-future-direction-of-ar-glasses/#thad-fov). Thad has been using AR glasses for about 30 years and has done a lot of research and studies in the field. It is also well known that the eye typically wants to stay within about 30 degrees horizontally. There is a case for a wide FOV where you want to alert the user to turn their head (for example, in the military, where they want to display the source of gunshots). But once you get outside 30 degrees, most people, as you wrote, will turn their head. Typically, the main reason for very wide FOVs is immersion. I think you may have missed the point (made strongly by Thad Starner) that if you end up blocking too much of the real world, particularly a person’s hands or something that might be dangerous, then you may miss the advantages of AR. The point is you want to convey information without blocking too much of the real world; otherwise, you might as well have passthrough AR (e.g., Apple Vision Pro).

      If a 70-80 degree FOV is required for AR to be “viable,” then we are likely decades away from having viable AR glasses. It is not just the optics, but “servicing” the FOV in terms of pixel and data content, as well as enough light, and then considering the cost, size, and power (both battery and, more importantly, heat). I think there are many use cases with FOVs at 50 degrees and below.

      • Hi Karl,

        There is another waveguide company that I saw at AWE a couple of years ago, Oorym. They claimed that they have 70 degrees FOV. How were they able to achieve that?

      • I have been aware of Oorym for about 4 years and have written and talked about them several times, including in https://kguttag.com/2024/04/11/mixed-reality-at-ces-ar-vr-mr-2024-part-2-mostly-optics/ and in a video discussion: https://youtu.be/LgcLmXycr3k?t=4342

        The simple answer is that I have seen their 50 degree prototype and I’m sure 70 degrees is possible with their approach.

        Oorym was founded by one of the founders of Lumus, who has a great background in optics. It is small, and thus it takes time to prove out their technology, and their prototypes are somewhat crudely assembled. It would be interesting to see what they could do with a larger investment. The technology is much more efficient than diffractive or reflective (Lumus) waveguides and could even work with Micro-OLEDs. I look at it as a technology that is in between “waveguides” and freeform optics.

    • While high-index glass can go up to a refractive index of 2, SiC is around 2.6 in the visible range, which is significantly different. Additionally, quite often with a high-RI glass waveguide, the actual gratings are still patterned on an extra layer of polymer bonded to the glass substrate (usually with a lower RI), which results in a lower coupling efficiency. To put it simply, the efficiency of the grating is highly correlated with the RI contrast of the grating material, i.e., the RI of the grating vs. the RI of air or whatever filling material. With SiC, Meta is etching the grating structure directly on the SiC substrate, which should result in much higher coupling efficiency, in addition to the bigger FoV supported by the waveguide itself.

      With that being said, I don’t believe that SiC-based tech will become mainstream due to the cost. At least for now, a 4-inch SiC wafer costs at least 500 USD (enough to make waveguides for one eye). And to directly etch gratings on it, you’d probably need EBL, which is prohibitively expensive and slow for mass production.

      There’s indeed a rumor that Meta is planning an actual product in 2025 based on LCOS and a Lumus waveguide. Nevertheless, reflective waveguides have their own problems. One factor rarely mentioned is the production tolerance/yield. The way these embedded mirrors are produced is by stacking multiple coated glass substrates together before they are cut at a 45-degree angle and polished. One problem is that you need to keep all coated surfaces perfectly parallel to each other. This gets exponentially more challenging as the FoV/number of mirrors increases. I believe this is part of the reason why Lumus has been demonstrating relatively good performance but still is not getting into the mainstream.

      It’s an interesting idea to use SiC for reflective waveguide, which will certainly increase the FoV. However, as SiC is one of the hardest materials on earth (inferior only to B4C and diamond), I’d imagine it’s very difficult to cut and polish after the stacking process.

  2. I was hoping for an update on the Orion from you, Karl; thank you very much. Considering this long journey to a wide rollout of AR displays, do you anticipate that our current smartphones, boosted by AI, will continue to be the de facto wearable device for at least the next 5 years?

    I was very interested in the LetinAR technology and expected it to usher in this AR Display revolution, and was equally excited when new products in a seamless form factor emerged recently like the G1 glasses. https://www.scmp.com/tech/tech-trends/article/3280965/chinese-smart-glasses-maker-even-realities-puts-displays-first-alternative-vision-meta

    • I’m expecting to see a number of “AI/AR” glasses rolled out in the next year. I expect most of these will have 25 to 35 degree FOVs. Basically they will be trying to take the Meta Ray Ban glasses to the next level with a display. Most of these glasses will be using diffractive waveguides.

  3. Thanks Karl for another great analysis!

    I think Orion is an impressive research prototype of ‘what can we do if cost is not a limit’. However that is not the basis for a product. I’m sure they have parallel projects already working on something that can and will become a product eventually.

    If these glasses use double-sided gratings on Silicon Carbide, then the stated overall cost of $10k seems way too low.

    • Meta is reportedly losing about $1B per month on mixed reality, and most of the R&D is likely going into AR development. That is $1M per unit at 1,000 units per month. So the $10,000/unit is an arbitrary number; it all depends on how they allocate the cost. Best case, that is the per-unit incremental cost.

      I like to say “at some point they have to land the plane,” meaning that we don’t get to see what can and can’t be done until we see a functioning unit (and can fully evaluate it). I think the SiC waveguides are more of an R&D experiment for the future. But even if you substitute glass waveguides, then you have the low resolution. If they increase the resolution, then the processing and power goes up. Additionally, the MicroLEDs are not ready for prime time at higher resolutions.

  4. Great analysis! Any thoughts on Microsoft’s ending of the HoloLens 2? It’s apparent that enterprise customers who invested in developing products on this platform are now stuck without a paddle.

    • Hololens has been a “dead man walking” for years. They just kept putting off making it official. From what I hear, the IVAS program is (and was) also in big trouble, and the Army is looking for a good way out.

  5. I know this is a tech blog but it’s hard not to get drawn into the larger picture / what’s motivating the gargantuan investment into this approach, and how this seemingly conflicts with the reality of the technical limitations.

    Because Zuck states it pretty explicitly now: Meta missed the boat on the mobile platform wars and therefore they will accept nothing short of a complete smartphone replacement. Smart glasses as a mere peripheral of a phone (e.g. Ray-Bans) do not address the core issue for them. As a peripheral, Apple/Google could (and apparently will) simply push the same concept but with better integration. Moreover, the apparent redundancy of the Orion compute puck + armband (vs devices that Apple/Google already have in people’s pockets) is all too conspicuous, and people won’t carry around two sets of these things.

    Norman Chan (in episode 1468 of Voices of VR) makes this point as well:
    > A phone? From Meta’s perspective, they were very clear. This device, the billions of dollars they’ve spent into this, the whole Reality Labs venture, is to surpass the phone, is to bypass the phone. Like, they are so frustrated that Ray-ban Metas don’t get the priority access that Apple first-party devices get for pairing or for image transfer that you have. If people don’t know about the Ray-Ban Metas, the way you offload your images to your phone, you have to connect to a Wi-Fi network generated by the glasses. and then transfer. It is the opposite of seamless. If Apple made the Ray-Ban metas, they would just be in iCloud and be on your phone. And that’s all regulation and policy, and that’s all a whole separate conversation, but this is one of the frustrations of Apple holding all the cards with the hardware” … “They want this, and that allows them to then price it as a laptop replacement, as a phone replacement. The directive is to make this a consumer product that someone would buy instead of buying a phone, which is a tall, tall order”

    So, seeing this “money is no issue” prototype, with all of its impracticalities, and still observing nearly all of the glaring AR issues you regularly detail on your blog, it’s just really hard to see them pulling off a phone replacement. Perhaps there are some more relegated and niche use-cases.

    As a product, it might perform better than VR (not a tall order given the poor retention) but my hunch is that minimalist and complementary smart glasses as a peripheral (like the Ray-Bans, at least in the product category’s nascent form) are a better bet as a mass market product. As for Zuckerberg’s phone nightmare, perhaps he should be investing some of those billions into anti-trust lobbying (this of course assumes he doesn’t merely want to be in the same extractive position himself)

    • Everything you wrote makes sense. The problem with technology is that “intersection points,” where someone has a chance to topple an existing dominant player, don’t come at predictable intervals or in predictable ways. Generally, there are no “safe plays”; you have to figure it out before it happens. Even large companies can’t cover all the bases, and if they try, they will likely not invest deeply enough in the winning approach. IBM didn’t miss the PC altogether, but its corporate culture didn’t know how to deal with it.

      In the case of the cell phone, Nokia bet on smaller (remember flip phones?) and cheaper (focusing on China), thinking that volume was everything. Yet today’s dominant smartphones are bigger and more expensive.

      Orion and the Apple Vision Pro are what I call “plane landings,” where we get to see the state of what is possible with huge investments. Unfortunately, they mostly confirm how far away the technology is from mass production. They don’t have just one issue, but many issues, and fixing any one issue makes the other issues worse in some way.

      I agree that Meta is in a tough position in that they are left dependent on Apple’s and Google’s “tender mercies.” I get why they “need” a new intersection point and are trying to lead it, but sometimes the new intersection point is far in the future or in a different place/technology. Today, “generative AI” is seen as a new intersection point, but it might be too diffuse, with too many companies able to replicate AI. AI may replace a lot of jobs and let companies be more cost-effective, but it may not be controllable by a few companies, as we see so many companies developing their own AI technology; it may not be the barrier to entry that we saw with PCs and smartphones.

  6. Very informative. I would love to know how you determined the color ordering of the three in-coupling gratings.
    According to the patent, the color order is the reverse of what you have labeled on the image; I would like to know if you have other clues on that.

    Another thing is about the “720- by 540-pixel” resolution: do you have any idea about the aspect ratio, or is it assumed to be 4:3? Because for a waveguide display that achieves a large FOV like this, I would assume the optical architecture is closer to Magic Leap 2, which has a taller aspect ratio.

    • I apologize, I made a mistake in labeling the Red, Green, and Blue in the Fig. 10. It was a simple matter of reading the patent and seeing they went in numerical order and then labeling them from top to bottom.

      Regarding the 720×540, it was more about picking an aspect ratio to see what the resolution might be. I was not even concerned with the orientation (tall vs. wide). I picked 4:3 as it is becoming a popular aspect ratio for AR and VR, and it is in between 1:1 and 16:9. My main goal was to see if it was a Jade Bird Display 640×480 MicroLED, and no matter which aspect ratio you pick, the display appears to have more pixels than JBD’s 640×480. It is possible that Meta was “fudging” on the PPD. I don’t think it is JBD’s 1280×720 or 1920×1080 panels, as I don’t think they would fit on the board Meta showed with the optics. Meta claims it is their own design, but that may mean they contracted with JBD for a custom design. From what I hear, the deal with Plessey has fallen apart, and I’m doubtful that they are making the complex LED structure with the embedded mirrors of InfiniLED, which they bought in 2016. But anything is possible when you are talking about only 2K devices (1K times 2).

  7. Regarding “dimming”: there seem to be two versions, 13 and 26 PPD. The 26 PPD version outputs a much darker image. That’s from Norman Chan’s interview. I’m sorry if this is mentioned somewhere in the article; I’m not even halfway through yet.

    • The “dimming” I was referring to in the article is ambient light dimming. I think the current 13ppd glasses have dimming, as it seems that how dark the user’s eyes appear keeps changing.

      Yes, I did mention the 26ppd in the article.

  8. Hey Karl,

    I’ve been following your blog for a while and enjoyed the latest article. My English is a bit rusty—while I understand it perfectly, my syntax isn’t great, so GPT helped me out. I’m not a native speaker.

    I have a couple of questions. First, do you think we’ll see AR glasses this decade with around 50° FOV, smaller than something like Orion, lighter, and using waveguides? I’d love something practical for everyday use, even if it’s not perfect. I know that many technologies follow a sigmoid curve of slow then exponential growth, but with AR, there seem to be so many variables that impact performance equally. Given that, if 50° FOV isn’t realistic, what can we expect in that space?

    Second, regarding VR: is Holocake-like tech with a lighter, more compact design and varifocal focus achievable this decade? I’m happy with the Quest 3, but I’d love something lighter. If that’s not possible, what might we expect in these areas instead?

    • We are going to see a number of products that are adding displays to “Audio AI Glasses” (essentially Meta Ray Bans with displays).

      50 degrees is likely possible in the next decade, but it does get more difficult. As I stated in the article, you don’t want a wide FOV with low resolution such that you block most of the real world when conveying information. But if you, say, double the diagonal FOV but keep the pixel density the same, then you have 4 times the pixels, 4 times the display data, more processing, more communication, more heat dissipation issues, more battery, etc. Then you have the issue that once you have a wider FOV, there is the temptation/desire to support SLAM and other forms of recognition, which adds more processing and power issues. As you indicated, the reason AR is moving slowly is that there are multiple problems to be solved, and in trying to improve one issue, you end up hurting other issues.

      I think at 50-degrees, efficiency of the display and optics is of even greater concern. This may be a problem for diffractive waveguides.

      Varifocal is another tough issue. There have been many efforts to support varifocal, from electro-mechanical (both moving lenses and “liquid” lenses) to electronic (liquid crystal). If all it took was money, it would be solved by now. It is a tougher problem than, say, cell phone cameras because the lenses are larger. I thought 5 years ago that they were getting close, but here we are 5 years later, and it seems at least as far away. I think it may be one of those problems that looks easier to solve than it is. Another less obvious issue with varifocal is determining how to move the focus based on eye vergence as the eyes dart around (saccade), and there will always be some lag. I like to say, “you can know where the eyes are pointing, but not what the eye sees.”

      • So, it seems that the rumors about a lower-spec version of Orion in 2027 with 50° FOV and LCoS were a bit exaggerated given the current state of AR tech. Regarding varifocal displays, I understood the current focus was on holographic films backlit by coherent light (lasers). As far as I know, HoloLens 2 uses a laser per color to generate images, and, according to Abrash, the main challenge is the efficiency and power of these lasers.

        There was even talk that Mirrorlake could be built today, which gave me hope for more ergonomic headsets. Andrew (from Meta) said that while the volume of headsets might not shrink much (up to 2030), the weight could. But there’s no clear explanation yet of how they plan to achieve that.

        AR faces its challenges, but it reminds me of the journey of the wireless phone. It took nearly a century to miniaturize it enough to be practical, and back then, many were skeptical, saying it was almost impossible. Besides money and smart people, time is probably a crucial factor in developing breakthrough technologies. In engineering, there’s a saying that no matter how much money, staff, or logistics you throw at a project, you can’t speed it up indefinitely. And considering that each increase in complexity requires exponentially more effort, this might be beyond even the reach of tech giants. Not to mention, optics and physics are already incredibly sophisticated, and we’re working within very well-established principles.

      • I don’t know why the rumors of a “reflective waveguide” (likely Lumus) model with 50 degrees and LCOS by 2027 would be “exaggerated.” Based on what I have seen from Lumus, it should have much higher pixel density (in the 40ppd range) and better overall image quality than Orion. The display will likely be as or more power efficient. They would still have all the system issues of Orion in terms of SLAM, processing, camera mounting, communication, batteries, etc.

        If you are waiting on lasers and holograms, then you are likely in for a long wait. The problems are much bigger than cheaper lasers.

        For varifocal, using motors and lenses gives the best image quality. Using LC gives lighter weight. Meta Research has gone back and forth between the methods with each succeeding prototype. It’s still unclear how well eye tracking will work to control varifocal for a wide range of people.

        It is a common fallacy to look backward at the history of a successful product and then apply that success to a new concept. For every concept that breaks through exponentially, there are vast numbers that never make it; the failures are soon forgotten. As I repeatedly point out, there are many interlocking issues that make a mass market AR product difficult. Trying to make one that solves everything is beyond challenging. It will be interesting to see in the next year or so the many efforts to add “small” (25-35 degree) displays to AI/AR glasses. I’m also more optimistic about “enterprise,” medical, and military use of AR (but not IVAS which was a boondoggle) where there are clear financial benefits to using the technology.

      • I think he interpreted you saying “50 degrees is likely possible in the next decade” in a wearable glasses form factor as meaning that it wouldn’t be possible by the end of this decade, which is what he originally asked you. Hence why he said that the rumored 50-degree FOV AR glasses with SLAM using LCOS and reflective waveguides that Meta is planning on releasing in 2027 must be an “exaggeration.”

        Also, what is it exactly that you consider a “mass market AR product”? I see you use this term a lot and then go on to imply that such a device likely violates the laws of physics and will probably never happen at any point in the future no matter how distant (key word never) because of this.

        I have personally used Lumus’s Z Lens and Maximus prototypes and they have very good image quality. I think a pair of AR glasses with reflective waveguides (with all the benefits they enable, great image quality, very little eye glow or visual artifacts and extremely high light transmissivity) combined with the 70 degree FOV and good thermals of Orion could definitely replace using your phone for a lot of things, (especially for people who are already glasses wearers) and particularly for things like media consumption/watching YouTube etc.

        A big issue with the Vision Pro and Xreal style video glasses is that most people don’t want to take a headset on and off (or even have to put something different on at all like an Xreal type device), it’s too much of an inconvenience for the vast majority of people. I want to literally just be able to cast a YouTube video from my phone to my glasses that I’m already wearing, and then be able to resize it and pin it out in front of me wherever I want while I’m lying in bed or sitting on the couch and not have to hold my phone up in front of me.

        These kinds of “screen replacement” use cases only really make sense on something you’re already wearing all day (like a regular pair of glasses) not something you have to repeatedly take on and off like a headset. As a glasses wearer myself, I already have to wear glasses to watch a video on my phone anyway, so why wouldn’t I just cast it to my glasses and watch it on a virtual resizable screen that’s 10 times bigger if I already have to wear glasses all day to look at my phone to begin with?

        As I’ve already said, I think a pair of glasses using reflective waveguides with a 70 degree FOV would be serviceable enough for this (especially for glasses users who already have to wear glasses all day to look at screens anyway).

        I don’t think an Orion-type device (that is maybe a little bit smaller and weighs slightly less) that uses reflective waveguides while having a 70-degree FOV and good thermals is impossible or violates any laws of physics, and imo that is all that is needed for viable “mass market” full AR glasses.

        Orion has already achieved 2 out of 4 of those feats (having great thermals in a glasses form factor while supporting a wide field of view and SLAM). If you’re able to couple that with reflective waveguides (which are even more efficient) and slightly reduce the size and weight by 20-30 grams then you are essentially home imo.

        Obviously, as you correctly pointed out, even an Orion-type device using reflective waveguides would need an increased resolution to achieve the same angular resolution as the Maximus/Z-Lens prototypes while supporting a 70-degree FOV, causing trade-offs with thermals, battery size, weight, etc., but those are tractable problems over time and are not “physics issues” per se in the same way optics are.

        Even if battery tech were to grind to a halt from this point forward, (and I don’t think it will) we would still be able to get better battery performance for the same sized batteries (or smaller) that can currently fit inside a glasses form factor device like Orion, and both simultaneously increase battery life and resolution/brightness over the coming decades due to increases in chip efficiency.

        That would leave optics as the only real “physics problem”, and if you made reflective waveguides with a 70 degree FOV (by using materials with a high index of refraction like lithium niobate, silicon carbide etc) while supporting around 40-50PPD then you have essentially solved the optics problem (as reflective waveguides have almost no eye glow, very little visual artifacts or dimming of the real world, great color uniformity, etc).

        Sure, we may not see anything like this on the market within the next 10 or 15 years, but I see no reason that the device with the specs I’m describing violates any “laws of physics”.

  9. Great article as usual! I would just add one thing: demos are not always for technical reasons. By showcasing Orion, Meta gave the impression it is far beyond Apple in the AR race and managed to create a bit of hype around XR again. This is good for Meta’s business and also for our business, because more attention for XR means more work for all of us 🙂
    The risk is giving the impression that it is arriving very soon… some people have already gotten the idea that Orion is close, while true AR glasses are still many years away.

  10. […] Meta Orion AR Glasses (Pt. 1 Waveguides) – KGOnTech: Karl Guttag offers an in-depth exploration of the optics of Meta’s Orion glasses. As the author notes, the company is still very discreet about the technical details of this device. Published: October 12, 2024 – 8:47 am […]
