
Two recent LinkedIn articles by Axel Wong inspired this article. The first, Decoding the Optical Architecture of Meta’s Next-Gen AR Glasses: Possibly Reflective Waveguide—And Why It Has to Cost Over $1,000, discussed the use of a reflective/geometric (Lumus, or a Lumus-clone) waveguide with an LCOS display in Meta’s Hypernova AR/AI glasses. Axel’s article was partially based on information from an article by Bloomberg’s Mark Gurman and on information he gleaned from a supplier in China. Axel’s second article, Google’s New AR Glasses: Optical Design, Microdisplay Choices, and Supplier Insights, said that the Google AR glasses shown in a recent TED Talk by Shahram Izadi likely used a diffractive waveguide by Applied Materials.
In the video, Google’s Shahram Izadi held up an optical engine that he said was full color. Axel’s article on the Google AI/AR glasses speculated that it uses either an LCOS or an X-Cube MicroLED engine, but seemed to favor LCOS. Axel also wrote that the waveguide was likely from Applied Materials. As will be discussed, I think it is highly likely that Google’s AR glasses are using Avegant’s 20-degree LCOS optical engine, which Avegant presented at SPIE’s AR/VR/MR 2025.
The Meta and Google AR/AI glasses have some major similarities: both are monocular (a single display for the right eye), use a full-color LCOS microdisplay, and use waveguides. It has been reported that Meta is putting the display in the lower right corner of the user’s view, whereas it appears from Google’s TED talk that their display is roughly centered vertically. Perhaps the biggest difference is that Meta is likely using a Lumus reflective waveguide, perhaps with a 30° FOV, whereas Google is using a diffractive waveguide with a likely 20° FOV.

This blog has a tradition of identifying technology inside Google’s AR glasses. Back in February 2013, I correctly identified a Himax LCOS device in the original Google Glass. Stock market analyst Mark Gomes wrote an article on Seeking Alpha based on my findings, causing Himax’s stock to jump by $223 M. See my 2013 article Google Glass and Himax Whirlwind.
I’m off to SID Display Week this week. If you have a product or concept or want to discuss a topic from this blog at Display Week 2025, please email me at meet@kgontech.com. I’m planning on being there every day of the exhibits (May 13-15), but my calendar is getting pretty full. I may also have time to meet before the exhibition on Monday, May 12th. I’ve partnered with SID to share my insights from Display Week (DW) — past, present, and future. If you’re planning to attend Display Week, SID has provided the code DW25KARL for a free exhibit hall pass.
While I contacted Lumus and Avegant in preparation for this article, both companies said they could not comment. Therefore, this article is based on the available public evidence, including that provided in Axel Wong’s articles and Bloomberg’s Mark Gurman’s article on Meta Hypernova, plus other sources and my experience. While I think I am correct, I’m not 100% sure.
I should also note that while Meta’s Hypernova is expected to be a product for sale in 2025, it is unclear whether the Google AR glasses are anything more than a lab prototype or perhaps a reference design and developer platform for their Android XR partners.
I should also point out that I have not had a chance to evaluate either Meta’s Hypernova or Google’s XR Glasses. Having followed Lumus’s and Avegant’s developments, I have some idea of what the images may look like, but I have not seen the complete designs.
Meta’s and Google’s devices are strikingly similar in that both are monocular (single-display) and both likely use LCOS as the display device.
The most obvious reasons to go monocular are to save on cost and weight. Another advantage is that there is no need for IPD adjustment. Mark Gurman has further written that the display is in the “lower right corner of the right lens.”
Long-time AR glasses user, researcher, and Google advisor Thad Starner has advocated for a monocular display on the lower outside (discussed in AWE 2024 Panel: The Current State and Future Direction of AR Glasses), exactly what Meta Hypernova appears to be doing. The advantage of the lower outside corner for the virtual image is that it keeps it out of the way of the user’s forward vision. Thad makes the point that you don’t want a message popping up and blocking your vision at a critical time. From a human factors point of view, if the display is not going to be centered horizontally, then it is best to have it below center, as humans can look down much more easily than look up.
Having binocular displays pretty much forces them to be in the center, although they could be in the lower half if there is a worry about blocking forward vision. Still, if they are not in the center of the forward view, they will be uncomfortable to use for long periods, and the wearer will obviously be looking down. So I would expect that if, as Gurman’s article stated, Meta is planning a binocular Hypernova 2, they would center the virtual image in the user’s forward view.
With all the discussion about MicroLEDs for AR glasses today, it may seem strange that both Meta and Google use LCOS. There are many sound technical reasons for choosing LCOS, particularly for a full-color display, including:
For many of the reasons above, while LCOS does not get the media attention and corporate investment of MicroLEDs, it is likely to remain the best option for full-color AR headsets for some time (perhaps a long time).
The chart below on the left was presented by Meta’s Hartlove at Display Week 2024 and SPIE AR/VR/MR 2025. Their chart shows LCOS as the “Ready Technology,” MicroLEDs as the “Anticipated Technology,” and laser scanning as the “Final Solution” (with a question mark). Below right, contrasting dramatically, is a slide from a 2023-04-27 SPIE AR/VR/MR fireside chat with Trilite, presented by Bernard Kress of Google. Kress’s slide shows power efficiency versus Average Pixel Lit (effectively the average picture level, APL); similar to my comment above, Kress shows the MicroLED/LCOS power crossover at ~12% APL (not true today for full color, maybe in the future), but then he shows that with “MiniLED local dimming” the crossover could move out dramatically. It’s also notable that Kress, when he was a technical leader on Microsoft HoloLens, worked with laser scanning on the HoloLens 2.
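The crossover Kress shows can be illustrated with a toy model (my own simplification for illustration, not taken from either chart, and the power numbers are hypothetical): LCOS illumination runs at roughly constant power regardless of content, while MicroLED power scales with the fraction of pixels lit (the average picture level, APL).

```python
# Toy model (my assumptions, illustration only): LCOS illumination draws
# roughly constant power for any content, while MicroLED power scales with
# APL, the fraction of pixels lit.
LCOS_MW = 100.0           # hypothetical constant LCOS illumination power (mW)
MICROLED_FULL_MW = 800.0  # hypothetical MicroLED power with all pixels lit (mW)

def microled_mw(apl):
    # MicroLED pixels only draw power where they emit light
    return MICROLED_FULL_MW * apl

# Below the crossover APL, MicroLED draws less power; above it, LCOS wins
crossover_apl = LCOS_MW / MICROLED_FULL_MW
print(f"crossover at {crossover_apl:.1%} APL")  # -> crossover at 12.5% APL
```

With these made-up numbers, the crossover lands near the ~12% figure on Kress’s slide; “MiniLED local dimming” would, in effect, make the LCOS power line slope downward at low APL, pushing the crossover to much lower APL values.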


Thus, we seem to have highly contrasting views on the future of AR display technology. A cynical person might suggest that Hartlove’s roadmap may be meant to entice Meta’s management to invest in their R&D programs, more than it is tied to business reality. Kress’s (Google) chart may reflect the scars from his work on HoloLens 2.
I used some information published by Axel on Reddit in October 2024 in my article, Meta Orion AR Glasses (Pt. 1 Waveguides). In that same Reddit article, Axel wrote (with my bold highlighting):
There were rumors before that Meta would launch new glasses with a 2D reflective (array) waveguide optical solution and LCoS optical engine in 2024-2025. With the announcement of Orion, I personally think this possibility has not disappeared and still exists. After all, Orion will not and cannot be sold to ordinary consumers. Meta may launch another reduced-spec version of reflective waveguide AR glasses for sale, which is still an early adopter version for developers or geeks, but it is speculated that this reflective waveguide version is also likely to be a transition, and will eventually return to surface relief grating (SRG) diffraction waveguides.
It seems to be a bit of an open secret, at least in China, that Meta is likely using a Lumus reflective waveguide in a prototype, if not a final product. I have seen several Chinese companies try to copy Lumus’s reflective waveguides over the years, and the image quality has not been very good compared to Lumus’s. Furthermore, I can’t see why Meta would risk a patent challenge from Lumus if they went with a copy (these points are also made in Axel’s LinkedIn article).
I have been following Lumus’s progress since before I started this blog in 2011. In January, at SPIE AR/VR/MR 2025, I met with Lumus to see their newly announced Z30 (30° FOV) Z-Lens. They were using a 720 x 720-pixel LCOS microdisplay.
I should note that the Z30 waveguide I saw had the display centered vertically and was part of a binocular design. The image was not below center or off to the right, as Bloomberg’s Mark Gurman’s Hypernova article reported. Thus, what Meta is using would not be exactly the Z30, but a customized version.
Shown below left is the 30° Z30 waveguide with its attached projector engine, alongside the older 50° Maximus and Z50 waveguides. While the Maximus and Z50 are further away, you should be able to tell that the Z30 projector engine is much smaller. Lumus, in their January announcement, said the Z30 prototype outputs to the eye are >3,000 Nits/WattLED, and Lumus told me that they expect to more than double this efficiency with design improvements. [See my Caution on Using Nits/WattLED in the Appendix, as these numbers are not comparable between companies]


Above right is a quick handheld picture I took through the Z30; I doubt I had the camera aligned well (I was holding the glasses in one hand and the camera in the other), and the background was not solid black, which causes some of the apparent uniformity variation. Still, the color uniformity is much better than that of a typical diffractive waveguide.
I missed getting a picture of myself wearing the Z30-based glasses, which have smaller projectors and frames. The picture below shows me wearing the Z50 (50-degree) prototypes. The Z50 and Z30 have only very slight “eye glow” and are about 90% transparent. The very slight eye glow seen in the picture is with a nearly full-white, bright image, not a typical image.


A key issue for any consumer AR glasses is incorporating prescription correction. Lumus’s Z-Lens waveguides propagate TIR light at a shallower angle than diffractive waveguides, which enables not only the use of significantly lower index of refraction glass, but also the direct bonding of push-pull lenses that change the virtual image focus distance and provide prescription correction.
Due to surface features and to maintain TIR, diffraction-type waveguides require an air gap. This adds complexity and makes the combination of lenses and waveguides thicker. Having surfaces with an air gap means there are additional surfaces that can cause ghosts and require anti-reflective coatings to reduce them. There is not enough information available to determine whether the Google XR Glasses with diffraction waveguides will address prescription correction.
Lumus has partnered with AddOptics to provide the push-pull lenses that incorporate prescription correction. Lumus demonstrated the combination to me at AWE 2024.

Meta’s Orion AR Glasses demonstration in late 2024 used very expensive, hard-to-manufacture Silicon Carbide (SiC) waveguides. As discussed in Meta Orion AR Glasses (Pt. 1 Waveguides), they used SiC because of its high index of refraction, which is required to support a 70-degree FOV with diffractive waveguides. At AR/VR/MR 2025, Lumus was claiming that, due to the shallower TIR angle of the Z-Lens, they can support a 70° FOV with their glass waveguides.
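The role of the refractive index can be sketched with the TIR critical angle, θc = arcsin(1/n): a higher index gives a smaller critical angle, so a wider cone of ray angles stays trapped in the waveguide, which is what supports a larger FOV. The index values below are nominal figures I’m assuming for illustration, not specs from Meta or Lumus.

```python
import math

def critical_angle_deg(n):
    """TIR critical angle against air: rays shallower than this escape."""
    return math.degrees(math.asin(1.0 / n))

# Nominal indices (my assumptions): ordinary glass ~1.5, high-index glass
# ~1.9, silicon carbide ~2.6. Higher n -> smaller critical angle -> more
# ray angles guided -> larger supportable FOV per waveguide.
for name, n in [("glass", 1.5), ("high-index glass", 1.9), ("SiC", 2.6)]:
    print(f"{name:>16} (n={n}): TIR above {critical_angle_deg(n):.1f} deg")
```

The roughly 19° of extra guided-angle range between ordinary glass and SiC is why diffractive designs lean on exotic high-index substrates for wide FOVs, while Lumus argues its reflective architecture can get there in conventional glass.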
I’ve seen many negative comments about Meta Hypernova’s rumored price range of $1,000 to $1,400. As I wrote about the Apple Vision Pro, People Who Say the AVP’s $3,499 price is too high lack historical perspective. I’m not as worried about the price as I am about the functionality and usefulness. If the product is useful and finds a market, there is no reason the cost should not come down significantly. The price reflects that Hypernova is testing the waters more than being a full-blown consumer product.

Axel Wong’s LinkedIn article on the Google XR TED talk strongly suggested that Google is using waveguides from both Applied Materials (AMAT) and a supplier “based in Shanghai, China.” In his article, Axel was unsure about the optical engine. As will be discussed below, I believe Avegant designed the optical engine, based on its size and shape and on the recent partnership between AMAT and Avegant.
In their respective presentations at AR/VR/MR 2025, both Avegant and AMAT discussed their new partnership. Avegant presented several slides discussing their new 20° FOV monocular LCOS engine in a development kit jointly developed with AMAT.
Below are the Avegant/AMAT development kit X-ray diagram (top) and a still frame (bottom) from Google’s video showing an exploded view of their XR glasses. They are remarkably similar. I have also added an inset of just the optical-engine portion of Avegant’s development kit next to Google’s optical engine. While the projector portion may look bent in the Google exploded view, this may be simply due to the arrangement of the electrical components around the optics to better fit Google’s frames.

On both diagrams above, the projector is on the side of the right eye (left side of the diagram). Avegant’s diagram shows microphones on the side of the left eye, whereas Google’s diagram has a hole for a camera. I’m a big believer that a camera is going to be essential for any “AR with AI” glasses. The location of the image on one side and the camera on the opposite side of the frames, while it makes sense in terms of fitting them into the frame, is poor in terms of using the display as a camera viewfinder due to parallax.
I have added labels pointing to some of the components in Google’s design. They appear to have a set of push-pull lenses on either side of the waveguide, but I don’t know if they have or will support prescription correction. All in all, it looks like Google took the Avegant and AMAT development kit and then modified it slightly to fit their needs.
Below, I have combined information from two spec slides presented by Avegant at AR/VR/MR 2025. As I will caution in the appendix, you can’t objectively compare the Nits/WattLED numbers between Avegant and the Lumus figures shown earlier. Lumus’s Z30 has ~2.25x the FOV area ((30/20) squared), plus there are many other factors. What you can see is that Nits/WattLED falls off dramatically with brightness: 600 nits / 24 mWLED = 25,000 Nits/WattLED, whereas 3,000 nits / 266 mWLED ≈ 11,278 Nits/WattLED, less than half the efficiency.
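As a rough sketch of why these figures aren’t directly comparable across designs (this normalization is my own, not one either company uses): nits measure luminance, so at equal optical efficiency, a system spreading light over a larger FOV needs proportionally more power to reach the same nits.

```python
def nits_per_watt_led(nits_to_eye, led_mw):
    # The metric both companies quote: nits at the eye per electrical LED watt
    return nits_to_eye / (led_mw / 1000.0)

def fov_area_weighted(nits_per_watt, fov_deg, ref_fov_deg=20.0):
    # Small-angle approximation: FOV area scales as the square of the FOV
    # angle, so a 30-deg system gets a (30/20)**2 = 2.25x area credit
    # relative to a 20-deg system.
    return nits_per_watt * (fov_deg / ref_fov_deg) ** 2

# Illustrative only -- even area-weighted numbers ignore display size, duty
# cycle, exit-pupil/eyebox size, and measurement-method differences.
print(fov_area_weighted(11_000, 30))  # -> 24750.0
```

Even this crude FOV-area weighting brings the two companies’ headline numbers much closer together, which is exactly why the raw Nits/WattLED figures should not be read as a head-to-head ranking.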

These efficiency numbers on the surface seem very good/high for a diffractive waveguide, even with the comparatively small FOV. Avegant makes a point that they have made significant efforts to make their optical engine very efficient.
Avegant has devised clever ways to reduce the size of its LCOS projector engines. It seems that every year at AR/VR/MR, Avegant presents an ever-smaller engine. The optical path (taken from Avegant’s website) of their newer projectors is shown below. A key trick is how they use a dichroic combiner/“waveguide” to route light down the projector lens, avoiding the polarizing beam splitter used in traditional LCOS projector engines.

We are seeing many AR glasses at different stages: product concepts such as Meta’s Orion, developer devices such as Snap’s Spectacles 5, and products from many other companies.
Meta’s Hypernova appears to be a high-end, limited-volume consumer product to test the market. Based on how Hypernova is received, Meta could either focus on driving volume to reduce cost or focus on the (rumored) binocular Hypernova 2. Meta is highly unlikely to bring Orion’s Silicon Carbide to the mainstream market in the next several years. Still, Meta is clearly serious about AR glasses and plans to spend about $20B this year on AR and VR, with the bulk of the investment going into AR.
Google’s intention for its AR glasses is much less clear. Google has developed a reputation for entering and exiting markets, particularly when it comes to hardware. It’s not clear whether Google will build a hardware product itself or leave it to one of its Android XR partners; in this case, the Google AR glasses shown in the TED Talk might be a reference design for their partners.
One thing is for sure: the glasses form factor space is heating up, with the giants Meta and Google starting to show products that are likely to be sold to consumers in the next six months to a year. Rumors of Apple entering the Optical See-Through AR fray are also starting to heat up again.
LCOS still has major physics advantages over MicroLEDs with waveguides when displaying full-color images, such as those from camera viewfinders, photographs, or web browsing. This is particularly true when trying to achieve the 1,000 to 3,000 nits to the eye necessary for practical use in daylight. While MicroLEDs seem to garner most of the attention, and while LCOS has drawbacks due to field-sequential color breakup and lower contrast, LCOS is likely to be the most practical display technology for full-color and higher-resolution AR for at least the next few years, and perhaps for many years.
One thing I don’t understand is why Lumus and Avegant are still independent companies, given the numerous acquisitions of companies in the AR space over the last ten years. I have seen massive spending and acquisitions in the area of MicroLED displays by Meta, Google, and Apple. Yet, almost nothing has happened beyond Snap buying the struggling Compound Photonics in LCOS.
Both Lumus and Avegant have given Nits/WattLED specs for their projector and waveguide combinations. I want to caution everyone that while the numbers give some idea of the efficiency of the projector and optics, there are so many other variables that they are likely not comparable.
While “nits/WattLED” may sound like an objective number, many factors make it impossible to compare numbers that sound the same from different companies. Unfortunately, every company measures differently. These factors include:
I would not say the numbers are meaningless, and I am still happy that some companies report their Nits/WattLED. It at least gives some idea as to whether the AR glasses might work outdoors. The values are probably more comparable within products from the same company than they are between two very different designs.
Wow, this is a really great article! But when it comes to these waveguides, the price is a big factor, and the FOV is usually so limited and small.
That’s why display glasses like Xreal One Pro with 57° still have a strong following. Plus, the whole one-eye thing is definitely a turn-off for a lot of typical customers.
But hey, who knows? Let’s see how things play out for these companies.
I’ve seen enough smoke and tea leaves to suggest Google is using Vuzix waveguides and not Applied Materials for their glasses, but time will tell. Great article as per usual, Karl.
Thanks Mike. I saw that Vuzix has announced a couple of “specialty” contracts for hundreds of thousands of dollars, but both are for specialty products. The most recent one, for “thermal smart glasses,” I think is from ThermalGlass (ThermalGlass.com), which I met with at CES 2023 and discussed with Brad Lynch on YouTube (https://www.youtube.com/watch?v=LgcLmXycr3k&t=4939s). They have also announced deals with Augmex, which is more in the enterprise solution space.
The other interesting development with Vuzix is their purchase of a Silicon Valley R&D waveguide facility, likely from a big company. I guess this could be a form of “buy the facility, run it, and sell back to us deal” from a big company that decided they didn’t want to manufacture waveguides.
The reasons I don’t think it is Vuzix are: A) Axel Wong’s Chinese source said it was AMAT and/or a Chinese company; B) the eye glow from the waveguides in the Google glasses TED talk doesn’t look like Vuzix’s. Vuzix has been working to reduce eye glow, and their only full-color waveguide designs to date use DLP with bigger and bulkier engines. And as my article points out, the projector engine looks a lot like Avegant’s.
I know people know whose facility Vuzix bought (how many R&D waveguide facilities can there be in Milpitas?), but they are not talking. I hope to find out this week at Display Week.
Karl
I heard the Milpitas facility could have been Magic Leaps, but who knows.
Great article as always Karl!
Agree that it is a bit awkward that Lumus and Avegant have not been acquired yet. What puzzles me even more is that Meta, spending 10s of Billions per year on its XR program, seemingly relies on these small players for core XR HW components to realise a commercial product. Makes me wonder where all that money goes to.
Dan, it’s going to research, and not even the research you think. It’s great that we have Karl to break down the components and prototypes, etc. But there is another level of research related to manufacturing that really isn’t reported on. The companies I have worked for all have your traditional “R&D” budgets to figure out processes to build XYZ prototype. But then you have to research and build literally the machines and scaled procedures that will manufacture, at scale, the crucial widgets and assembly processes to churn out hundreds of thousands of your now-XYZ product. And a lot of times that can be a barrier.
Hi Karl,
I saw this and thought of you:
“One of the main hurdles to the commercialization of AR glasses has been the waveguide. In AR optics, the lens itself also serves as a “highway of light,” guiding virtual images directly to the user’s eye. Due to chromatic dispersion, conventional designs have required separate waveguide layers for red, green, and blue light—three to six stacked glass sheets—inevitably increasing both weight and thickness.
Professor Junsuk Rho and colleagues at POSTECH have eliminated the need for multiple layers by developing an achromatic metagrating that handles all colors in a single glass layer. The key is an array of nanoscale silicon-nitride (Si3N4) pillars whose geometry was finely tuned by a stochastic topology-optimization algorithm to steer light with maximum efficiency.”
linky: https://phys.org/news/2025-05-glass-full-millimeter-waveguide-augmented.html
Thanks for all the interesting content 🙂
I think it’s just about worth noting that – as you detailed extensively before – Meta’s silicon carbide waveguides don’t actually enable a 70 degree (per engine) FOV like Lumus’ reflective ones do. Meta Orion used SiC waveguides, and it *also* did a 70 degree projection, by overlapping projection regions from two engines per eye. You’d know better than I would how many extra degrees the higher RI adds, and whether that margin was necessary to accomplish what Orion did. The overall impression I got from your coverage and others was that the SiC was in the end more about reducing the artifacts and drawbacks of diffractive waveguides.
Karl- Thanks for your article. Does Himax LCOS play any role in the new glasses?
It’s a good question, and I will be looking to find out.
Himax is still out there, but I mostly see new designs using Raontech’s LCOS. I don’t know the reason why, as Himax was originally the go-to LCOS for the HoloLens 1 and Google Glass. I occasionally see Omnivision LCOS in designs, but once again, mostly Raontech LCOS in recent years.
I also see Citizen Fine Device, which makes FLCOS (ferroelectric or “fast” LCOS), in a few designs. FLCOS is mostly used in designs that need a higher switching speed; Creal’s light field display takes advantage of FLCOS.
Additionally, while I see Himax discussing their Front-Lit LCOS at conferences, I have not seen it in any products or even prototype glasses. Himax’s basic Front-Lit concept goes back at least as far as 2014, over 11 years. I’m sure they have improved the concept, most likely in efficiency and uniformity, but I still haven’t seen it make its way into products. The main thing Front-Lit does is eliminate the beam-splitting prism of a classical LCOS design. Others, such as Magic Leap (in the Magic Leap 2) and Avegant (see: https://kguttag.com/2022/01/31/magic-leap-2-at-spie-ar-vr-mr-2022/), have found other ways around needing a beam splitter.
[…] qui est sûrement un « design de référence » pour les futures modèles comme l’avance Karl Guttag dans son analyse technique. Elles sont présentées comme un co-développement avec Samsung, mais semblent très peu […]
Google Translate from French: “which is surely a “reference design” for future models as Karl Guttag suggests in his technical analysis. They are presented as a co-development with Samsung, but seem very little […]”
Hi Karl, great article. Do you have a newsletter?
No, just the blog and occasional YouTube videos.
With that off-axis, reduced-FOV, monocular format, I wonder if, after all is said and done, they are using the old “Focals by North” scheme. Hard to say without a see-through image with some bright off-axis light sources (which would show hologram rainbows or the edges of reflective waveguide elements).
Meta CTO Bosworth stated on Adam Savage’s Tested that the glasses use a “geometric” (= reflective = Lumus-style) waveguide and an LCOS microdisplay.
Yes, just seen the interview with Chris Cox (Chief Product Officer) on Bloomberg Tech referring to the reflective waveguide, so that solves that one. Interesting interview, should be on YouTube soon 😉