Meta Ray-Ban AR Glasses Show Lumus Waveguide Structures in Leaked Video

Introduction

Just a short note today. I'm in the middle of a trip that ends in Eindhoven next week (see the notice at the end of this article), and I just heard that Meta had leaked a video on YouTube of its monocular AR glasses, which are expected to be unveiled at Meta Connect tomorrow, September 17th, at 5 PM PST during Mark Zuckerberg's keynote. Although the original leaked video has been removed from YouTube, multiple copies were captured before it was taken down.

This blog reported on May 11, 2025, that Meta was likely using a Lumus waveguide for its upcoming AR glasses, codenamed Hypernova (see: Meta's Hypernova Optics – Likely a Lumus Z-Lens Variant). While the codename was Hypernova, the product is expected to be named Celest. Bloomberg has reported that the Celest AR glasses will cost about $800 for the base model without prescription correction.

Evidence of Lumus Z-Like Waveguide in the Video

Below left is a still frame I captured from the video, showing the diagonal "slats" that form part of Lumus's 2-D pupil expansion. The video appears to show a rotating 3-D model, so I was a bit surprised that they revealed the slat details.

The 3-D model makes the slats appear much more visible than they are in real life. On May 21, 2025, I identified similar pupil-expansion slats in Rivet's AR glasses in "Exclusive: Rivet Industries Using Lumus Waveguides for Military & Industrial AR." The images above, middle and far right, compare the Rivet AR glasses to a Lumus Z-Lens waveguide. There is clear evidence that Lumus waveguides are being used in both Meta's first consumer AR product and Rivet's military AR glasses.

In Closing: Look for the Lumus Waveguide in Mark Zuckerberg's Meta Connect Keynote

I should be able to watch Zuckerberg's live presentation on the 17th, where he is expected to showcase the Celest glasses. Hopefully, you will be able to spot the Lumus waveguides when he puts the glasses on.

MicroLED-AR/VR Connect In Eindhoven, Netherlands, September 23-25

I've been asked to speak at and partner with the joint MicroLED and AR/VR Connect conference and exhibition in Eindhoven, the Netherlands, on September 23-25, 2025. The conference operates as a single event with two tracks, so attendees can attend presentations from both. The first day, September 23rd, consists of master classes and tours of some local labs. The conference proper is on September 24-25th. I'm speaking at 5:20 PM on the 24th.

The Conference is offering my readers a €150 discount if they use the discount code KarlARVR. This code is valid for both the "Virtual Pass" and the "Hybrid Pass" (see the lists below), covering the virtual and/or physical conferences, which can also be attended remotely (this blog receives remuneration for the use of this code).

MicroLED and AR/VR Connect have a 12-month pass program that offers a “Virtual Pass,” which allows attendees unlimited 12-month access to recorded presentations at both the virtual (March) and physical (September) conferences, as well as other materials outlined below. The “Hybrid Pass” includes everything from the Virtual Pass, plus attending the physical Conference in person. The lists below outline the two passes in more detail:

MicroLED and AR/VR Connect Hybrid Annual Pass

  • Admission to the full MicroLED Connect conference and exhibition event in Eindhoven (Netherlands), including food & beverages
  • On-site drinks & networking receptions 
  • All online MicroLED Connect events or online versions of onsite events for 12 months
  • Entire online masterclass portfolio for 12 months (optional)
  • Year-round platform, including networking and business development tools, for 12 months
  • A growing library of MicroLED Connect on-demand content for 12 months

Virtual Annual Pass

  • All online MicroLED Connect events or online versions of onsite events
  • Entire online masterclass portfolio
  • Year-round platform, including networking and business development tools
  • A growing library of on-demand content
  • Option to upgrade to an on-site/hybrid pass, with the remaining credit offset against the upgrade price

Hopefully, I will have the opportunity to meet more readers in Eindhoven in September (please send meeting requests to meet@kgontech.com).

Karl Guttag

18 Comments

  1. If they use Lumus rather than an internal diffractive waveguide, does it mean that they did not have a plan B for diffractive waveguides beyond SiC, and that SiC yields push it out of any sub-$1,000 product?

    • That is an excellent question. I'm sure they have both internal and external glass-based diffractive waveguide options. From what I can find, their SiC waveguide's color uniformity is terrible, not to mention the cost. It was quite a feat for Lumus to overcome the not-invented-here (NIH) bias that goes with the ~$1.5B/month Meta is spending on AR. They had to "win" the design based on its technical advantages.

      Lumus appears to have a major advantage over diffractive waveguides when you combine efficiency (Lumus claims to be about 5 to 10x more efficient than diffractive), image quality, front projection ("eye glow"), and far-field light capture (what Meta calls "rainbow capture"). Diffractive waveguides often improve some of these issues, but at the expense of others (look, for example, at the eye glow from Google's XR glasses).

      Meta has made it clear they want to go to a wider FOV using SiC, but Lumus already has the ability to go beyond a 70-degree FOV using lower-index glass (lower than diffractive waveguides require), let alone SiC.

      The claimed issue for Lumus has been whether their technology can be made in volume. This product will be a way to prove it one way or the other.

  2. Any idea why Google's prototype smart glasses with a display are so much less bulky than Meta's? I just looked at them again side by side with pictures of the new Ray-Bans, and the difference in thickness/bulk is very noticeable.

    I'm assuming it's because Google used MicroLED rather than LCOS for the display engine, allowing them to shrink the projector. But why do you think Meta went with LCOS when companies like RayNeo already have glasses on the market that use full-color MicroLED display engines and are noticeably less bulky?

    I was seriously going to buy a pair if they looked like the glasses Google demoed, but these are a bit too bulky for my taste. Now I am seriously considering waiting a few more generations until they can make them quite a bit sleeker, however long that takes.

  3. I agree with your assessment regarding the Lumus waveguide.

    In that case, which company manufactures the LCoS? Is it Himax, OmniVision, or RaonTech?

    • I don't know, but it is most likely RaonTech, as most of the recent prototypes I have seen from Lumus use RaonTech. But Meta could have specified a different LCOS device.

  4. According to Mark Gurman, the display specs are a 20-degree field-of-view monocular display, 600×600 resolution, and up to 5,000 nits of brightness (a quick pixels-per-degree check follows this comment).

    That doesn't sound like anything Lumus currently offers; at least the FOV doesn't match. Also, there is an article stating that GoerOptics is manufacturing the glasses for Meta. Last I checked, Schott and Quanta were the manufacturing partners for Lumus.
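
    As a quick back-of-the-envelope check on those reported specs (my arithmetic, not part of Gurman's report), the implied angular resolution is:

    600 pixels ÷ 20° = 30 pixels per degree (PPD)

    For reference, one pixel per arcminute (roughly 20/20 vision) corresponds to 60 PPD, so 30 PPD is about half that.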

    • You should consider that Meta is big enough to get a custom version.

      Meta's CTO, on Adam Savage's Tested, said they are using a "geometric" waveguide, which is another way of saying reflective. I have only heard Lumus call them geometric.

      I don't think Goertek has a geometric/reflective waveguide, and certainly not one that looks like a Lumus Z-Lens. I believe Goertek assembles headsets with subsystems from other companies, not just their own engines. I also don't know whether the Meta Ray-Ban Display glasses are assembled by Goertek (they might be, but I don't know).

  5. Lots of Meta people are congratulating themselves today, but the reality is that the return on investment is utterly obscene. If this effort had been run competently in the first place, they could have released this years ago.

    Instead, the whole thing was (still is?) infested with empire builders and chancers who were more concerned with growing headcount and promoting their technically-illiterate favourites than anything else.

    If they’d instead invested what they spent in just one month into a few startups, they’d have ended up with something far better than this.

  6. Why are we calling them AR glasses, though? There's no augmented reality there; they don't even have the necessary sensors built in. Even Meta calls them "AI glasses."

    • There is no solid definition of what constitutes "augmented reality." Those with SLAM want the definition to require SLAM; those without SLAM want the broader term. We are drifting toward using "smart glasses," but then people would include glasses without a display.

      • It is possible that it could be OmniVision. I just saw a cutaway view of the projector engine, and it is very different from what Lumus has done before.

    • Google ARCore uses mono cameras to track position in a room. Once you have the camera video feed, which Meta has promised to deliver to developers soon, you could implement a SLAM tracking solution yourself. Developer access to the display is not yet confirmed, but if developers get it, you could offer AR solutions. I suspect the display does not cover enough of the field of view to do much with it, but for some AR apps that might be enough to be useful.
      For example, say you want to repair a device: you describe the problem, hold the device so that the display overlays it, and the glasses show you where the faulty part is. That should be possible with the current hardware (a rough code sketch follows below).
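
      To make that idea concrete, here is a minimal sketch of reading a 6-DoF pose from ARCore's mono-camera tracking on Android. It assumes an already-configured, resumed ARCore Session; driving the glasses' display with the result is purely hypothetical, since developer access to the display is unconfirmed.

      ```kotlin
      import com.google.ar.core.Session
      import com.google.ar.core.TrackingState

      // Minimal sketch: poll ARCore's mono-camera SLAM for the device pose.
      // Assumes a configured, resumed Session; pushing this pose to the
      // glasses' display is hypothetical (no developer API is confirmed).
      fun readDevicePose(session: Session) {
          val frame = session.update()        // latest frame and tracking state
          val camera = frame.camera
          if (camera.trackingState == TrackingState.TRACKING) {
              val pose = camera.pose          // 6-DoF camera pose in world space
              // With a pose like this, an app could anchor a marker over the
              // faulty part of a device, as in the repair example above.
              println("pos=(${pose.tx()}, ${pose.ty()}, ${pose.tz()}) " +
                      "rot=(${pose.qx()}, ${pose.qy()}, ${pose.qz()}, ${pose.qw()})")
          }
      }
      ```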
