First, I plan to go to CES 2023, January 5th through the 8th. If you have business interests or would just like to meet to discuss technology, email me at email@example.com. The setup for AR is going to be very different from what it was before the pandemic. Unfortunately, South Hall will not have a dedicated AR area as in previous years. Not having a special section will certainly make things tougher, and I expect there will be fewer AR companies. I’ve been told that AR companies are more likely to be at the Venetian (Sands) Expo.
When I first heard Niantic had an “Outdoor Reference Design,” I thought it might be Lumus, but I had no evidence. It was obvious Niantic was using a waveguide, and Lumus has about the only waveguide technology bright enough for reasonable outdoor use without very dark sunglasses. Lumus has been used in outdoor military applications for about a decade.
Then I saw some speculation on the Reddit group r/AR_MR_XR (right) that Lumus might be in the Niantic design, which sparked me to see if I could prove it. I went frame by frame through Niantic’s video to catch a few frames without the sunglass cover. Then I found a video of Niantic at the Snapdragon Summit 2022 with a better view of the headset prototype.
In May 2021, I was given exclusive access to the Lumus Maximus headset prototype. I took many pictures of the headset both from the outside and through the optics, some of which were used in Exclusive: Lumus Maximus 2K x 2K Per Eye, >3000 Nits, 50° FOV with Through-the-Optics Pictures. I ended up taking many photos of the Maximus headset at different angles that I could match up with the still frames from the Niantic video. The Maximus prototype was significantly better than any AR headset I have seen before or since in terms of brightness, efficiency, and image quality.
Whenever I see a new AR headset, I’m always curious about its type of optics. Most companies today use birdbath optics, but thinner, more glasses-like designs use waveguides. All waveguides may look similar to the untrained eye, but every waveguide design differs in where it injects the light/image and how the light spreads through the waveguide on its way to the eye. Even among diffractive waveguides, each company’s waveguide has different “signatures.” They differ in the amount of light in the front projection (“eye glow”) and in the diffraction grating patterns as light reflects off them. Below are diffractive waveguides from WaveOptics (bought by Snap), Vuzix, Dispelix, and DigiLens.
Normally, diffractive waveguides inject the light nearly perpendicular to the waveguide into an entrance grating that causes the light to diffract at an angle to cause total internal reflection (TIR) within the waveguide. The projector and its optics are more or less directly behind the entrance grating (WaveOptics often has the projector parallel to the waveguide but with a 90-degree fold mirror). Below is a typical diffractive waveguide taken from a Magic Leap patent application.
Lumus waveguides use reflection to direct the light rather than diffraction. They use an injection prism cut at an angle to get the light to enter the waveguide at the angle necessary for TIR. The projector tends to be at an angle that is not perpendicular to the waveguide. The angle at which the light enters the waveguide is one of the major tells in the Niantic design (more later).
The image with the most detail I could find turned out to be in the background of Maryam Sabour, GM and Head of Business AR headsets at Niantic, speaking at Snapdragon Summit 2022. Behind the speaker (right) was the best image I have found (likely a 3-D model) of the prototype without the sunglasses cover.
A crop of the headset from the Snapdragon video is compared to a picture of the Lumus Maximus below. In this first comparison, you can see macro features like the size and shape of the projector module, electronics, and dual cables. Also, note the black rectangles pointed at by the small red arrows in the corner of each waveguide; this is where the projector module meets the inside of the waveguide.
Looking more closely at a single lens, you can see more detail. In particular, note how the case around the projector is tilted due to the way the Maximus injects light into the waveguide at an angle. Circled in dotted lines is a light blocker where the light enters the waveguide. The Lumus Maximus prototype has an additional outside cover (below right, pointed at by an arrow) not on the Niantic prototype.
The viewing angles of the Niantic and Maximus prototypes are slightly different. Additionally, Niantic used a headband, whereas the Maximus used a glasses form factor. Still, the projector and electronics modules look to be very similar.
The next comparison will be from a still frame (right) in Niantic’s Lightship x Snapdragon Spaces Video. Most people in the video wear sunglass shields which hide the design details, but one person does not have them on. Even without the sunglasses, there is a lot of glare, making it hard to see through the waveguide, so only in one or two of the still frames can you barely make out the engine through the waveguide.
The video states that they are showing “actual headset gameplay.” This means the display seen by the person without the sunglasses is bright enough to be viewed in bright daylight.
I also never saw any front projection (“eye glow”) common with diffractive waveguides anywhere in the video. Lumus’s reflective waveguides inherently project only a very small amount of light forward, and then only over a small angle, compared to most diffractive waveguides. Eye glow is a major issue for distraction and social acceptability. As has been noted with the HoloLens 2’s use in the military, the front projection will give away a person’s position:
“Criticisms, according to the employee who dictated to Insider excerpts of this report, included that the device’s glow from the display was visible from hundreds of meters away, which could give away the position of the wearer,” Business Insider, October 2022
I had a picture I had taken in May 2021 of Dr. Aviv Frommer, Executive VP of R&D at Lumus, wearing the Maximus (left) at a similar angle to the Niantic prototype in the video (center). I then cut out the Maximus glasses and overlaid them on the Niantic prototype. Because of the sunlight glare on the waveguides, it is harder to see the projector module than in the picture taken from the Niantic Snapdragon video (shown earlier), but it is there if you look carefully (see red arrows).
In the GIF below, the Lumus Maximus is alternately overlaid and removed so you can see how well they match, particularly at the projector locations (pointed at by the red arrows).
This section is a rewriting of a section I wrote in an article about Magic Leap 2’s dimming feature.
The table on the right was created from various sources and shows the amount of real-world light in nits (cd/m²) that reaches the eye in various conditions. The human eye can see over a very wide range of brightness. Typically, in well-lit rooms, the things you look at are in the 20 to 150 nits range. Outdoors, much of what you see lit by sunlight is between 500 and 10,000 nits. At night or in a dimly lit room, a person with time to adapt their eyes can easily see things at less than 0.1 nits.
Simple contrast is the ratio of the lightest to the darkest. When using AR glasses, the light of the real world adds to both the darkest and the lightest parts of the image. With most glasses, the real world’s brightness (I_world) is dimmed by the transmissivity of the glasses:
I_black = I_world × transmissivity,
where I_black is the net light from the real world at a given pixel area that reaches the eye. The contrast in AR is then given by:
contrast = (I_display + I_black) / I_black
At about 1.5:1 contrast, text is barely readable in an image. At 2:1, text is more readable, but colors are extremely washed out. At 8:1, colors become moderately saturated (important if you want your Pokémon Go Pikachus to be yellow). To watch a movie, one would like more than 100:1. There are two ways to improve contrast: 1) brighten the display, and 2) reduce the background (with passive or active dimming). Flat panels like monitors and smartphones have screens with light-absorbing characteristics to greatly reduce ambient light, but they are not trying to optically mix the real world with a virtual one.
For example, assume you are outdoors looking at green grass in the sun at about 2,500 nits and want a decent image with color (8:1 contrast). You would want a 2,500-nit display and then dim (sunshade) the real world by about 4x.
With a 100-nit display, typical of most OLED birdbath designs, the sunglasses would have to block more than 99% of the real-world light to get 8:1 contrast. With 99% blocking sunglasses, it would be dangerous to walk outside as you could barely see anything.
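The contrast arithmetic above is simple enough to sketch in a few lines of Python (the function names are my own, not from any AR toolkit). The sketch reproduces the 100-nit OLED birdbath case, where well over 99% of the real-world light must be blocked to reach 8:1 contrast:

```python
def ar_contrast(display_nits, world_nits, transmissivity):
    """AR contrast = (I_display + I_black) / I_black."""
    i_black = world_nits * transmissivity  # real-world light reaching the eye
    return (display_nits + i_black) / i_black

def required_transmissivity(display_nits, world_nits, target_contrast):
    """Solve the contrast equation for the glasses' transmissivity:
    I_black = I_display / (contrast - 1)."""
    return display_nits / ((target_contrast - 1) * world_nits)

# A 2,500-nit display against 2,500-nit sunlit grass with no dimming
# gives only 2:1 contrast -- readable but extremely washed-out color.
print(ar_contrast(2500, 2500, 1.0))  # 2.0

# A 100-nit OLED birdbath against the same grass needs ~99.4%-blocking
# sunglasses to reach 8:1 contrast.
t = required_transmissivity(100, 2500, 8)
print(f"block {100 * (1 - t):.1f}% of real-world light")  # block 99.4% ...
```

This also makes clear why dimming alone cannot rescue a dim display: the transmissivity needed scales directly with display brightness, so a 100-nit display needs 25 times more blocking than a 2,500-nit one for the same contrast.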
These numbers show why using AR outdoors is such a challenge. Even with sunglasses/dimming, thousands of nits need to go from the display to the eye to support both the virtual image and the real world. A display of more than 3,000 nits, such as the Lumus Maximus, ends up requiring more reasonable amounts of dimming.
The brightness required for practical use also drives the need for a very efficient design. Typically, for the same field of view and eye box, the Lumus Maximus is about 3 to 7 times more efficient than diffractive waveguides and, in terms of Watts-in versus nits-to-the-eye, more than 30 times more efficient than OLED-based birdbaths. In an outdoor headset, efficiency is critical to brightness due to battery size, weight, and, more importantly, heat dissipation.
I want to be clear that “reference designs” come and go. They are a way for companies to say they are looking into something and want to give a heads-up. Often, reference designs are not complete or designed for production, and they don’t have to be affordable to consumers. They may signal what the company wants to see in a device. They may be fishing for hardware partners to help finish the design and make it production-worthy, or for software partners to develop applications. At the time of publishing this article, Niantic has not said why it is showing the Outdoor Reference Design.
As the evidence mounted, I contacted Niantic and Lumus. Niantic responded, “we’re not commenting on this at this time.” Lumus responded that they “could not comment on which specific companies they may or may not be working with.” These are the responses I expected. If they wanted to announce some arrangement, they would have already. Rarely do larger companies give credit to the technology they use from smaller companies. About the best a small company can hope for is to push for and get a quote from the larger system company to use in a “joint” news release.
In electronics, ingredient branding (or In-Branding) was made famous by Intel with the “Intel Inside” campaign in 1991. Intel held a strong enough position that it was able to require companies to put the “Intel Inside” logo on the case of their PCs. Since then, many component companies have gone for In-Branding.
Interestingly, even though I would argue that the optics were more critical to the design, Niantic featured Qualcomm’s “Powered by Snapdragon” in its Outdoor AR Headset reference design announcement. But then, it is not about technology; it is about who has the business and marketing power.
I’m not being critical here. I have been at a large company (Texas Instruments) and at startups selling components to both larger and smaller companies, and I have dealt with this issue many times. I am giving my observations based on my experience. I would like to see small companies get more credit when it is due.
Through the 11 years of this blog, I have seen most of the AR display and optics technologies. Based on what I have seen, the Lumus Maximus has significant advantages in terms of brightness, efficiency, size, weight, and image quality for outdoor use. I don’t know about cost, but Lumus claims they are cost-competitive with diffractive waveguides.
I think it is pure folly to use a display technology with less than 2,000 nits to the eye for outdoor use. There is no way to balance the dimming required, and this threshold eliminates most other solutions. At these brightness levels, nits to the eye per Watt becomes a major issue. Lumus is the only company I know of that has demonstrated more than 2,000 nits to the eye per Watt for a greater-than-50-degree image with greater than 85% transparency. Additionally, the Lumus Maximus has extremely little “front projection” compared to most other technologies. If you think someone else can better meet these requirements, let me know, and I will publish the information.
Frankly, I was baffled as to why Snap reportedly paid $500M for WaveOptics. WaveOptics was one of the better diffractive waveguides, but it was one of many companies with diffractive waveguide technology. WaveOptics was nowhere near the Lumus Maximus in brightness, efficiency, lack of front projection, or image quality, not to mention that Lumus appears to have better cleared its I.P. space. For $500M, they should have bought a technology that was a clear leader.
While on the subject of Snap buying companies, not only did Snap buy WaveOptics, but Snap also bought the LCOS and MicroLED company Compound Photonics, news this blog broke on January 7th, 2022. The Lumus Maximus I tried back in 2021 had a Compound Photonics LCOS device inside. Fortunately for Lumus and Niantic, there are many other LCOS companies worldwide, including, among others, Raontech, Himax, Omnivision, and my old company Syndiant. While it will take some design changes, they are somewhat interchangeable.
As with WaveOptics, I don’t see where Snap got a major strategic advantage that justified buying a company. There are many LCOS and, for that matter, MicroLED companies. Snap seems to be buying companies in reaction to Meta (aka Facebook) and Apple buying up display and optics companies. With Meta, Apple, and Snap buying up companies, it does add extra effort to dealing with the potential need for redesign.
The blog has a history of being the first to identify a key display or optics technology being used inside a prototype or end product. Most headset makers use displays and optics developed by other companies and rarely acknowledge whose technology they are using. So it is a bit of a game to figure out what’s inside from the publicly available information.
Back in 2013, this blog identified that a Himax LCOS panel was used in Google Glass which caused Himax’s stock market cap to jump by about $213M in one day. More recently, in May 2021, this blog identified that Snap was using WaveOptics waveguides. In 2016, I wrote many articles on Magic Leap before it was introduced. By November 2016, I could accurately predict the display and optics technology nearly two years before the Magic Leap One was introduced.
Someone (who wants to remain anonymous) has lent me their Meta Quest Pro to analyze the display, optics, and AR passthrough quality. As you may have heard, the AR passthrough on the Quest Pro is very bad. I will try to quantify with pictures how bad. It should be fun.
So you rarely ever comment on eMagin’s (EMAN) displays, which many feel are the best and the brightest currently available. Why is that? Thanks
I primarily comment on things that are used in AR-type products or that I see demonstrated at shows related to AR.
The main use of OLED microdisplays in AR is in the many birdbath-based headsets, such as those from Nreal and Lenovo. Most of them started out using Sony OLED microdisplays, and the newer ones out of China appear to have switched to one or more Chinese designs.
If I find a product with eMagin in it or see them at a show, I will be happy to report on them.
Thanks for writing so many excellent articles on AR/VR. I have read your articles comparing Lumus reflective waveguides and diffractive waveguides. It looks like Lumus reflective is better than diffractive in every measurable way, and the Lumus waveguide has been out there for many years. How come most of the major tech companies choose diffractive rather than Lumus reflective? One thing I can think of is their mfg processes. Diffractive is compatible with semiconductor mfg processes that can be scaled up to make millions of parts for consumers, while Lumus uses a non-conventional, seemingly more difficult approach, based on what is shown on their website. What is your opinion on this? Thanks.
Many companies can make diffractive waveguides, and there are many types of gratings and variations within those types. As you wrote, the image quality, particularly the color uniformity, has not been good with diffractive waveguides compared to Lumus’s reflective waveguides. Lumus has been used in many headsets, particularly at the higher end, such as military applications.
Also, as you wrote, it is commonly thought that diffractive waveguides would be less expensive to make, but there is not much evidence that this is true. Diffractive waveguides have their manufacturing problems as well. Typically, two or three diffractive waveguides are required for full color. Additionally, you must accept that even a “good” device has poor color uniformity. Each waveguide has to have a diffraction grating formed on it by either a photographic process, photolithography, or micro-printing. The waveguides must then be perfectly aligned and glued together with air gaps between them. I do not see “low-priced” diffractive AR headsets in the market, suggesting that the waveguides are not inexpensive to make.
In some ways, Lumus’s approach is simpler. They coat layers of glass with a uniform coating, glue the layers together into a sandwich, and then cut the sandwich on a diagonal. Conceptually, it could be argued that the Lumus method is simpler to make. At last week’s AR/VR/MR show, I asked a representative of Schott, one of the world’s leaders in optical glass and the manufacturer of the new Lumus 2D waveguides, about manufacturability, and they said that the Lumus waveguides were manufacturable.
So I would say the jury is out on whether Diffractive or Reflective waveguides are less expensive. Nobody has proven their case with products yet.
Thanks, Karl, for these excellent articles on AR. Would you please recommend some VR-related blogs that are as insightful as yours?
I’m not particularly familiar with VR blogs. The only one I am familiar with is SadlyItsBradley YouTube channel (https://www.youtube.com/@SadlyItsBradley) and its related https://sadlyinreality.com/ blog.
Excellent analysis! Thank you.
In your opinion, will MicroLed replace LCOS in AR?
I don’t see MicroLEDs being competitive with LCOS on resolution and image quality for many years. MicroLEDs are struggling to support color, and their pixel-to-pixel uniformity is poor (very “grainy” images).
At least in theory, MicroLEDs hold most of the long-term advantages, but it could take more than 10 years for them to become competitive with LCOS in resolution, image quality, and cost. Currently, MicroLEDs are most advantaged in, and are finding their way into, green-only “data snacking” products with low amounts of content, because dark pixels take almost no power.
Highly appreciated! Professional and insightful, best optics blog ever.