Following up on Part 1 of the Nreal SDK and LG product teardown, in this part I’m going to show a lot of pictures of the internal components. In some cases, I’m literally going to peel the optics apart layer by layer.
I’m going to start by showing the various components inside the Nreal. I’m then going to go through the optics diagram shown in part one and discuss the efficiency/throughput of the light from the display and the real world.
Nreal does some interesting tricks using quarter waveplates (QWPs) and polarized light to improve efficiency and reduce unwanted effects. Going through all the details is instructive in understanding the limitations of the birdbath design that is finding its way into many announced products.
As I wrote last time, this series is not meant to be so much about Nreal itself, but the overall design of birdbath optics and the design and performance trade-off associated with this type of design. One value of doing a detailed analysis is that you will better understand the trade-offs in similar designs.
The Nreal SDK model and the LG product use Sony's ECX335 Micro-OLED with 1920 by 1080 pixels and a 0.71″ display diagonal. The two designs look to be nearly identical.
This was a little surprising since Nreal stated back in June 2020 that their production devices would be using BOE's 1080p Micro-OLED. BOE claims to have a 1080p Micro-OLED with the same 0.71” diagonal as the Sony ECX335. For some reason, these products shipping in late 2020 were still using the Sony device. Qualcomm, in their reference design announcement for a similar type of AR headset, stated, “BOE provides a micro-OLED binocular display.”
It seems some companies are hoping to switch to BOE, presumably for lower cost, but it did not happen before Nreal went into production.
Sony dominates the current market for OLED microdisplays. Sony's OLEDs are in many, if not most, cameras with electronic viewfinders (EVFs). EVFs are currently a much larger market for OLED microdisplays (>4 million units per year) than AR headsets. Even Sony's camera rivals, Canon (see right) and Nikon, appear to use Sony's Micro-OLEDs.
Lest we forget the elephant in the AR room: Apple. There have been multiple rumors that Apple plans to use a Sony Micro-OLED microdisplay in a Mixed Reality headset. At least one analyst, Ross Young, identified the panel as a high-brightness variant of the Sony 1280×960 ECX337.
A word of warning, there are many conflicting rumors as to Apple’s AR and VR plans. Some rumors have Apple developing their own OLED Microdisplay. Others think Apple will develop a lightweight “passthrough AR” (VR display with cameras to passthrough video).
To be clear, I highly doubt that Apple will be using anything like Nreal-like birdbath optics with Micro-OLEDs. While a cost-effective design with reasonably good image quality, I don’t think it is sleek enough for Apple. At the same time, as I discussed in a three-part series (Part 1, Part 2, and Part 3) last year, Micro-OLEDs are not bright enough to support most thin waveguides.
I will be using a mixture of pictures taken with the Nreal SDK (red frame) and the Nreal-LG product (dark gray frame) to show the Nreal glasses’ assembly.
Starting from the Sony 1080p Micro-OLED, a plastic absorptive polarizer is glued/taped directly on the OLED’s glass cover.
An ~13mm focal length glass lens is glued on top of the polarizer. The glass lens for one eye weighs 6 to 7 grams, and the two lenses combined contribute ~13 grams, or ~15% of the total 88-gram weight of the Nreal glasses. The lenses combined weigh the same as the magnesium alloy front frame. As will be seen in pictures later, the lens causes barrel distortion of the image, to be corrected by the curved mirror in the birdbath.
The image on the right shows the OLED with polarizer and lens glued to it (left), the Sony Micro-OLED with them removed, and then (right) the separated lens and polarizer. A side view of the whole assembly is shown in the bottom right corner.
The pre-polarizer on the OLED may seem redundant with the beam splitter, as the beam splitter will polarize the light anyway. But the polarizer on the OLED blocks light that would never make it to the eye from entering the birdbath cavity, where it could only cause problematic reflections (double images and contrast loss).
The pictures on the left show the birdbath after it has been broken apart. First, there is a polarizing beam splitter. It consists of 3 plastic films (more on the films in a bit) glued on top of a thin (~0.75mm) piece of glass.
Next is a QWP plastic film sandwiched against the curved mirror, which is a 40% reflective / 60% transmissive partial mirror. The mirror is made of ~1.4mm of plastic and has a coating deposited on the outer curved surface. Because light passes through the plastic before reflecting, it forms what is known as a Mangin mirror, which can correct for spherical aberrations. Based on some pictures to be shown later, this mirror also appears to correct the distortion caused by the 13mm lens on the OLED.
The beam splitter is held at ~45 degrees relative to the OLED and partial mirror by a thin plastic frame. As will be seen in pictures later, the beam splitter is also slightly tilted horizontally.
An otherwise flat QWP is sandwiched against the curved mirror and held in with just two tabs on the left and right sides. Between the tabs and being sandwiched, the thin QWP somewhat takes on the curvature of the mirror.
The pictures on the right show the QWP assembly in the frame with the beam splitter removed.
My first thought was that Nreal might be using a Moxtek wire grid polarizing beam splitter. Moxtek's beam splitters are very high quality and offer very high contrast when blocking polarized light. But Moxtek beam splitters tend to be expensive and relatively fragile.
When dissecting the beam splitter, I found it had at least one film. It turns out Asahi Kasei makes a wire grid polarizing film used in some of ODG's birdbaths. But then I discovered there was not one, but three plastic films laminated onto the glass substrate.
The top two films appear to be two different dichroic polarizing mirror films. Each film reflects better in different parts of the spectrum. The bottom (3rd) film is an absorptive polarizer.
In the top right picture, I have stripped off the top two films on the right-hand side and taped them partially overlapping to the beamsplitter's glass substrate. You can then see the combined effect of the normal beam splitter on the far left side and the two films individually on the right side. In the top picture, the beamsplitter is perpendicular to polarized light shining from beneath the assembly. Note how the two films pass different colors (magenta and greenish), suggesting a dichroic film.
In the bottom right picture, the beam splitter is at about 45 degrees, as it would be in the birdbath. Dichroic films are very angle-sensitive, and now both films reflect most of the light. You can see that it is not as dark as it is with all three layers, including the absorptive polarizer. Some of the light is not being reflected and thus lost.
The beamsplitter has many surfaces to it, and each surface causes some light loss. The top two dichroic polarizing mirror films are slightly “sloppy” and reflect some light polarized the “wrong” way and block some “correctly” polarized light when they should pass it. The absorptive polarizer acts as a cleanup to keep that light from exiting downward and causing some unwanted double image reflections in the glass substrate after it.
The three plastic films are mounted on a very thin piece of glass that is near the eye. Because the plastic films are taped/glued to the glass, this should provide at least a modicum of shatter prevention. It was pointed out to me that others might choose a thicker plastic material to meet safety standards.
Below is a picture of the Nreal frame, made out of what appears to be a magnesium alloy. The frame weighs about 13 grams, the same as the two glass lenses combined. Note that there are brush-like thermal conduction pads. The Sony OLED has about 1 Watt of power going to it at full brightness for an all-white image. The Nreal headset is only passively cooled with no air holes, so the front of the frame is the only significant mechanism for dissipating heat.
The front frame holds two absorptive polarizers with a QWP film laminated on them. In the bottom right picture below, I removed about half the film for evaluation. It turns out this QWP film counteracts the QWP film inside the curved mirror in such a way that the light that passes through the front polarizer will also pass through the beam splitter. In this way, there is no additional polarization loss of the real-world light.
The outer cover is made of two thin pieces of laminated plastic.
The sequence of three pictures below shows how the image changes in shape and brightness as it moves through the birdbath. I selectively removed some components to give views for comparison. For all three pictures, only the camera shutter speed was changed, to get the same brightness for the left-hand display image. Thus the ratio of shutter speeds indicates the relative brightness of each image. This method is very rough, with about 30% error, because camera shutter settings step in discrete (logarithmic) increments.
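The shutter-speed method above can be sketched in a few lines. This is only an illustration of the arithmetic; the exposure times used below are made-up examples, not the actual camera settings from the photographs.

```python
# Estimate relative image brightness from the shutter speeds needed to
# reach the same exposure: brightness is inversely proportional to the
# exposure time that produced an equally bright photograph.
def relative_brightness(t_ref: float, t: float) -> float:
    """Brightness of an image relative to the reference, given the
    exposure times (in seconds) that produced equally bright photos."""
    return t_ref / t

# Illustrative example: if the direct OLED view needed 1/500 s and the
# eye-side view needed 1/60 s, the eye-side image is roughly 12% as
# bright as the direct view.
print(round(relative_brightness(1 / 500, 1 / 60), 3))
```

Because cameras step shutter speed in discrete fractions of a stop, any estimate made this way inherits the ~30% error the article mentions.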
The first picture (below) shows the view from underneath, looking at the OLED with the polarizer and glass lens on it. The beam splitter has been removed on the left side to give a direct view of the OLED with the polarizer and glass lens. There is some bulging (barrel distortion) caused by the glass lens. You can also see a little bit of the image in the mirror from this view.
At this point, the light from the OLED has been reduced by about 50% of the output of the OLED, mostly due to the polarizer. I should note that the light is not fully polarized at this point, probably due to the very wide angle (Lambertian) rays going through the polarizer on top of the OLED.
Next, we have a view from the front. In this case, I have removed the front polarizer on the left side (left above) to see the output after the mirror. The image on the front mirror and the lens on the OLED do not line up perfectly, so the beam splitter must be slightly tilted. The image still has the same shape (slight barrel distortion) passing through the mirror.
Without the front polarizer, the image is more than twice as bright as the image that will go to the eye. And it is about 60% as bright as before due primarily to the 60% partial mirror. About 4% of the light at this point is not polarized, and so about 5 nits will make it past the front polarizer (right side with front polarizer).
And in the 3rd view below, we have the view to the eye. The curved mirror has corrected the barrel distortion of the lens, and the image again has straight edges. This suggests that the mirror has some correction designed into it. About 120 nits, or about 14% of the OLED's nits, make it to the eye.
Also, note above that you can see some of the glass lens. It is lit up slightly by the light going through it. I'm told that with a slightly more complex two-lens design, the lens would not be seen.
While it might be hard to tell in the small picture to the eye above, the right side of the image is slightly smaller at the top than at the bottom. I suspect this is caused by the slight tilt of the beam splitter.
As I wrote last time, the inherent amount of front projection has me wondering what a functioning Lenovo ThinkReality A3 will look like without a front polarizer (stills below from video). It looks like these Lenovo videos were filmed with dummy (or off) headsets, and the real product will have darkening and/or polarizing covers.
For reference from the last article, below is what Nreal looks like with and without the front polarizer (but in a less well-lit room).
The diagram below was shown in part 1. I have added some more detail in the figure and description. This diagram will also be used when describing the various optical losses.
The light path from the display to the eye is shown in blue and numbered 1 to 8. The green path (labeled A to G) shows real-world light. “Undesirable” paths, including “front projection,” “down projection,” and “upward reflection,” are also shown.
Following the blue path and numbers above: Unpolarized light from the OLED passes through a (pre-)polarizer (1) and then through a single glass lens with about a 13mm focal length (2). The light is then reflected off a polarizing beam splitter (3). The beam splitter consists of two layers of dichroic polarizing mirror films on top of an absorptive polarizer film on a thin (~0.75mm) glass substrate. Light from the beam splitter goes through (4) a thin plastic quarter waveplate (QWP). The plastic curved Mangin Mirror has an ~40% reflective coating (5) on the surface farthest from the eye. The mirror has a focal length of approximately 20mm but appears to be slightly aspherical. The reflected light then passes back through the quarter waveplate (6). After two passes through the quarter waveplate, the polarized light is rotated by 90 degrees and will pass through the beamsplitter films (7) and glass (8) to the eye.
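The cumulative throughput of the display path is just the product of the per-stage transmissions. As a sketch, the per-stage numbers below are my illustrative estimates chosen to be consistent with the ~14% net figure measured later; they are not measurements of the actual components.

```python
# Rough display-path throughput: multiply the (estimated) transmission
# of each stage of the numbered light path. All stage values here are
# illustrative assumptions, not measured data.
display_path = {
    "pre-polarizer (polarization + parasitic)": 0.48,
    "glass lens": 0.95,
    "beamsplitter reflection": 0.85,
    "QWP (two passes)": 0.95,
    "40% partial mirror reflection": 0.40,
    "beamsplitter pass-through + glass": 0.90,
}

throughput = 1.0
for stage, t in display_path.items():
    throughput *= t
    print(f"{stage:45s} x{t:.2f} -> running total {throughput:.3f}")

print(f"net display throughput ~{throughput:.1%}")
```

Note how a chain of individually benign losses (85%, 90%, 95%) compounds: with these assumed values, the net lands near 13%, in the neighborhood of the article's measured ~14%.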
Light from the real world (in green) goes through two layers of clear cover (A) and an absorptive polarizing film (B) to polarize the incoming light in a direction that will pass through the beamsplitter. There is a QWP film (C) applied on the polarizer (B). About 60% of the light passes through the curved partial mirror (D) and then through the QWP (E). The QWP (C) has the opposite retardation of QWP (E, 4 & 6 of the display path), and in effect, they cancel each other out. QWP (C) is necessary so that the real-world light will pass efficiently through the beamsplitter films (F) and glass (G) to the eye.
The two tables below show the rough estimations of the light transmission for each of the various components. As the Nreal design uses polarized light, the numbers below count the polarization loss (roughly 50%) on the first polarizer encountered in the light path so it is not double-counted. For example, the Pre-Polarizer (on the OLED) counts as 50% loss due to polarization plus some parasitic losses for a net of 48%. For other polarizing elements in the same path, only the parasitics are counted.
Even with “ideal” components, the loss would be 50% for polarization plus the 40%-reflective mirror, for a net of 20%. Even with very expensive components, the best case would likely be less than 17% efficient (nothing is perfect). The key point here is that while 14% throughput may seem low, it is not that far off what is possible to achieve.
The real-world-to-the-eye light path is dominated by the polarization loss (50%) and the 60% transmissive partial mirror, for a best-case theoretical net of 30%. Once again, you could play with the mirror reflection/pass ratio to improve the real-world light, but only at the expense of severely hurting the display light throughput.
The best display or real-world light path with a polarizing beam splitter is ~50%. But to get 50% on one path means the other path has 0% transmission, and you quickly reach diminishing returns as the ratio becomes more asymmetrical. If they had a very bright display device, they might consider going 30/70 or even 20/80.
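The trade-off between the two paths can be seen directly by sweeping the mirror's reflectivity. This is an idealized sketch (nominal 50% polarization loss only, parasitic losses ignored), not a model of the actual components.

```python
# Idealized display vs. real-world throughput as a function of the
# partial mirror's reflectivity R (its transmission is 1 - R). Each
# path also pays a nominal 50% polarization loss; parasitic losses
# are deliberately ignored in this sketch.
def ideal_throughput(r_mirror: float) -> tuple[float, float]:
    display = 0.5 * r_mirror            # display light reflects off the mirror
    real_world = 0.5 * (1 - r_mirror)   # real-world light passes through it
    return display, real_world

for r in (0.5, 0.4, 0.3, 0.2):
    d, w = ideal_throughput(r)
    print(f"mirror {r:.0%} reflective: display {d:.0%}, real world {w:.0%}")
```

Going from Nreal's 40/60 split to 20/80 halves the display light (20% to 10%) while the real-world light only rises from 30% to 40% — the diminishing returns described above.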
The other type of birdbath (described in the appendix of Part 1 and shown below), where you look through the beam splitter alone, does not have this trade-off. This type of birdbath's theoretical limit is 50% in both paths. But it also has other problems, described in Part 1. In a practical design, you might get about 40% to 45% real-world light transmission, still not really “transparent.” You also need a front polarizer with this type of design, or there will be significant front projection.
It may at first seem counterproductive to polarize the OLED’s light and thus lose nominally 50% of the light. The reason is that the light has to reflect off the beam splitter and then later pass through the same beam splitter. If a 50/50 non-polarizing beam splitter were used, 50% of the light would be lost on each pass, and thus only 50% x 50% = 25% would get through. Because the QWP manipulates the polarized light, it is more efficient as there is only a single 50% loss (nominally) of light due to polarization.
The effect of the combination of the front polarizer, the QWP (QWP1) on the front polarizer, and the QWP used on the inside of the partial mirror is shown in a series of photographs (right). There is a polarizer aligned with the polarization of the beam splitter before the camera. To show the effect of QWP1, I have removed that film on the right side. Note the second photograph, particularly how, without QWP1, the mirror’s QWP would cause light to be blocked.
The effect of QWPs is a function of their rotational orientation relative to the polarized light. This effect is demonstrated in the three bottom pictures.
QWPs are usually made of thin plastic films/sheets and are very transparent. When oriented correctly, a QWP retards one polarization component by a quarter-wave, converting linearly polarized light into left- or right-circularly polarized light. Passing through a QWP twice in the same direction, effectively a half-wave of retardation, rotates linearly polarized light by 90 degrees. Reflecting off a mirror flips right circular polarization to left circular polarization and vice versa, whereas linearly polarized light reflecting off a mirror stays in the same polarization. The ability of QWPs (and half-wave plates) to manipulate polarized light makes them extremely useful in birdbaths and other optics.
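The 90-degree rotation after a double pass can be checked with Jones calculus. A minimal numpy sketch (modeling the round trip simply as two passes through the same waveplate, and ignoring the mirror's coordinate-handedness conventions):

```python
import numpy as np

def qwp(theta: float) -> np.ndarray:
    """Jones matrix of a quarter waveplate with its fast axis at angle
    theta from horizontal: R(theta) @ diag(1, i) @ R(-theta)."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    retarder = np.diag([1, 1j])   # quarter-wave phase delay on one axis
    return rot @ retarder @ rot.T

horizontal = np.array([1, 0], dtype=complex)  # linearly polarized input

# One pass through a QWP at 45 degrees: linear -> circular polarization
# (equal-magnitude components, 90 degrees out of phase).
one_pass = qwp(np.pi / 4) @ horizontal
print(np.round(one_pass, 3))

# Two passes act as a half-wave plate: the linear polarization is
# rotated by 90 degrees, from horizontal to vertical.
two_pass = qwp(np.pi / 4) @ qwp(np.pi / 4) @ horizontal
print(np.round(np.abs(two_pass), 3))
```

This is exactly the trick the birdbath relies on: the display light that reflected off the polarizing beamsplitter comes back, after the mirror round trip, rotated 90 degrees so it now passes through the same beamsplitter.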
As I wrote last time, for a full white image, the Lumus Maximus appears to be about 30 times more efficient than Nreal. The reasons are very complex, which is why it is going to take at least a whole article to explain them.
A point I failed to make is that with an emissive display like an OLED used for AR, the image content should be sparse most of the time so the user can see the real world. If you are using an AR display as a monitor replacement, then you are doing something wrong. For most applications, less than 10% of the pixels may be turned on. So in practical use, the efficiency gap could narrow considerably, by more than 10x.
These numbers show the tight design box of such a design. The biggest long-term problem is the amount of light it blocks to the eyes. By the standards of AR optics, the efficiency is much better than most when using a display with reasonably diffuse (Lambertian) light. This is why it often gets paired up with OLED-microdisplays. The physics of why Lambertian-like displays don’t work well with waveguides is a topic for another article.
The major designer-controlled variable is the percentage coating of the partial mirror. Nreal's partial mirror is 40% reflective and 60% transmissive. If the coating were changed to 80% transmissive and 20% reflective, the light from the display would be halved, but the real-world light to the eye would only go up from 27% to 35%. And at this point, you are at severely diminishing returns for making the mirror more transmissive.
Taking out the polarizing front cover with the QWP would result in much worse “glowing eyes” and “mirror eyes” effects with minimal improvement in real-world light to the eyes, so it was a good design choice. Furthermore, the polarizing cover also prevents real-world light from bouncing around the optics cavity, which helps image quality.
The Nreal design could be improved slightly with better materials, mostly in the beam splitter. But even with better materials, there would only be a few percentage points of net light throughput improvement. The other area where they seem to have compromised for cost is the single glass lens on the OLED. This design decision seems to have caused the lens optics to be somewhat visible (not horrible, but you can see it).
The use of glass near the eye might be a problem for some applications. It might run afoul of some safety regulations. The use of plastic might be more eye-safe but would have to be thicker to maintain the flatness required.
Overall, it seems Nreal made good cost versus performance trade-offs, given they were making a birdbath AR headset.
In terms of image quality, the Nreal is among the best I have seen. It offers about a 52-degree (diagonal) FOV with a 1080p (1920×1080) display. Its effective resolution is about 2x better in each direction than the HoloLens 2, which has about the same FOV.
It is still not great compared to a typical flat panel display as optical reflections are causing double images, but it is very good compared to most other AR displays.
The next article in this series will show through-the-optics pictures. Below is one that I will be discussing.
I want to thank David Bonelli, the CEO of Pulsar, an engineering consulting company (and a consultant to Ravn), for reviewing my analysis and giving feedback. From 2015 to 2018, David worked at the Osterhout Design Group, where his work included the ODG R8 and R9, similar in basic structure to Nreal's design.
Excellent as always, thank you. I tend to think of birdbaths as being the simpler design vs. waveguides, and while that might be true, we can see here they are anything but simple, at least when made so small. Amazing engineering on display here.
Birdbath AR optics are definitely simpler than waveguides. Most importantly, they work with Lambertian(-like) emitters. If you put an OLED on a diffractive or Lumus waveguide, you would be lucky to get about 1 nit out.
Birdbath-type structures are also buried inside many optical designs, including many of the waveguide projectors. Buried inside the Lumus Maximus projector is a structure similar to the Nreal's (with better components and smaller). Curved mirrors, compared to lenses, are cheap to make and design and don't cause chromatic aberration. A big downside of mirrors is that the light goes back in the same direction it comes in, so you need something like a beamsplitter to have the light hit them correctly.
One could argue that the Nreal was designed to be as inexpensive as possible. The only expensive thing in the design is the Micro-OLED. At the same time, they would not have improved things very much by putting in much more expensive components. So I would say it is a “cost-efficient” design. Using a birdbath as their main optics puts them in a design box where it is relatively inexpensive to get good image quality and good performance with Lambertian light sources. But there is little room to improve it. They could go brighter with MicroLEDs, but those are a ways off and very expensive. The drawback they can't improve on much is the light throughput from the real world.
Appreciate all the work putting this teardown together.
Wish it was possible to see the pcb in more detail with chip info.
Karl you mentioned:
“The Sony OLED has about 1-Watt of power going to it at full brightness for an all-white image.”
Do you know how many nits at this condition?
It appears to output about 850 nits at 85mA [corrected] from the 10V supply that drives the OLED illumination. There are some other supplies for the backplane.
Follow-up question- is it 85mW or 85mA of 10V?
Also, the 850 nits seems to be lower than the Sony spec for max brightness, correct?
It is 85mA at 10V = 850mW. Sorry about that. I was typing on my phone right before my flight took off.
There are different brightness versions of the Sony OLED. I don’t have the grade numbers, but I am guessing it was a 1000 nit version.
Could you please clarify confusion with nits:
CP display spec 3000nits/w + Maximus output to eye 650nits = Maximus is ~20% efficient in brightness
Nreal 1000nits/w Sony/BOE display spec + Nreal output to eye 110nits = Nreal is ~10% efficient in brightness
– Is Maximus really x30 more power efficient than Nreal (I see x3 based on nits/w or x6 if account for brightness) ?
– Where could i read about this nits/lumen metric? = 650 nits/lumen would imply that display brightness is 650/3.426 times higher than output to eye = I am sure this is not how it's meant to work??
First of all, it is probably confusing by design. What most people should care about is Watts to the display (power in) vs. nits to the eye (output to the eye) for a given FOV and eye box. But they won't give that number.
Lumens are a measure of total light output. Nits are a measure of light in a given solid angle, or more broadly, light in a given direction. With near-eye displays, what “counts” is the light that enters the pupil of the eye. We are not trying to light up the whole room like a light bulb, but just funnel the light into the pupil. I wrote about lumens, nits, and etendue in: https://kguttag.com/2017/08/10/collimation-etendue-nits/. The physical laws of etendue limit how you can manipulate light and, among other things, limit the percentage of light with a given randomness that you can get through a small hole.
When working with display devices, you have to deal not only with the amount (lumens) of light but also its randomness (etendue). Light tends toward randomness (becomes diffuse). Smaller light sources (say, small LEDs) have better etendue, whereas larger emitting areas have worse etendue, assuming both emit with similar randomness. Both LEDs and OLEDs tend to be Lambertian emitters. So a small LED emitting the same lumens as a larger OLED will have much better etendue and thus can, with optics, be turned into much higher nits.
We start with the energy conversion. Small LEDs are much more efficient at turning electricity (electrons) into light (photons/lumens). Because the area of small inorganic LEDs is much smaller, you can get on the order of a million nits (VERY roughly) onto an LCOS microdisplay for the same power at which an OLED produces about 1,000 nits. So you start out with about a 1000:1 advantage in the ability to convert (extremely loosely speaking) Watts to nits. Then we have all the issues of light loss through the optics.
In the case of waveguides, they have highly collimated light funneled into a small entrance area and there are high losses both in “coupling” the light in and in the waveguide itself. Lumus’s reflective waveguides tend to be significantly more efficient than diffractive waveguides, but they all suffer the etendue coupling loss.
As far as the “nits/lumen” metric, I think this is talking about the output of collimated nits from the projector versus nits to the eye. Note you have to know the size of the eye box and the FOV for this to make any sense. It is an intermediate metric, and there are a lot of caveats, so it is risky to compare two different companies' numbers.
The simplest metric, and what I think most people should care about in AR, is electrical Watts-in versus nits to the eye. There is a lot of complex math and many factors in between. The short answer is that with about 850 milliwatts in, Nreal delivers about 120 nits to the eye. Lumus, with about 1 Watt to the LEDs, gets over 3,000 nits and claims they will be able to get more than 4,500 nits, or about 30 times the nits.
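Using the round numbers above, the Watts-in to nits-out comparison is simple arithmetic. FOV and eye box also matter, so treat this as a first-order sketch of the metric, not a definitive comparison.

```python
# Watts-in vs. nits-to-the-eye, using the rough round numbers quoted
# above (Nreal: ~120 nits from ~850 mW; Lumus: claimed >4,500 nits
# from ~1 W). FOV and eye box are ignored in this first-order sketch.
nreal_nits_per_watt = 120 / 0.85
lumus_nits_per_watt = 4500 / 1.0

print(f"Nreal:         ~{nreal_nits_per_watt:.0f} nits/W")
print(f"Lumus Maximus: ~{lumus_nits_per_watt:.0f} nits/W")
print(f"ratio:         ~{lumus_nits_per_watt / nreal_nits_per_watt:.0f}x")
```

With these inputs the ratio lands in the low thirties, consistent with the "about 30 times" figure quoted above.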
Thank you Karl,
Perhaps an example of converting 650 nits/lumen to nits at display given 3000 nits at the eye would be of most help.
Digilens specs their Crystal30 as 1% efficient. Is there any official info on efficiency from Lumus to back this conversion calculation?
To my mind, the lower this nits/lumen number, the better the waveguide efficiency, i.e. increasing nits at the eye lowers the ratio?
Personally, i would prefer 2 numbers in spec sheets for a more complete picture of components:
nits/watt at display instead of at the eye
waveguide efficiency percentage
There is so much that goes into the efficiency number. Very important is the “character” (etendue) of the light. It gets very hard to compare.
The “efficiency” of, say, the Nreal birdbath would blow away the efficiency of the Lumus Maximus by perhaps orders of magnitude. Yet the Lumus Maximus Watt-in to nits-to-the-eye is something like 30 times better. The reason is that Lumus starts with a small light source (much better etendue), and the LED generates photons much more efficiently than the Micro-OLED. If you tried to couple the Sony OLED into the Lumus Maximus, you might be lucky to get 1 nit out.
Thus it seems to me the number that makes the most sense is the in-to-out performance.
Suppose we can agree on nits-out/watt-in making good sense, however, I would feel somewhat cheated without knowing display nits. Seems suss Lumus/CP do not specify light pipe brightness.
What would be good to see is CP LCoS or 150k nit Jade Bird module added to Nreal bird bath in clubmaster casing.