CES 2019 – Nreal

Quick Note On Photonics West Next Week

I’m going to Photonics West next week in San Francisco. I plan on attending the AR/VR sessions on Feb. 3rd and 4th followed by the main show on the 5th and 6th. If you would like to meet, please email me at pw-meet@kgontech.com.

Update On Nreal’s Transparency (January 31, 2019)

Nreal has contacted me to say they disagree with the transparency measurement I made. I measured roughly 10% linear transparency using my camera’s photograph of white paper. In the past, I have found this to be a reasonably accurate way to estimate transparency when comparing my camera against light meters. Nreal used a light meter (which I forgot to bring to the show floor) and got between 25% and 30%. That percentage is certainly achievable with the optics they are using, so I am adding this note and will check in more detail the next time I have a chance.

2019-June: I met with Nreal at AWE and was able to take measurements confirming that the Nreal optics are about 30% transparent.
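For readers who want to reproduce the camera-based comparison described above, here is a minimal sketch of the arithmetic. The pixel readings are illustrative assumptions, not my actual measurements; note that camera JPEGs are gamma-encoded, so values must be linearized before taking the ratio:

```python
# Estimate optics transparency by photographing white paper directly and
# through the optics, then comparing linear luminance.

def srgb_to_linear(v: float) -> float:
    """Convert an sRGB-encoded pixel value (0..1) to linear luminance."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_transparency(pixel_through: float, pixel_direct: float) -> float:
    """Ratio of linear luminance through the optics vs. the direct view."""
    return srgb_to_linear(pixel_through) / srgb_to_linear(pixel_direct)

# Illustrative readings: paper reads 0.95 (sRGB) directly, 0.55 through optics.
print(f"estimated transparency: {linear_transparency(0.55, 0.95):.1%}")
```

Skipping the linearization step is what makes naive pixel-ratio estimates understate the transparency of dark optics.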

Introduction – Picking Up From the Summary Article

As I wrote last time, Nreal caused a lot of buzz both in the AR hall and in news articles. They have a very simple “birdbath” optical design with a Sony 1080p Micro-OLED display. With a former Magic Leap person founding the company, a similar feature set, and even a similar system configuration, the comparisons are inevitable. Waveguide-based solutions, Magic Leap’s included, do have one major advantage over most simpler AR optics: the waveguides themselves are thin.

Nreal – Blowing away Magic Leap on Image Quality

Nreal has 1080p resolution and delivers about 3X the horizontal and 2X the effective vertical resolution of Magic Leap, with much better contrast and far fewer artifacts. In a way, Nreal demonstrates just how poor Magic Leap’s optics are by direct comparison. As I discussed in my comparison of Magic Leap to Hololens and Lumus, Magic Leap’s optics significantly blur/soften the native display resolution and only deliver about 640 by 480 resolution to the eye. And in an earlier article, I showed how the Magic Leap One has a lot of other optical issues that are not present in the Nreal design.
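The resolution comparison above is simple arithmetic on the numbers given in this article (Nreal’s 1920×1080 panel versus the roughly 640×480 I estimate Magic Leap delivers to the eye):

```python
# Nreal panel resolution vs. the ~640x480 effective resolution this article
# estimates the Magic Leap One delivers to the eye.
nreal_h, nreal_v = 1920, 1080
ml_h, ml_v = 640, 480

print(nreal_h / ml_h)                       # 3.0  -> "3X horizontal"
print(nreal_v / ml_v)                       # 2.25 -> roughly "2X vertical"
print((nreal_h * nreal_v) / (ml_h * ml_v))  # 6.75x the effective pixels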

Nreal — Shot through optics (click to see the full resolution image)

System Configuration

Nreal was founded by a former Magic Leap engineer, Chi Xu, and has a similar system configuration with a separate computer pack with battery and a separate controller. Like Magic Leap and Hololens, it supports 6 degrees of freedom tracking and SLAM (simultaneous localization and mapping).

The headset alone is only 85 grams, compared to the Magic Leap One, which tips the scales at about 335 grams. Nreal has a much smaller (about 3 x 3 inches) compute-and-battery pack, which they call “Toast,” and a disk-shaped controller, which they call “Oreo.” I suspect they have less processing power and less capable SLAM mapping, but I have not evaluated these aspects of the overall design. Nreal did say that they are still working on improving the SLAM.

The cable is wired into the headset on one side and has a standard USB-C connection on the other. I very much liked the idea of using a standard USB-C connection, but they should have used it on both the compute pack and the headset side. With USB-C on both ends, the headset would have a “breakaway” feature if the cable gets snagged, and different cable lengths could be used. A very short USB-C cable would make it possible to put the compute pack on the back of a headband, virtually eliminating the snag hazard.

Keep It Simple Stupid (KISS) Optical Design

A classic case of Keep It Simple Stupid (KISS), there is absolutely nothing remarkable about the optical design. It uses some form of birdbath (beam splitter and spherical mirror) optics, very commonly used in near-eye displays (see the figure from an ODG patent). I should emphasize that many companies have used a similar birdbath design. The reason the birdbath is so popular is that it gives relatively good image quality, particularly with respect to chromatic aberrations, at a very low cost.
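To see why the birdbath is cheap but lossy, here is an idealized light budget. The 50/50 splits below are assumptions for illustration; real designs use polarization tricks and tuned coatings that change these numbers:

```python
# Idealized birdbath light budget (a sketch; ignores polarization control,
# coating losses, and absorption in the optics).

def display_efficiency(bs_r: float, mir_r: float) -> float:
    # Display light reflects off the beamsplitter (bs_r), partially
    # reflects off the curved semi-mirror (mir_r), then transmits back
    # through the beamsplitter (1 - bs_r) on its way to the eye.
    return bs_r * mir_r * (1.0 - bs_r)

def world_transmission(bs_r: float, mir_r: float) -> float:
    # Real-world light transmits through the semi-mirror, then the splitter.
    return (1.0 - mir_r) * (1.0 - bs_r)

# With idealized 50/50 elements, at most 12.5% of the display light and
# 25% of the real world reach the eye.
print(display_efficiency(0.5, 0.5))  # 0.125
print(world_transmission(0.5, 0.5))  # 0.25
```

The 25% world-transmission figure for ideal 50/50 elements lines up with the 25%–30% range Nreal quotes for their optics.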

Issues With The Birdbath Optical Design

Update (January 31, 2019) – I have been contacted by Nreal, and they claim that when measured with a meter, their optics let through between 25% and 30% of the light, which translates to blocking 70% to 75% of it. This new value would be in the expected range for the type of birdbath optics Nreal is using. As I only made a quick measurement of a demo system on the floor, and with Nreal providing a measurement, I’m adding this note and striking through the text below. Assuming the Nreal information is correct, their optics would be more transmissive than Magic Leap’s.

There is literally a “dark side” to the birdbath in that it blocks a lot of light. In the case of the Nreal headset, I found that it blocks about 90% of the real-world light, which is more than most dark sunglasses and more than Magic Leap’s blocking of 85%. Additionally, controlling double images due to reflections is a well-known problem with birdbath designs. I discussed the issues with the birdbath design back in March 2017.

The birdbath is “thick” due to the beam splitter at 45 degrees. The Nreal optical design is about 1 inch (25mm) front to back. While they may look like ordinary glasses from the front, they are much thicker. While the weight is “only” 85 grams, which is light for a headset, that is very heavy for glasses. Also, due to the 1-inch thickness, the weight sits very far forward on the nose.

It is worth pointing out that while waveguide-based optics start out as relatively thin glasses, by the time they are encased in protective shields and fitted with SLAM cameras and other features, they are often thicker than the Nreal optics. For example, the Magic Leap One is about the same thickness as Nreal, and Hololens is much thicker.

An obvious problem with the Nreal design is that the 45-degree beam splitter can direct light from below it into the eye. I have shown the light path with a red arrow in the picture above, and there is a reflection of my badge, which was below the glasses, in the through-the-optics picture below. This reflection could be easily fixed by adding some form of eyecup. This might also be an area where something like eye tracking could be added. In the picture below, I also added some arrows pointing to double images I see in the image. The number of reflections does not seem as bad as in the similar ODG R9 design, but I have not had a chance to do a rigorous evaluation.

This problem could be solved with some form of structure to block light from below. Any such structure would have to be compatible with their vision correction inserts.

The headset gets too hot against the forehead. It was not that noticeable in the quick demos, which shut down when a person was not viewing them, but it was a common complaint among users in the demo area. Nreal is aware of this problem and said that these are just early prototypes. Still, it would be nice if they had thought of better thermal management from the start. I expect that fixing this problem will add some weight to the headset.

Vision Correction Support

As with all other “glasses-like” AR designs such as Magic Leap and ODG, vision correction is supported only via custom lens inserts. Then there is North’s Focals, where you have to go to one of their stores to get custom lenses made.

There is the obvious drawback of needing custom inserts made. Many people do not have vision that can be corrected with a simple diopter adjustment and need fully custom lenses.

There is no good alternative to inserts in a small form factor. If the glasses were made big enough to accommodate even relatively small “normal” eyeglass frames, the AR glasses would become much larger. Additionally, accommodating existing frames would force the optics to sit farther from the eye, which in turn would require larger, heavier, and likely more expensive optics to keep the same field of view and eyebox.

Thus, you see a gap between the size of headsets like Hololens, which let most users wear their own vision correction, and Magic Leap and Nreal, which use inserts. It is a major design trade-off between ease of use and size.

Conclusions and Suggestions

Nreal took a very simple optical design and coupled it with a Sony high-contrast 1080p Micro-OLED, which results in a very good image. To a large degree, Nreal demonstrates the sacrifices in image quality made by other AR/MR headsets such as Magic Leap and Hololens.

As I wrote above, I would highly recommend they add a USB-C socket on the headset rather than wiring the cable in. This would allow flexibility in cable length and could even support a self-contained headband or cap.

The image quality and simplicity of the optics come at the expense of poor real-world light throughput and display inefficiency. Nreal, or others competing with them, could tweak the reflection percentages in the birdbath design to block less of the real world, at the expense of a dimmer image or having to drive the display harder.
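The trade-off just described can be sketched by sweeping the semi-mirror reflectivity in a simple idealized model. The 50/50 beamsplitter and the swept values are illustrative assumptions, not Nreal’s actual coatings:

```python
# Idealized sweep of semi-mirror reflectivity: a more reflective mirror
# brightens the virtual image but darkens the real world, and vice versa.
BS_R = 0.5  # assumed 50/50 beamsplitter

for mir_r in (0.3, 0.5, 0.7):
    display_to_eye = BS_R * mir_r * (1.0 - BS_R)  # virtual-image brightness path
    world_to_eye = (1.0 - mir_r) * (1.0 - BS_R)   # real-world transmission
    print(f"mirror {mir_r:.0%}: display {display_to_eye:.1%}, "
          f"world {world_to_eye:.1%}")
```

In this toy model, no choice of mirror reflectivity helps both sides at once, which is why the birdbath’s losses can only be shifted around, not eliminated.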

They certainly need to work on thermal management in the design, and this, in turn, will likely have a negative effect on size and weight. While it may seem simple, it may prove to be a significant problem.

They are doing in-system computing, 6 DOF tracking, and some level of SLAM, but I did not evaluate these capabilities relative to the likes of Hololens and Magic Leap; this blog has enough to do just covering the optical designs.

Once again, I would highly recommend using USB-C on both the glasses side and the computer side of the cable. That way, the user could easily choose a shorter cable, and with a short USB-C cable, a “cap,” visor, or headband option could mount the “Toast” (computer and battery) on the back of the user’s head. I have found the cable on Magic Leap to be a serious snag and dragging hazard that can cause the headset to be broken.


Karl Guttag

29 Comments

  1. Karl,

    Thanks for the thorough analysis. I have seen you mention multiple times that you think microLED is the future of near-eye display technology (and probably displays in general). Do you think microLEDs will be used with birdbath optical designs, waveguide-based optics, or something else entirely? It feels like AR devices are improving, but the image quality and real-world light throughput still have a long way to go. I don’t know how or if microLED will help overcome the issues you regularly point out with these common optical designs.

  2. Thanks Karl for the write up!
    I tried the Nreal glasses at CES and agree with all your comments.
    From my point of view, Nreal looks great, but they compromised a bit too much on size: both the glasses and the compute unit currently get unacceptably hot (I really could not hold the compute unit in my hands, so I held it by the USB plug). Fixing that will most likely not only add weight but also size, and hence result in a noticeable design change.
    They also still need to do significant work on the software side. The tracking system is OK but not ready for prime time yet, and they definitely need to make their software more efficient to reduce the heat of the compute unit.

  3. Nice article as always!

    Even with the loss of light, the birdbath design seems like a no-brainer (cost, quality, FOV).

    Why, in your opinion, did MS & ML go for the waveguide solution?

  4. Hello Karl,

    thank you very much for the review.

    Glasses that block 90% of the light are more VR than AR or MR glasses. There is virtually no use case for such glasses in AR and MR applications. Workers will never be allowed to wear them during their usual activities; the risk of an industrial accident is much too high because hazards in their environment will likely go unseen. They will simply not see anything. The same goes for people on streets and so on: they will not see cars, other people, or things lying on the street early enough.

    So, this approach is a dead end for AR and MR if they cannot dramatically improve what people see of the real world.

    Thanks!

    • It seems AR is breaking into several subcategories. When they start blocking over 50% of the light, they start becoming more “augmented VR.” The image quality of the virtual is better at the expense of the real world. But you still have at least some of the image quality issues caused by the optical compromises to make the optics (somewhat) see-through. The “enhanced VR” market seems to be the one that Magic Leap and Nreal are trying to get. In typical industrial use, it seems about 20% light blocking is about as much as can be allowed (based on discussions with people in the field). There is also a lot of concern about blocking peripheral vision.

      The problem is that the general perception is that the VR market is never going to be that big, thus Facebook is pushing Oculus to develop AR. The hope is that the “ideal AR” will be the next big thing after cell phones.

      The idea of what constitutes “AR” keeps changing. On one end, some want it to be a replacement for a cell phone and be very light and unobtrusive. On the other end, some expect something where the virtual world looks better. The marketing people like to conflate the two approaches to sell the concept. The result is a series of unhappy compromises as they try to actually build products.

  5. Great insights as always. The only thing I don’t buy is the notion of these glasses as a form of ‘enhanced VR.’ The old Milgram Reality-Virtuality Continuum used the concept of Augmented Virtuality: an experience has to have more of the virtual than the real to fall in that category, and with the ML/Nreal devices, most of what you see is the real world. Granted, with the very narrow FOV of reality you get from ML, you may argue you get that, but I don’t buy it. It is more AR than VR to me.

  6. About the transparency, I guess your measurement of 10% was right because it included the cover, while Nreal’s declared 25%–30% doesn’t include the cover? 50% × 50% = 25%; as for why 30%, perhaps there is some leakage of ambient light.


  8. Karl, excellent article again! What are the specs of the Sony microLED you mention that nReal is using: horizontal pixels sounds like 1920 but what about vertical and what about diagonal FOV?

    Is this state-of-the-art Sony microLED? Is Sony the leader?

    I think I read that nReal is targeting $1000 to $1500 … so, why wouldn’t Microsoft have chosen microLED versus MEMS?

    • From what I remember (I’m traveling right now), nReal has about a 50-degree diagonal. The Sony device is 1080p (1920×1080), which is a 16:9 aspect ratio. Sony seems to be at least the volume leader in micro-OLED.

      I have not heard Microsoft’s rationale. I suspect they wanted the “glasses-like” look of waveguides. They acquired their diffractive waveguide technology from Nokia. OLEDs won’t work with a diffractive waveguide, and MicroLEDs are not ready. I think they would have been better off choosing something other than LBS, but maybe they overthought it.

      • They are just making lab prototypes at this stage. You can spend large amounts of money to hand build a prototype. A prototype often can have many dead pixels. Then the next big challenge is color.

  9. Instead of such a birdbath optical design, why not choose freeform optics? To me it seems that a birdbath with polarization control would decrease optical efficiency, while freeform optics suffer a minimal efficiency drop compared to the birdbath. Would the reason be cost reduction?

    • A reasonable question. One of the big issues with a freeform optics design is that it is solid and generally heavy. Worse yet, to be see-through/AR, you have to have a “corrector,” which is a chunk of optics on the other side of the freeform that adds to the weight. The corrector is necessary for AR, or else the real-world view will be very distorted. The net result is a very thick and heavy piece of optics you are looking through. If you make the freeform optics big enough to cover most of the user’s field of view, it gets very big and heavy (see: https://www.researchgate.net/publication/269321526_Eyetracked_optical_see-through_head-mounted_display_as_an_AAC_device/figures?lo=1). If you make it smaller, the user will see the edges of the freeform optic chunk.

      Then you get to the optical design and cost issues. Here I think the birdbath also has an advantage. Designers overwhelmingly choose a birdbath over freeform optics when it comes to see-through optics. Even with non-see-through designs, I think there are more birdbath than freeform designs, particularly since a birdbath is usually much cheaper to make.

      • Thanks for the comments. The other question regarding the FOV of the birdbath architecture: the FOV is defined by the display size over the effective focal length, so the FOV is proportional to the selected micro-OLED unless you shorten the focal length or minimize the f-number. Would that be the case in the future, limiting large-FOV designs?

      • It is said that FOV = panel size × magnification of the optics.
        Considering the size of existing panels, most of the time it will not be the limit.

  10. […] Even with the optical combiner Sony is using, they likely going to have losses in nits of greater than 10 to 1. So if starting with ~1000 nit micro-OLED, they are probably only getting about 100 nits or less out to the eye. The light loss from the Sony type, or worse yet waveguides optical combiners is why companies like nReal went with a much simpler birdbath structure when using Micro-OLEDs. […]

  11. Hi Karl,
    I have been studying the birdbath design for a while, and the only thing I don’t fully grasp is the lens arrangement. Where exactly is Nreal focusing their image? If the image ends up being focused 3m away (like current VR headsets), does that mean it’s focused at 3m as it’s going into the beamsplitter?

    Lenses are definitely one of my weak points, even if I know the equations behind them, so sorry if the question isn’t a great one.

    Thanks a ton.

    • With a birdbath design, much if not all of the change in focus is caused by the spherical (or nearly spherical) semi-mirror. The ODG R-9 looks to be very similar to Nreal in this respect (see: https://www.kguttag.com/2019/01/29/ces-2019-nreal/). In the ODG R9, and I suspect in the Nreal, there are also some refractive lenses before the beamsplitter and mirror that affect the focus, but the curved mirror still does much of the focusing “work.”

      The mirror is used in a condition where the “object” (display) is between the mirror’s focus (which is at 1/2 the radius of curvature) and the mirror. This results in a virtual image that is non-inverted, magnified, and appears to be farther away. The closer the display gets to the focus, the more the focus and magnification change, but the less stable the image becomes (small movements or imperfections get extremely magnified). The curvature of the mirror, the location of the display relative to the mirror (plus any other optics), and the distance of the eye from the mirror all affect the focus and magnification. This is pretty much “physics 101.”

      Birdbaths and the use of (nearly) spherical mirrors are extremely popular in near-eye optical designs. Sometimes the birdbath optical elements are used before the combiner. The reason for using curved mirrors is that you can get extremely good image quality at very low cost and with zero chromatic aberration (the separating of colors based on wavelength). The downside of curved mirrors is that they are opaque, and you need a beam splitter to route the light in and out, which makes them inefficient.

      The reason for the beam splitter is so the eye can be “on-axis” to the mirror. Being on-axis results in both the magnification and focus being constant across the image. The large “bug-eye” optics used in the Meta 2, Mira, iGlass, and many others use a curved mirror, but the display and the eye are off-axis. The mirror is the only optical element, and it changes both the focus and magnification. Some of the bug-eyes change the curvature of the mirror away from spherical to try to reduce the distortion, but there is no way to do a very good job of correcting the distortion and focus caused by being off-axis without significant additional optics that they don’t have. Most bug-eyes will “digitally correct” for the distortion by pre-distorting the image, but they cannot fix the focus (it ends up changing from the top to the bottom of the image).

  12. I met the Nreal people in Beijing — they are a fun group. The other thing they “inherited” from Magic Leap was a lawsuit… maybe weave in something about the use of, let us say, perhaps dubious IP lawsuits as a weapon of those with a lot of money against those without.
