CES (Pt. 3), Xreal, BMW, Ocutrx, Nimo Planet, Sightful, and LetinAR

Update 1/28/2024 – Based on some feedback from Nimo Planet, I have corrected the description of their computer pod.

Introduction

The “theme” for this article is companies I met with at CES that make optical see-through Augmented and Mixed Reality devices using OLED microdisplays.

I’m off to SPIE AR/VR/MR 2024 in San Francisco as I release this article. So, this write-up will be a bit rushed and likely have more than the usual typos. Then, right after I get back from the AR/VR/MR show, I should be picking up my Apple Vision Pro for testing.

Xreal

Xreal (formerly Nreal) says they shipped 350K units in 2023, more than all other AR/MR companies combined. They had a large, very busy booth on the CES floor with multiple public and private demo stations.

From 2021 KGOnTech Teardown

This blog has followed Xreal/Nreal since its first appearance at CES in 2019. Xreal uses an OLED microdisplay in a “birdbath” optical architecture first made popular by (the now defunct) Osterhout Design Group (ODG) with their R8 and R9, which were shown at CES in 2017. For more on this design, I would suggest reading my 2021 teardown articles on Nreal’s first product (Nreal Teardown: Part 1, Clones and Birdbath Basics, Nreal Teardown: Part 2, Detailed Look Inside, and Nreal Teardown: Part 3, Pictures Through the Lens).

Inherent in the birdbath optical architecture Xreal still uses, the glasses block about 70% of the real-world light, acting like moderately dark sunglasses. About 10% of the display’s light makes it to the eye, which is much more efficient than waveguides, though waveguides are much thinner and more transparent. Xreal claims their newer designs support up to 500 nits to the eye, meaning the Sony Micro-OLEDs must output about 5,000 nits.
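
As a quick sanity check on those numbers, here is a rough, back-of-the-envelope sketch of the birdbath light budget in Python. The ~30% real-world transmission, ~10% display efficiency, and 500-nit target are just the approximate figures cited above, not official Xreal specifications.

```python
# Back-of-the-envelope birdbath light budget. The efficiency and transmission
# values are the approximate figures from the text, not Xreal specifications.

display_efficiency = 0.10       # ~10% of the microdisplay's light reaches the eye
real_world_transmission = 0.30  # the birdbath blocks ~70% of real-world light

target_nits_at_eye = 500        # claimed to-the-eye brightness
required_panel_nits = target_nits_at_eye / display_efficiency
print(f"Micro-OLED must output roughly {required_panel_nits:,.0f} nits")  # ~5,000

# A 500-nit real-world scene viewed through the optics is dimmed to about:
print(f"Real world dimmed to roughly {500 * real_world_transmission:.0f} nits")  # ~150
```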

With investment, volume, and experience, Xreal has improved its optics and image quality. However, it can’t do much about the inherent limitations of a birdbath, particularly in terms of transparency. Xreal recently added an LCD dimming shutter to selectively block out more, or even all, of the real world with their new Xreal Air 2 Pro and their latest Air 2 Ultra, for which I was given a demo at CES.

The earlier Xreal/Nreal headsets were little more than 1920×1080 monitors you wore with a USB-C connection for power and video. Each generation has added more “smarts” to the glasses. The Air 2 Ultra includes dual 3-D IR camera sensors for spatial recognition. Xreal and (to be discussed later) Nimo, among others, have already picked up on Apple’s “Spatial Computing,” referring to their products as affordable ways to get into spatial computing.

Most of the newer headsets can be driven either by a cell phone or by Xreal’s “Beam” compute module, which can mirror or cast one or more virtual displays from a computer, cell phone, or tablet. While there may be more monitors virtually, they are still rendered on a 1920×1080 display device. I believe (I forgot to ask) that Xreal is using the glasses’ internal sensors to detect head movement and reposition the virtual monitors accordingly.

Xreal’s Air 2 Ultra demo showcased the new spatial sensors’ ability to recognize hand and finger gestures. Additionally, the sensors could read “bar-coded” dials and slides made from cardboard.

BMW AR Ride Concept (Using Xreal Glasses)

In addition to seeing Xreal devices on their own, I was invited by BMW to take a ride trying out their Augmented Reality HUD on the streets around the convention center. A video produced by BMW shows a slightly different and abbreviated trip. I should emphasize that this is just an R&D demonstration, not a product that BMW plans to introduce. Also, BMW made clear that they would be working with other makers of headsets, but Xreal’s was the most readily available.

To augment using the Xreal glasses, BMW mounted a head-tracking camera under the rearview mirror. This allows BMW to lock the generated image to the physical car. Specifically, it allowed them to selectively block/occlude parts of the virtual image hidden behind the front A-pillar of the car. Not shown in the pictures from BMW below (click on a picture to see it bigger) is that an image would start in the front window, be hidden by the A-pillar, and then continue in the side window.
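
To illustrate the general idea of that A-pillar occlusion, here is my own simplified sketch, not BMW’s implementation; the camera intrinsics, pose matrix, and pillar geometry are made-up placeholders. The known pillar geometry is transformed into the glasses’ view using the head pose from the cabin camera, and any virtual pixels that land on the pillar are simply not rendered, which on an optical see-through display leaves them transparent.

```python
import numpy as np

def car_to_glasses(points_car, head_pose):
    """Transform points from the car's frame into the glasses' frame.
    head_pose is a 4x4 glasses-from-car matrix estimated from the cabin
    head-tracking camera (illustrative placeholder, not BMW's actual data)."""
    points_car = np.asarray(points_car, dtype=float)
    R, t = head_pose[:3, :3], head_pose[:3, 3]
    return points_car @ R.T + t

def project(points_cam, fx=800.0, fy=800.0, cx=960.0, cy=540.0):
    """Pinhole projection into the 1920x1080 virtual image (made-up intrinsics)."""
    z = points_cam[:, 2:3]
    return points_cam[:, :2] / z * [fx, fy] + [cx, cy]

def a_pillar_mask(pillar_corners_car, head_pose, width=1920, height=1080):
    """Boolean mask that is True where the A-pillar covers the virtual image.
    The renderer skips (leaves black/transparent) the masked pixels, so virtual
    objects appear to pass behind the pillar."""
    uv = project(car_to_glasses(pillar_corners_car, head_pose)).astype(int)
    mask = np.zeros((height, width), dtype=bool)
    # Crude fill: mark the bounding box of the projected pillar corners.
    u0, v0 = np.clip(uv.min(axis=0), 0, [width - 1, height - 1])
    u1, v1 = np.clip(uv.max(axis=0), 0, [width - 1, height - 1])
    mask[v0:v1 + 1, u0:u1 + 1] = True
    return mask

# Example with a made-up pose (identity) and a pillar roughly 1 m ahead and to the left.
pose = np.eye(4)
pillar = [[-0.6, -0.4, 1.0], [-0.5, -0.4, 1.0], [-0.6, 0.4, 1.0], [-0.5, 0.4, 1.0]]
print(a_pillar_mask(pillar, pose).sum(), "masked pixels")
```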

BMW’s R&D is looking at AR glasses use by both the driver and passengers. They discussed that they would have different content for the driver, which would have to be simplified and more limited than what they could show the passenger. There are many technical and government/legal issues (all 50 states in the U.S. have different laws regarding HUD displays) with supporting headsets on drivers. From a purely technical perspective, a head-worn AR HUD has many advantages and some disadvantages versus a fixed HUD on the windshield or a dash combiner (too much to get into in this quick article).

Ocutrx (for Low-Vision and other applications)

Ocutrx’s OcuLenz also uses “birdbath” optics. The OcuLenz was originally designed to support people with “low vision,” especially people with Macular Degeneration and other eye problems that block parts of a person’s vision. People with Macular Degeneration lose the high-resolution, high-contrast, and color-sensitive parts of their vision. They must rely on other parts of the retina, commonly called peripheral vision (although it may include more than just what is technically considered peripheral vision).

A low-vision headset must have a wide FOV to reach the outer parts of the retina. It must magnify, increase color saturation, and improve contrast beyond what a person with normal vision would want. Note that while these people may be legally blind, they can still see, particularly with their peripheral vision. This is why a headset that still allows them to use their peripheral vision is important.

About 20 million people in the US alone have what is considered “low vision,” and about 1 million more people develop low vision each year as the population ages. It is the biggest identifiable market I know of today for augmented reality headsets. But there is a catch that needs to be fixed for this market to be served. By the very nature of the people involved, having low vision and often being elderly, they need a lot of professional help while often being on a fixed or limited income. Unfortunately, private or government (Medicare/Medicaid) insurance will rarely cover either the headset cost or the professional support required. There have been bills before Congress to change this, but so far, nothing has happened of which I am aware. Without a way to pay for the headsets, the volumes are low, which makes the headsets more expensive than they need to be.

In the past, I have reported on Evergaze’s seeBoost, which exited this market while developing their second-generation product for the economic reasons (lack of insurance coverage) above. I have also discussed NuEyes with Bradley Lynch in a video after AWE 2022. The economic realities of the low-vision market cause companies like NuEyes and Ocutrx to look for other business opportunities for their headsets. It is a frustrating situation, knowing that the technology could help so many people. I hope to cover this topic in more detail in the future.

Nimo Planet (Nimo)

Nimo Planet (Nimo) makes a small computer that acts as a spatial mouse/pointer for AR headsets, with a USB-C port for power and video. It replaces the need for a cell phone and can mirror/cast video from other devices to the headset. Still, the Nimo Core is a fully standalone computer running Nimo OS, which simultaneously supports Android, Web, and Unity apps; no other host computer is needed.

According to Nimo, every other multi-screen solution on the market is developed on web platforms or as a Unity app, which limits them to running only Web Views. Nimo OS created a new Stereo Rendering and Multi-Window architecture in AOSP to run multiple Android, Unity, and Web apps simultaneously.

Nimo developed their glasses based on LetinAR optics and also supports other AR glasses. Most notably, they just announced a joint development agreement with Rokid.

I got a brief demonstration of Nimo’s multi-window support on an AR headset. They use the inertial sensors in the headset to detect head movement and move the view of the multiple windows accordingly. It is like you are looking at multiple monitors through a 1920×1080 window. No matter the size or number of virtual monitors, they will be clipped to that 1920×1080 view, and you move your head to select what you see. I discussed some of the issues with simulating virtual monitors with head-mounted displays in Apple Vision Pro (Part 5A) – Why Monitor Replacement is Ridiculous, Apple Vision Pro (Part 5B) – More on Monitor Replacement is Ridiculous, and Apple Vision Pro (Part 5C) – More on Monitor Replacement is Ridiculous.
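
A simple way to picture the head-panned 1920×1080 view described above (my own sketch, not Nimo’s software; the monitor layout and pixels-per-degree figure are assumptions) is a large virtual canvas holding all the monitors, with head yaw and pitch from the IMU panning a fixed 1920×1080 viewport across it:

```python
from dataclasses import dataclass

DISPLAY_W, DISPLAY_H = 1920, 1080  # physical display resolution of the glasses
PIXELS_PER_DEGREE = 45             # assumed angular resolution (placeholder value)

@dataclass
class VirtualMonitor:
    name: str
    x: int   # position and size on the large virtual canvas, in pixels
    y: int
    w: int
    h: int

def viewport_origin(yaw_deg: float, pitch_deg: float):
    """Top-left corner of the 1920x1080 window currently shown, panned across
    the virtual canvas by head yaw/pitch reported by the IMU."""
    cx = yaw_deg * PIXELS_PER_DEGREE          # look right -> pan right
    cy = -pitch_deg * PIXELS_PER_DEGREE       # look up -> pan up
    return cx - DISPLAY_W / 2, cy - DISPLAY_H / 2

def monitors_in_view(monitors, yaw_deg, pitch_deg):
    """Return the virtual monitors that overlap the current 1920x1080 viewport."""
    vx, vy = viewport_origin(yaw_deg, pitch_deg)
    return [m for m in monitors
            if m.x < vx + DISPLAY_W and m.x + m.w > vx
            and m.y < vy + DISPLAY_H and m.y + m.h > vy]

monitors = [VirtualMonitor("left", -2200, 0, 1920, 1080),
            VirtualMonitor("center", -960, 0, 1920, 1080),
            VirtualMonitor("right", 1280, 0, 1920, 1080)]
print([m.name for m in monitors_in_view(monitors, yaw_deg=0, pitch_deg=0)])   # ['left', 'center']
print([m.name for m in monitors_in_view(monitors, yaw_deg=30, pitch_deg=0)])  # ['center', 'right']
```

No matter how many monitors you define, only what falls inside that single 1920×1080 viewport is visible at any instant, which is the core limitation discussed in the Apple Vision Pro monitor-replacement articles linked above.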

Sightful

Sightful’s device is similar in some ways to Nimo Planet’s. With Sightful, the computer is built inside the keyboard and touchpad, making it a full-fledged computer. Alternatively, Sightful can be viewed as a laptop computer where the display is a pair of AR glasses rather than a flat panel.

Like Nimo, Xreal’s Beam, and many other new Mixed Reality devices, Sightful supports multiple windows. I don’t know whether they have cameras for 3-D sensing; I suspect they use the glasses’ internal sensors to detect head movement.

Sightful’s basic display specs resemble other birdbath AR glasses designs from companies like Xreal and Rokid. I have not had a chance, however, to compare them seriously.

LetinAR

I have been writing about LetinAR since 2018. LetinAR started with a “Pin Mirror” type of pupil replication. They have now moved on to a series of what I will call “horizontal slat pupil replicators.” They also use total internal reflection (TIR) and a curved mirror to move the focus of the image from an OLED microdisplay before it goes to the various pupil-expanding slats.

While LetinAR’s slat design improves image quality over its earlier pin mirrors, it is still imperfect. When looking through the lenses (without a virtual image), the view is a bit “disturbed” and seems to have diffraction-line effects. Similarly, you can perceive gaps or double images depending on your eye location and movement. LetinAR continues to work on improving this technology. While their image quality is not as good as the birdbath designs, they offer much better transparency.

LetinAR seems to be making progress with multiple customers, including Jorjin, which was demonstrating in the LetinAR booth; Sharp, which had a big demonstration in its own booth (while they didn’t say whose optics were in the demo, they were obviously LetinAR’s – see pictures below); and the Nimo headset discussed above.

Conclusions

Sorry, there is no time for major conclusions today. I’m off to the AR/VR/MR Conference and Exhibition.

I will note that regardless of the success of the AVP, Apple has already succeeded in changing the language of Augmented and Mixed Reality. In addition to almost everyone in AR and MR talking about “AI,” many companies now use “Spatial Computing” to refer to their products in their marketing.

Karl Guttag

9 Comments

  1. Do you have an opinion on this headset? https://shop.visor.com/pages/visor There aren’t any demos of it, and it’s essentially vaporware. It’s advertised as for productivity, but based on your Apple headset text clarity estimates post, this visor isn’t going to be really usable for text either. The Apple headset and the visor have similar pixel counts (23 million and 27 million).

    • I would not trust the Visor. They say they have 4K per eye with a price of $400. That is simply not possible to do legally.

      As for the AVP’s resolution, it will likely be good enough for reading text (they can make the text a little bigger), but I’m expecting the text density to be less than a physical monitor. I’m less than a week away from knowing more, as I pick up my AVP on Saturday the 3rd.

      • TBF $400 is a subsidized up-front cost with a software and support membership plan that costs more in the long run. The price of the device by itself is listed as $950. That doesn’t leave much room for profit though, and nothing else about the company or site inspires much confidence.

  2. Thank you for all the information, you’re such a wonderful resource. Do you think you’ll perform a full breakdown of the Quest 3?

    • Thanks. Right now my plan is to contrast the Quest 3 with the Apple Vision Pro and other headsets.

  3. I think XREAL uses Electrochromic (EC) to adjust Dimming…. Not LCD.
    Does anybody know who is supplying EC Tech for XREAL?
    It is not a rich EC similar to Magic Leap’s which has partial dimming…
