2019-06-28: Corrected Nreal’s light throughput to be 30%
I had another busy four days at CES this year, followed by a lingering cold and a business trip, so I am only now starting to write about what I saw. There are about 20 companies that I visited and want to write about in future articles (some of my meetings were confidential).
In between seeing demos of various AR and display hardware, I was asking around and confirming suspicions about what technology Microsoft is using in the next Hololens, which is expected to be announced soon. I have some answers to reveal in an upcoming article.
Below are summaries of some of the more interesting things I saw at CES 2019. While I mostly concentrated on AR headsets, I did look at some other display technologies. I hope to circle back and provide more detail about these products in future articles.
Correction: I originally reported that Nreal’s light throughput was only 10%. Subsequent measurements have shown the Nreal optics to have ~30% light throughput (blocking ~70% of the real-world light).
Nreal caused a lot of buzz in the AR hall, and multiple people have asked me about them since. They have a very simple “birdbath” optical design with a Sony 1080p Micro-OLED display. The optical design is very similar to the ODG R9 from CES 2017, but lighter and, as far as I could tell from my brief experience, with significantly fewer reflection artifacts.
With the company founded by a former Magic Leap employee and a broadly similar system configuration, the comparisons are inevitable. Regarding image quality, Nreal blows away both Magic Leap and Hololens with much higher resolution, better contrast, and no field-sequential artifacts. The one area where Nreal is lacking is real-world light throughput, which I roughly measured to be about 30%, or about double Magic Leap’s 15% and a bit less than Hololens at about 40%, but well behind Lumus’s 1080p Vision and Vuzix’s Blade, which are about 80% transparent to the real world.
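For readers who want to replicate the throughput comparison, here is a minimal sketch of the arithmetic. The measurement helper and its inputs are hypothetical; only the rounded percentages come from my measurements and vendor claims.

```python
# Sketch of how real-world light throughput is compared across headsets:
# measure illuminance through the optics vs. unobstructed, then take the
# ratio. The helper is illustrative; percentages are from the article.

def light_throughput(lux_through_optics, lux_direct):
    """Fraction of real-world light that makes it through the combiner."""
    return lux_through_optics / lux_direct

# Approximate throughput figures discussed above:
headsets = {
    "Nreal": 0.30,              # blocks ~70% of real-world light
    "Magic Leap One": 0.15,
    "Hololens": 0.40,
    "Lumus 1080p Vision": 0.80,
    "Vuzix Blade": 0.80,
}

for name, t in sorted(headsets.items(), key=lambda kv: -kv[1]):
    print(f"{name:20s} passes {t:4.0%}, blocks {1 - t:4.0%}")
```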
Plessey and Lumens LED were both showing MicroLED microdisplays on the floor of CES. Many, myself included, believe that MicroLEDs are key to the future of near-eye displays, and of displays in general. They have many technical advantages over all other types of displays; the question is when they will become practical for mass use.
Vuzix had a demo of a fixed-image, blue-only Plessey MicroLED display grafted onto Vuzix’s Blade waveguide. The projector optics were impressively small, about ½” by ½” by ¼” (12mm x 12mm x 6mm). As the demo was quickly put together for the show and Vuzix had not had time to optimize it, I was not allowed to publish pictures, but I did have permission to report that I had seen it. It certainly demonstrates the promise of MicroLEDs.
For CES 2019, Lumens was also demonstrating a working green-only 1080p device. They had the device on a table in their booth showing a live video feed, and I was able to take a direct picture of it.
CREAL3D was demonstrating an honest-to-goodness true light field near-eye display, not the fictitious marketing-hype “digital light fields” of the likes of Magic Leap. CREAL3D was hidden in the back of the Swiss pavilion, and ROADTOVR has written a good article including links to some stills and videos. It’s sad we have to add the word “true” in front of “light field” to distinguish CREAL3D and others like Fovi3D (with a direct view light field display) from the charlatans, like Rony Abovitz of Magic Leap, who abuse the meaning of well-defined concepts.
I first learned about CREAL3D at CES 2018 in a private meeting. They had a fascinating but crude green-only demo to prove that it was a true light field display. It was nice to see the progress they have made with a full-color demo this year.
CREAL3D’s demonstration is large because it is still on an “optical breadboard,” complete with a large metal plate with optics fixtures screwed into it. I’m not sure whether CREAL3D’s technology can be reduced to the head-worn form factor they claim, or whether the resolution and other image-quality aspects can be improved with newer display devices, but it is one of the more technically interesting things I have ever seen in display technology.
WaveOptics had a good demonstration of a series of compact diffractive waveguides using TI-DLP based displays. They showed a 40-degree FOV 720p headset with the display over the eyes and a lower-resolution, 25-degree FOV design with the displays in the temples. Both designs were very compact relative to other waveguide designs I have seen.
WaveOptics 40 degree FOV waveguides have been designed into Rokid’s Project Aurora Headset.
WayRay was demonstrating a true hologram mirror automotive HUD display. Once again, I need to emphasize the word “true” to distinguish it from Microsoft’s Hololens “marketing in name only” holograms. I was expecting it to be yet another “Pepper’s ghost” effect but was pleased to find it was a flat hologram acting as a curved, wavelength-selective mirror. Hyundai is planning on incorporating WayRay’s HUD technology into future automobiles. They use a TI-DLP illuminated by lasers to produce a narrow color bandwidth image that makes the hologram work well.
Eyelights in the French Pavilion was nice enough to take me for a test drive with their aftermarket HUD for cars. Eyelights has a very bright and ruggedized flat panel display with a simple semi-mirror combiner sheet that sticks to the windshield.
Eyelights’s design is in marked contrast to WayRay’s HUD. While Eyelights provides a very good and easy-to-see image even in most daylight conditions, it requires a large dark patch on the windshield, which may run afoul of some states’ safety codes that limit blocking the view out the windshield. As the picture above demonstrates, it also does not move the focus of the image out into the user’s far vision. WayRay’s use of holograms and a laser-illuminated DLP projector supports a much more transparent view of the real world and moves the focus of the HUD image into the driver’s far vision.
North’s Focals laser beam scanning glasses, which I wrote about a few months back, on the other hand, managed to fall below my worst expectations. Amazon invested in North, which is why Focals were in the Amazon showcase area at CES and garnered a lot of publicity.
North’s image quality was horrible any way you care to measure it: low resolution, poor contrast, a tiny eye box, and a smudge you look through. When asked about brightness, the response was, “we don’t talk about brightness,” which I translate to mean “it is not very good.” As I am fond of saying, “missing specs are usually ones that are not good.”
It has a tiny eyebox, which means you not only have to have the glasses custom fitted, but you must also wear them in an exact position or else you see a double image or no image at all. When the display is off, you are left looking through a smudge in your glasses. With a claimed 10-degree field of view, low contrast, and low resolution, a person would be infinitely better off with a smartwatch. North claims they are selling well and will be opening more custom-fitting stores soon; let’s just say I am very skeptical. All I can say is, “Amazon, call me first before throwing money at dubious display technology.”
Even though it was my old company, I was surprised to see a demonstration of a true 4K LCOS microdisplay. Once again, I’m using the word “true” to distinguish it from displays that use a lower-pixel-count device and then optically shift it. The display has 4K (3840 by 2048) pixel-mirrors on a very small 3.2-micron pixel pitch. They were demonstrating it on the show floor both with near-to-eye optics and in a projector.
Raontech was demonstrating their WQHD (2560×1440) field sequential color LCOS microdisplay. They also made a point of showing me that they no longer had the asymmetry I discussed in my article on the Lumus 1080p engine.
Bosch (using TI DLP), ASU (using Syndiant’s LCOS), and Microvision (using their laser beam scanning – see below) were all showing kitchen-countertop or speaker-on-table projectors. I don’t understand the use model for this type of application; it looks to me like it will only work (and not that well) in contrived demo setups.
All of these systems will have washed out pictures in good lighting and require a surface that can double as a decent projector screen. The typical tabletop or countertop hardly makes a good screen even when new, no less when it has been scratched from use.
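To see why ambient light washes these projectors out, here is a back-of-the-envelope sketch. All of the luminance values below are illustrative assumptions (not measurements of any product shown), and the model idealizes the surface as a uniform diffuser:

```python
import math

def screen_luminance(projector_lumens, screen_area_m2, gain=1.0):
    """Peak white luminance (cd/m^2) a projector produces on a matte
    surface of the given area; gain=1 is an ideal Lambertian diffuser."""
    return projector_lumens / screen_area_m2 * gain / math.pi

def ambient_luminance(ambient_lux, reflectance=0.8):
    """Luminance (cd/m^2) of room light reflected by the same surface."""
    return ambient_lux * reflectance / math.pi

def effective_contrast(proj_white, ambient):
    """Contrast ratio once ambient light lifts the black level; assumes
    the projector's own black is negligible next to the ambient."""
    return (proj_white + ambient) / ambient

# Hypothetical tabletop projector: 300 lumens onto a 0.25 m^2 patch
white  = screen_luminance(300, 0.25)   # ~382 cd/m^2
dim    = ambient_luminance(100)        # dimly lit room
bright = ambient_luminance(1000)       # brightly lit kitchen
print(f"dim room contrast:    {effective_contrast(white, dim):.0f}:1")
print(f"bright room contrast: {effective_contrast(white, bright):.1f}:1")
```

Even under these generous assumptions, the image collapses to a few-to-one contrast in kitchen-level lighting, which matches what the demos looked like.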
I made a special trip to see Microvision at the ShowStoppers event at the Wynn. Unfortunately, Microvision had pulled out of the event that day. The ShowStoppers staff told me that Microvision had tried to set up their demos but claimed that the room lights caused problems and left. I don’t know if it was the amount of light or some other problem, but Microvision decided it was better not to show.
Microvision’s demos were supposed to be, according to their news release, of an “Interactive Display Engine” and high-resolution Lidar for depth sensing, including for automobiles. The Interactive Display Engine consists, according to their marketing material and videos, of a projector shooting onto a tabletop with time-of-flight (ToF) detection of very simple interactions. This application is inferior in just about every way to a touchscreen, both in display quality and in interaction/touch interface. They should stick a fork in this concept because it is long past done (see also my comments on similar concepts from Bosch and ASU above).
I don’t follow the Lidar market, so I don’t know how their unit compares to those from the many other companies working on laser-scanning Lidar.
Hisense’s Dual LCD TV looks like it could have a big future. Hisense sandwiched a 4K color LCD on top of a 1080p black-and-white LCD to give an LCD panel OLED-like blacks while supporting higher peak brightness using Quantum Dots. While using two LCDs in series has been theorized for years as a way to achieve high contrast, Hisense has figured out how to make it work. I would not be surprised to see other companies, particularly Samsung with their QLEDs, adopt similar methods. For more information, I would suggest the Display Daily article on the Hisense Dual LCD.
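The reason stacking two LCDs pays off so dramatically is that, to a first approximation, the contrast ratios of in-series panels multiply. A minimal sketch, with assumed panel figures (not Hisense-published specs):

```python
# Why two stacked LCDs give OLED-like blacks: each panel attenuates the
# backlight independently, so their contrast ratios roughly MULTIPLY.
# The panel figures below are illustrative assumptions.

def stacked_contrast(front_contrast, rear_contrast):
    """Idealized contrast of two in-series LCD panels (ignores leakage,
    scatter, and alignment losses, which reduce the real-world gain)."""
    return front_contrast * rear_contrast

color_4k   = 5000   # assumed native contrast of the 4K color panel
mono_1080p = 2000   # assumed contrast of the 1080p monochrome panel
print(f"combined: {stacked_contrast(color_4k, mono_1080p):,.0f}:1")
```

In practice inter-panel scatter keeps the result well below the ideal product, but even a fraction of it leaves conventional single-panel LCDs far behind.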
Hisense was also one of several companies showing a very short throw “Laser TV.” Compared with the image quality and relatively low cost of LCD and even OLED TVs, the short throw laser projector TV seems to me too little, too late. Hisense was using a high-gain, light-rejecting screen, which helps but also resulted in hot-spotting and a limited viewing angle. It might have been a great product 10 years ago.
Lemnis is developing software to address vergence/accommodation. They are hardware agnostic and have been working on eye tracking that controls focus-adjusting hardware. They had a working demonstration in their suite that uses eye tracking to detect the eyes’ vergence and then controls a focusing mechanism in the optics.
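The geometry behind this kind of vergence-driven focus control is straightforward: the eye tracker reports the angle between the two gaze vectors, and the fixation distance follows from the wearer’s interpupillary distance. A minimal sketch with assumed values (Lemnis has not published their actual method):

```python
import math

def fixation_distance_m(ipd_m, vergence_deg):
    """Distance to the point both eyes converge on, from the total
    vergence angle between the two gaze vectors (symmetric fixation)."""
    return (ipd_m / 2) / math.tan(math.radians(vergence_deg) / 2)

ipd = 0.063  # 63 mm, a typical adult interpupillary distance
for angle in (7.2, 3.6, 1.8, 0.9):
    d = fixation_distance_m(ipd, angle)
    print(f"vergence {angle:4.1f} deg -> focus at ~{d:.2f} m "
          f"({1 / d:.2f} diopters)")
```

The variable-focus optics would then be driven to the matching dioptric power, so that accommodation agrees with where the eyes are converged.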
I plan to follow up with more detailed articles on some of the products highlighted above. Also, as stated above, I am working on a description of what I think the next Hololens is and is not going to be using in its new design, expected sometime in 2019 (reportedly as early as Mobile World Congress 2019).
I would like to thank Ron Padzensky for reviewing and making corrections to this article.