304 North Cardinal St.
Dorchester Center, MA 02124
First off, this post is a few weeks late. I got sick on returning from CES and then got busy with some other pressing activities.
At left is a picture that caught me next to the Lumus Maximus demo at CES, from Imagineality's "CES 2017: Top 6 AR Tech Innovations." Unfortunately, they missed that at about the same time the Lumus booth also had people from Magic Leap and Microsoft's Hololens (it turned out we all knew each other from prior associations).
Among Imagineality's top 6 "AR Innovations" were ODG's R-8/R-9 glasses (#1) and Lumus's Maximus 55-degree FOV waveguide (#3). From what I heard at CES and saw in the writeups, ODG and Lumus did garner a lot of attention. But by necessity, these types of lists are pretty shallow in their evaluations, and what I try to do on this blog is go a bit deeper into the technology and how it applies to the market.
The near eye display companies I looked at during CES included Lumus, ODG, Vuzix, RealWear, Kopin, Wave Optics, Syndiant, Cremotech, QD Laser, and Blaze (a division of eMagin), plus several companies I met with privately. As interesting to me as their technologies were their different takes on the market.
For this article, I am mostly going to focus on the industrial / enterprise market. This is where most of the AR products are shipping today. In future articles, I plan to go into other markets and do more of a deep dive on the technology.
I have had a number of people ask me what was the best or most interesting AR thing I saw at CES 2017, and I realized that this is at best an incomplete question. You first need to ask, "What problem are they trying to solve?" which leads to "How well does it solve that problem?" and "How big is that market?"
One big takeaway I had at CES, having talked to a number of different companies, is that the various headset designs were, intentionally or not, often aimed at very different applications and use cases. It's pretty hard to compare a headset that almost totally blocks a user's forward view but has a high resolution display against a lightweight information device that is highly see-through but has a low resolution image.
AR means a lot of different things to different people. In talking to a number of companies, I found they were worried about different issues. Broadly, you can separate them into two classes:
Most of the companies were focused on industrial / enterprise / business uses, at least for the near future, and in this market the issues include:
For all the talk about mixed reality (à la Hololens and Magic Leap), most of the companies selling product today are focused on helping people "do a job." This is where they see the biggest market for AR today. It will seem "boring" to the people wanting the "world of the future" mixed reality being promised by Hololens and Magic Leap.
You have to step back and look at the market these companies are trying to serve. There are people working on a factory floor or maybe driving a truck, where it would be dangerous to obscure a person's vision of the real world. They want 85% or more transparency, a very lightweight and highly comfortable design that can be worn for 8 hours straight, and almost no blocking of peripheral vision. If they want to fan out to a large market, they have to be cost effective, which generally means costing less than $1,000.
To meet these market requirements, they sacrifice field of view and image quality. In fact, they often want a narrow FOV so the display does not interfere with the user's normal vision. They are not trying to watch movies or play video games; they are trying to give a person doing a job the necessary information and then get out of the way.
I am often a hard audience. I'm not interested in the marketing spiel; I'm looking for the target market/application, the facts and figures, and how it is being done. I want to measure things, while the demos in the booths are all about trying to dazzle the audience.
As a case in point, let's take ODG's R-9 headset. Most people were impressed with the image quality from ODG's optics with a 1080p OLED display, which was reasonably good (they still had some serious image problems caused by their optics that I will get into in future articles).
But what struck me was how dark the see-through view of the real world was in the demos. From what I could calculate, they are blocking about 95% of the real world light. They are also too heavy and block too much of a person's vision compared to other products; in short, they are at best going after a totally different market.
Vuzix is representative of the companies focused on industrial / enterprise applications. They are using waveguides with about 87% transparency (although they often tint them or use photochromic, light-sensitive tinting). They also locate the image toward the outside of the user's view so that it interferes less even when an image is displayed (note in the image below-right that the exit port of the waveguide is on the outside and not in the center as it would be on, say, a Hololens).
The images at right were captured from a Robert Scoble interview with Paul Travers, CEO of Vuzix. BTW, the first ten minutes of the video are relatively interesting on how Vuzix waveguides work but after that there is a bunch of what I consider silly future talk and flights of fancy that I would take issue with. This video shows the “raw waveguides” and how they work.
Another approach in this category is RealWear. They have a "look-over" display that is not see-through, but their whole design is made to not block the rest of the user's forward vision. The display is on a hinge so it can be totally swung out of the way when not in use.
What drew the attention of most of the media coverage of AR at CES was how “sexy” the technology was and this usually meant FOV, resolution, and image quality. But the companies that were actually selling products were more focused on their user’s needs which often don’t line up with what gets the most press and awards.
Carl, I think that a Chinese company already cracked the code for naked-eye 3D display, see 4:32. This seems better than Magic Leap.
Can you comment on it?
It is just a reflection off glass/shiny-surface/mirror. Nothing special or 3-D.
Check the video; it is a real image, not virtual. Look at the gaps' relative locations as the video is being shot.
Is R-9 also based on beam splitter idea?
Yes, the ODG R-9 uses what is commonly known as a "birdbath" optical configuration (there are several variations). In the case of the R-9, they have a beamsplitter and a curved optical combiner. The user looks through both the beamsplitter and the curved combiner.
Here is a link to a schematic of their optical configuration: https://www.kguttag.com/wp-content/uploads/2017/01/ODG-9494800-Fig-169.png
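To make the light budget concrete, here is a minimal sketch (in Python) of where the light goes in a birdbath combiner. The 50/50 split ratios are assumptions for illustration only, not measured ODG R-9 values:

```python
# Illustrative light-budget sketch for a "birdbath" combiner.
# The 50/50 split ratios below are assumed for illustration,
# not measured values for any particular headset.

def birdbath_budget(beamsplitter_t=0.5, combiner_t=0.5):
    """Return (see_through, display_efficiency) fractions.

    Real-world light passes THROUGH the curved combiner and then
    THROUGH the beamsplitter, once each.
    Display light reflects OFF the beamsplitter, reflects OFF the
    curved combiner, then passes back THROUGH the beamsplitter.
    Absorption and coating losses are ignored.
    """
    beamsplitter_r = 1.0 - beamsplitter_t
    combiner_r = 1.0 - combiner_t

    see_through = combiner_t * beamsplitter_t
    display_eff = beamsplitter_r * combiner_r * beamsplitter_t
    return see_through, display_eff

see, disp = birdbath_budget()
print(f"see-through: {see:.1%}, display efficiency: {disp:.1%}")
# -> see-through: 25.0%, display efficiency: 12.5%
```

Even with ideal lossless 50/50 optics, at most 25% of the real-world light and 12.5% of the display light reach the eye, which is consistent with why birdbath designs tend to look dark in both directions.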
I came from this reddit post:
and I was wondering if you came away from CES with a clearer picture as to where the companies that you mention (Lumus, ODG, Vuzix, etc.) are sourcing their tech from? I.e., are companies going with LBS? Going with MEMS mirrors or the FSD?
Or are they going some other route, like towards DLP or LCOS?
As a follow-up, and to be more concise: is there a company out there that you think is (or is close to) being able to be the OEM from which larger companies source their baseline AR tech? I.e., Microvision (although I think I already know your opinion on them), Himax, etc.
Mostly I see AR companies retreating from very high volume consumer applications with words like "focusing on the enterprise and business," as happened with Google Glass and more recently Hololens. The companies making the LCOS and DLP devices that support the brightness necessary for AR have plenty of volume. Frankly, things like Hololens are both overkill and block a person's normal vision way too much for most business/enterprise applications, and the image quality is too poor for watching entertainment videos. People are projecting their unrealistic wishes onto AR based on things they may have seen in movies (done with digital compositing and not "real").
The only LBS headset I saw at CES was QD-Laser's prototype. Nobody building a volume product that I know of is even considering it. The one interesting aspect is that you can use it without vision correction (glasses), but there are big drawbacks in image quality, cost, and power consumption. I also wonder about the safety issues, particularly if they ever wanted to try for see-through applications.
For AR most companies are using LCOS with a few using DLP. ODG is using Micro-OLEDs in a so-called “see through” AR headset but the OLED has such low lumens that they have to block most of the ambient light and thus only barely qualifies as “see-through.”
AR/see-through needs 10 to 100 times the brightness (nits = candelas per meter squared). To be really see-through you want on the order of 90% transparency, which in turn means you are throwing away sometimes more than 90% of the image light. The highly transparent products are essentially building pico projectors to feed the optics. Right now the only technologies that can get there in a small form factor are LCOS and DLP. Consider that for outdoor headsets they want over 3,000 nits, and if you throw away 90% for transparency, you need to start with 30,000 nits. Micro-OLEDs that are in production typically have only about 200 nits, a factor of 150 less.
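The arithmetic above can be written out as a quick sanity check; the 3,000-nit outdoor target and the 10% image-light throughput are the figures from the paragraph above:

```python
# Back-of-envelope brightness math: the nits the user should see,
# divided by the fraction of image light the optics deliver,
# gives the nits the microdisplay itself must produce.

def required_display_nits(target_nits, image_light_kept):
    return target_nits / image_light_kept

# Outdoor headset: ~3,000 nits to the eye; highly transparent
# optics keep only ~10% of the image light.
needed = required_display_nits(3_000, 0.10)
print(f"display must produce ~{needed:,.0f} nits")  # ~30,000 nits

# A production micro-OLED at ~200 nits falls short by:
shortfall = needed / 200
print(f"factor of {shortfall:.0f}x short")  # 150x
```

The same formula makes it clear why a dimmer display technology forces the optics to block more of the real world: the only other way to balance the equation is to raise `image_light_kept` at the expense of transparency.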
The VR headsets are dominated by flat panel (cell phone sized) OLED and this will likely continue. Kopin is going to try and compete with Micro-OLEDs to make a smaller yet high resolution headset.
My takeaway, considering your comments in totality, is that the only viable tech that will be mainstream for a while will be the LCOS tech companies. Do you see LCOS companies going toward Micro-OLEDs to improve their product offerings going forward? And again, to clarify, these companies are marketing with a B2B strategy, correct?
"The only LBS headset I saw at CES was QD-Laser's prototype. Nobody building a volume product that I know of is even considering it. The one interesting aspect is that you can use it without vision correction (glasses), but there are big drawbacks in image quality, cost, and power consumption. I also wonder about the safety issues, particularly if they ever wanted to try for see-through applications."
This HMD is aimed at the approximately 250 million people who suffer from low vision. Of all the AR devices you saw at CES 2017, this is arguably the one that solves a great problem for many. It seems to be coming to market in March 2017.
Yeah yeah, “aimed” at 250 million, the Microvision theme song should be “Tomorrow” (see: https://youtu.be/s6ftQ-99F7k?t=118 )
The question is how many it will "hit" this time. I saw eSight, and I have seen EverGaze near eye displays for people with vision issues; they both have built products that people with vision impairment find very helpful and are excited to use. I don't know if that will be true for a laser scanning near eye display (yet to be seen/proven).
As for the broader near eye market, there seems to be near zero interest in LBS at this time.
I should add, I have even recommended to eSight and EverGaze that they LOOK AT LBS because of its unique attributes. But for now, they seem to feel the disadvantages outweigh the advantages of LBS in their application.
I would be concerned about the flicker with LBS as people with vision issues may also have flicker sensitivity.
Did you look at the new eSight 3, which uses an OLED microdisplay?
Version 1 used an eMagin 800 x 600:
The display uses a pair of 0.5-inch OLED micro-displays coupled to wide field prisms (both supplied by eMagin), offering a sharp 800 x 600 image with tolerance of differing users' pupil distances.
The eSight Version 3 FCC filing of 10/31/16 has a User Guide which indicates a 1024 x 768 microdisplay.
The timing of the eSight Version 3 corresponds with the timing of eMagin's new XGA release.
From eMagin's 3Q 10-Q filing:
We are completing qualification of a new 0.48 inch diagonal full color XGA format microdisplay utilizing the same proven 9.6-micron color pixel used in its WUXGA and SXGA096 product lines. This new product is targeted at industrial and commercial markets looking for a cost effective medium resolution microdisplay. Deliveries are scheduled to begin in the first quarter of 2017.
I looked and asked around a lot about OLED microdisplays. I think everyone these days is using Sony's OLEDs, and almost no one is using eMagin except for the Blaze night vision goggles (but Blaze is a division of eMagin). With Sony's size and their volume market in Micro-OLED for camera viewfinders, they have a huge cost and probably reliability advantage over the smaller players. By all the reports I get (I have sent people to check out eMagin in the past), eMagin is difficult and expensive to work with, which makes it that much easier for Sony.
MicroVision? Is the QD Laser Retissa covered by MicroVision's patents?
On November 2, 2016, MicroVision CEO Tokman stated the following during a conference call:
“As we look to expand our customer relationships and extend our technology to applications beyond Pico projection, we are very excited to announce that we have signed two agreements in the autonomous vehicle and augmented reality spaces with world leading technology companies.
Under one of these contracts, we will deliver a proof of concept prototype, our 3D sensing solution for Advanced Driver Assistance Systems for autonomous vehicle. Under the second contract, we will deliver a proof of concept prototype display for augmented reality application. The combined value of these contracts is nearly $1 million and both are expected to be completed in 2017. ”
On November 14, 2016, Mark Gurman broke the following about Apple AR.
“The company has ordered small quantities of near-eye displays from one supplier for testing, the people said. Apple hasn’t ordered enough components so far to indicate imminent mass-production, one of the people added.”
Sorry, but you are at best delusional if you think Apple is going to be using Microvision for near eye displays in a real product.
I would suggest you look at Microvision CEO Tokman’s track record. You might want to start with his 2010 statements such as:
“Given the current progress and estimates from makers of the green laser — a key component of Microvision projectors — an embedded device would be commercially viable between end-2011 and mid-2012, Microvision CEO Alexander Tokman said in an interview.”
"The good news is that there are about five global players now that are developing and will commercialize direct green laser by late-2011 to mid-2012," Tokman said.
He was wrong then. However he was spot on about the following.
From the same conference call:
“A final point that we’ve been getting a lot of inquiries about is laser safety, in particular on selling products with Sony engine in Europe. Using laser safety to combat LBS technology is an old tactic from our competition. The standards are very clear and we believe selling products with Sony LBS engine in Europe is permissible under current standards as long as the guidelines on marketing for children are observed and the products are properly labeled.”
Could it not be said that his statement during the conference call was in response to statements like the following?
“I’m saying that there is a zero chance of an embedded Class 3R laser product in a first world country.”
“This is why you will not see a Class 3R and likely not even Class 2 laser product in any cell phone in the 1st world. ”
After you made these statements, the Fusion smartphone was released in France.
Was your assessment wrong?
What part of "The standards are very clear and we believe selling products with Sony LBS engine in Europe is permissible under current standards as long as the guidelines on marketing for children are observed and the products are properly labeled" do you not see as problematical to selling a product to consumers in Europe?
What part of "I'm saying that there is a zero chance of an embedded Class 3R laser product in a first world country." don't you understand? Sony is not selling an embedded product, and Fusion is not a major company. We will see how Fusion does, and whether actions are taken against them.
Get back to us when Microvision sells enough to make a profit and not some low volume niche product.