My next article was going to be about the view through AR devices, but a 1-hour video of Alex Kipman, Microsoft Fellow, discussing the Hololens 2 (HL2) at ETH on October 3rd, 2019 (ETH Video) was just released. I decided it was a good time to write about the display information covered in the ETH Video, along with reporting other information on Hololens and Magic Leap that I have heard over the last 6 months.
This article follows up on information in my February 2019 article on the HL2. In researching this article, I came across an April 28th, 2019 video by Kipman called, “What improvements have we made on the display engine in HoloLens 2?” (April Video).
Classifying Hololens now as an “enterprise” market product, as Google Glass did before it, is a way of hiding the ball in terms of market success, as the numbers will be hard to track. After all, Hololens 1 sold only about 50K units in its first two years, or 25K units per year, and we only know this because Microsoft reported the numbers. What I didn’t pick up on at the time is that there is a new rationalization for Hololens based on Azure, Microsoft’s cloud computing platform. It’s not clear to me how a low-volume headset is going to drive the need for cloud computing, but apparently Microsoft thinks so.
Similarly, Magic Leap seems to have found a gold mine in selling their AR as a driver of 5G cloud computing. In April, Magic Leap announced a $280M deal with NTT Docomo of Japan as a follow-on to deals with AT&T and SK Telecom. I hear that Magic Leap has raised about $1B (including NTT) with this strategy. Estimates are that Magic Leap has only sold about 5,000 to 7,000 units to date. But then who needs to sell a product when there are gullible telecom executives? Magic Leap needs this kind of money, as they have already burned through almost all of their original $2.4B and have an estimated burn rate of between $600M and $800M per year. That $1B will buy Magic Leap a little more than one year to find another way to raise money.
It is unusual for a large company to make a major announcement of a new product and then not have it ready to ship more than 7 months later (as of this writing). According to TechRadar, on August 30th, 2019, “Microsoft exec confirms HoloLens 2 is coming in just a few weeks,” but that was over 5 weeks ago. It certainly feels like something is not going to plan.
I have multiple sources that have said that Microsoft is having problems with the laser beam scanning mirror(s). While this is still just a rumor, it could explain why the HL2 has not yet shipped. There is speculation that a small and tightly controlled “shipment” may happen soon for face-saving rather than wide availability. Optics are almost always glued together to maintain precise alignments, so if a mirror failed due to lifetime issues, they would have to write off at least the whole optical unit for one eye. Shipping a lot of units with a lifetime issue could be disastrous, both financially and for the product’s reputation.
Quoting the ETH Video (with my bold emphasis) at 11:37: “This thing we invented with Hololens 2. The technology behind it is called MEMs.” As with Kipman’s comments back in February, he is talking as if Microsoft invented laser beam scanning and merely changed its name to “MEMS.” As I pointed out back in February, all the evidence points to the HL2 using Microvision technology.
It appears to be a thing with Microsoft to rename a technology and then act as if they invented it. Thus they call laser beam scanning “MEMS,” 3-D stereo images “Holograms,” and changed FOV from a linear to an area measurement (so they could square the change in FOV).
In a thread on the Microvision stock subreddit on Reddit, user “nerdwithoutglasses” posted an image, based on a Facebook picture, identifying the MicroVision name in two places. I created the figure below based on this information, but using a better image from a still frame of the ETH Video.
Even with a better picture, it is not 100% conclusive, but more like 99%. In the upper right corner, the larger “MicroVision” is mostly covered by a sticker with “REV: F” on it. Tantalizingly, the visible prefix “Micro” could start either Microsoft or MicroVision, but the exposed top of a “V” points to “Vision.” Perhaps more conclusively, in the middle of the photo there appears to be a smaller “MicroVision” that is not covered up; the “V” is clearly visible, and the “ision” is readable.
As I wrote back in February, I don’t understand all the gamesmanship of Microsoft acting like they invented the laser scanning engine. It seems to be hiding in plain sight that they have some relationship with Microvision. My current best theory (no sources) is that Microsoft licensed the technology from Microvision and used Microvision’s electronics for the prototype, but then Microsoft redesigned all the electronics to get it to fit in the headset. Microsoft also has likely redesigned the laser scanning engine and has taken control over manufacturing the laser scanning engine.
For those that don’t know, Microvision is a 26-year-old “startup” that at last look was trading at about $0.70/share with a market cap of about $80M and cumulative losses of over $500M. Since Microsoft has not bought Microvision, they likely found a way to be contractually insulated from Microvision failing or being bought out. It’s also hard to see how the HL2 could be a significant revenue source for Microvision anytime soon, with the HL2 targeting low-volume enterprise applications. Like Magic Leap, Microvision has a history of having more success selling stock than selling products.
I should add that I have a “colorful” history with Microvision, having reported on them in this blog for nearly 8 years.
There are at least 3 Kipman videos where he describes MEMS-based laser beam scanning, and listening to any of them is painful for me. I’m sorry, but while Mr. Kipman is probably very skilled at something else, he did not do his homework on Laser Beam Scanning (LBS). He calls laser beam scanning systems “MEMS,” when MEMS stands for Microelectromechanical Systems. Both LBS and Texas Instruments’ DLP are display systems that use MEMS devices but work very differently. Once again, he is rewriting the dictionary.
In the latest ETH Video, he says the fast mirror moves at “12,000 cycles per second” (which he said at two different times). 12,000 cps is ludicrously slow for the fast mirror at the resolution the HL2 is claiming, something anyone with significant knowledge of scanning displays should know (old NTSC television, with only 262.5 scan lines per field, had a horizontal scanning frequency of 15,750 Hz). Below is a transcript from the ETH Video.
12:20 Well, we shoot lasers out, red, green, and blue, one photon. And then we hit two mirrors.
He appears to sprinkle the word “photon” everywhere to sound scientific. I wonder how even Microsoft can control individual photons. Continuing:
12:24 First we hit a fast scanning mirror, and by fast scanning I mean it goes at 12,000 Hertz . . . The purpose of that mirror is to take those lasers, just one photon right, and spread them across the horizontal part of the display. And then those lasers hit another mirror; we call it the slow scanning mirror. It only goes at about 120 times per second, which will then squirt those same photons in the vertical. Thus creating a virtual display . . . We are going to build what that display looks like on the back of your eye, which is how you ultimately see a Hologram.
14:12 Mems based technology, three lasers, red, green, and blue. One hits a fast scanning mirror at 12 thousand times a second. It then hits another one at 120 times per second.
Microsoft told the press in February that the fast mirror moves at 54,000 cycles per second, and this number makes some sense. This frequency is still too slow to eliminate flicker (as I explained in detail in February) and support the claimed resolution, but at least it is not completely silly. Yet in Kipman’s April Video, he said the fast mirror oscillates at “24,000 cycles per second,” which is also silly, and then he totally gaffes by saying the slow mirror moves at “6,000 cycles per second” when it is more like 120 cycles per second. In the ETH Video, he says the fast mirror is 12,000 cycles per second.
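For a sanity check on these mirror-speed claims, here is some back-of-the-envelope arithmetic in Python. The bidirectional-scanning assumption (a scan line drawn on each swing of the fast mirror) and the 120 Hz field rate are my assumptions for illustration, not Microsoft specifications:

```python
# Rough upper bound on scan lines per field for a laser-beam-scanning display.
# Assumption (mine): bidirectional scanning, i.e., a line is drawn on both
# the left-to-right and right-to-left swings of the fast mirror.

def max_scan_lines(fast_mirror_hz, field_rate_hz):
    """Upper bound on scan lines per field with bidirectional scanning."""
    return 2 * fast_mirror_hz / field_rate_hz

# Cross-check against old NTSC television: a 15,750 Hz horizontal rate at
# 60 fields per second gives 262.5 lines per field (unidirectional, no 2x).
print(15_750 / 60)                  # 262.5

# Kipman's ETH numbers: 12,000 Hz fast mirror, 120 Hz slow mirror.
print(max_scan_lines(12_000, 120))  # 200.0 -- nowhere near the claimed resolution

# The 54,000 Hz figure Microsoft gave the press in February:
print(max_scan_lines(54_000, 120))  # 900.0 -- at least a plausible line count
```

Even with the most generous assumptions, 12,000 Hz yields only about 200 lines per field; 54,000 Hz gets to about 900, which is at least in the neighborhood of the claimed resolution.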
In the category of “fascinating physics,” I would put Kipman’s remarks at 11:45 in the ETH Video: “Lasers are like coherent light. Very digital for formating, it’s like on or off; it’s not a nice bell curve, which is what you get with LEDs.” Between this, the ridiculous numbers, and the single photons, my ears started to bleed. Maybe I’m being too picky, but it seemed like he was trying to bluff his way through the whole section on the display.
As I have written on this blog for many years, I don’t take reports by people without a background in displays as being credible. Well-crafted demos often fool casual observers and even some experts. Companies usually choose to show only demos of content and situations that have been pretested to show the product at its best, and that may hide serious flaws. I have only gotten bits and pieces of reports on the image quality.
I have had reports of noticeable flicker, as I predicted back on February 24th, 2019. The scan mirror speeds are too slow for there not to be flicker, a human-factors problem that can be serious for many people.
I have heard multiple reports that the image uniformity on the HL2 is as poor as on the HL1 due to the diffractive waveguide. Interestingly, Hololens has been featuring changing-color text in some of their presentations, including the ETH Video (see left). Note how the wavy color effect seems to mimic the picture I took of what was supposed to be a white background (lower left). As one person told me, “Since they couldn’t fix the image uniformity, they decided to feature it in their marketing.”
Note also that he is still using the “2x” FOV claim that he was criticized for back in February. For everyone else in the industry, FOV is an angular measurement, usually given in degrees horizontally, vertically, or diagonally. To get to 2x, Microsoft changed the definition to mean area. The FOV increase from HL1 to HL2 was about 1.4x as everyone else measures it. It was a marketing fail in February, and Microsoft should drop it.
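The arithmetic behind the “2x” claim is easy to reproduce (a minimal sketch; the ~1.4x linear figure is the conventional angular comparison discussed above, not an official Microsoft number):

```python
# How a ~1.4x linear FOV increase becomes a "2x" claim once FOV is
# redefined from an angular (linear) measure to an area measure.
linear_ratio = 1.4
area_ratio = linear_ratio ** 2
print(area_ratio)   # 1.96 -- rounds up to Microsoft's "2x"

# Conversely, a claimed "2x" area increase is only ~1.41x in the
# horizontal/vertical angles a user actually perceives.
print(2 ** 0.5)     # ~1.414
```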
I have also had reports that it is impossible to read small text, suggesting the resolution is much less than what is claimed by Hololens. Kipman, in his April Video, stated at 0:12:
“The number that we picked for HoloLens, which is the highest number of any headset out there is 47 pixels per degree of sight. Now, this number is important because this is the number that allows you to read 8-point font on a holographic website.”
First of all, there are several headsets at about the same as or higher than 47 pixels per degree, so the claim is wrong. Second, “reading 8-point font on a holographic website” is meaningless, as one could zoom in by moving closer to a virtual screen. The font size should be expressed in degrees or other terms that lock in the size as an angle. For Microsoft’s 8-point Arial font, a letter “T” is 8 pixels high (plus spacing). So what we want to know is whether one can read the letter “T” when it is 1/(47/8) = ~0.17 degrees tall.
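To make the angular-size point concrete, here is the calculation as a short sketch (the 8-pixel cap height follows from the font metrics cited above; the factor-of-two derating in the second case is my estimate of LBS effective resolution, discussed in the next paragraph):

```python
# Angular height of a glyph on a headset, given pixels-per-degree (ppd).
def letter_height_deg(pixels_high, ppd):
    """Angular height in degrees of a glyph `pixels_high` pixels tall."""
    return pixels_high / ppd

# Microsoft's claim: 47 ppd, with an 8-point Arial "T" being 8 pixels high.
print(letter_height_deg(8, 47))      # ~0.170 degrees

# If LBS delivers only about half the claimed resolution (my estimate),
# the same legibility would require roughly twice the angular size:
print(letter_height_deg(8, 47 / 2))  # ~0.340 degrees
```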
As I wrote back in February, laser scanning proponents tend to talk in terms of scanning resolution, but the delivered effective resolution is typically about ½ that claimed due to the scanning process. Both my previous analysis of LBS systems and the reports I am receiving suggest this is the case.
The one surprise so far is that I have not yet heard reports of any laser speckle issues, which would be expected. Several people who should know have reported that while the image uniformity was poor and the text was hard to read, they did not see noticeable speckle.
At about 43 minutes into the ETH Video, there is some interesting information on the headset design and human factors, in particular the way they show the variations in head size, forehead-to-eye distance, and skull shape (see figures on the left). These factors are a significant challenge for any headset designer. Additionally, there are wide variances in interpupillary distance (IPD) that don’t necessarily tie directly to head size (you can have a large head with a narrow IPD, for example).
Kipman points out that it was very challenging to make a single HL2 fit about 95% of all people. One area where the HL1 is impressive is its adaptability to various head sizes while giving good eye relief that lets people wear most normal glasses. By comparison, the Magic Leap One has very little eye relief. Thus it is practical to have applications that involve sharing with the HL1 and HL2, whereas it is not possible with Magic Leap. While the image quality is poor on both Hololens and Magic Leap, Hololens is much better thought through as a “product.”
Kipman talks about the cameras and the HL2’s SLAM (with more heavy use of the word “photons”) and how it captures audio, and there is a “marketing message” about Azure cloud computing. Perhaps the most interesting part of the talk comes at about 49 minutes into the video, when he discusses thermal management, including a time-lapse thermal video.
I have always felt that Hololens is a classic example of something that has “Escaped from the lab,” a concept I defined back in 2012 that seems to fit Hololens 1 and 2:
“Escaped from the lab” – This is the demonstration of a product concept that is highly impractical for any of a number of reasons including cost, lifetime/reliability, size, unrealistic setting (for example requires a special room that few could afford), and dangerous without skilled supervision. Sometimes demos “escape from the lab” because a company’s management has sunk a lot of money into a project, and a public demo is an attempt to prove to management that the concepts will at least one day appeal to consumers.
Hololens 1 got off on the wrong foot with me by perverting the word “Hologram.” They didn’t improve matters with the HL2 by saying they had 2X the FOV when, by the conventional way FOV is specified, they only had about 1.4X improvement in FOV. Microsoft also made many beginner-type mistakes with the ergonomics and user interface on HL1 that they are looking to address with the HL2.
I want to disclose that I am working as Chief Science Officer for RAVN, a company working on AR headsets for military and first responder applications. Since Microsoft Hololens secured a large contract with the U.S. Army, they could be considered a competitor to RAVN. I started covering Hololens several years before my involvement with RAVN. The views expressed in this article represent my own opinion and analysis and not that of RAVN.
I want to add that RAVN’s founder and CEO is a former Navy SEAL with several deployments in Iraq and Afghanistan. He comes with a sense of what a soldier in the field would actually wear, as opposed to what someone in an R&D lab theorizes they should wear.