Hololens 2 Video with Microvision “Easter Egg” Plus Some Hololens and Magic Leap Rumors

Introduction

My next article was going to be about the view through AR devices, but a 1-hour video of Alex Kipman, Microsoft Fellow, discussing the Hololens 2 (HL2) at ETH on October 3rd, 2019 (ETH Video) was just released. I decided it was a good time to write about the display information covered in the ETH video, along with other information on Hololens and Magic Leap that I have heard over the last 6 months.

This article follows up on information in my February 2019 article on the HL2. In researching this article, I came across an April 28th, 2019 video by Kipman called, “What improvements have we made on the display engine in HoloLens 2?” (April Video).

AR Cloud Computing Means Billions of Financing Dollars to Hololens and Magic Leap

Classifying Hololens now as an “enterprise” market product, as Google Glass did before it, is a way of hiding the ball in terms of market success, as the numbers will be hard to track. After all, Hololens 1 only sold about 50K units in its first two years, or only 25K units per year, and we only know this because Microsoft reported the numbers. What I didn’t pick up on at the time is that there is a new rationalization for Hololens based on Azure, Microsoft’s cloud computing platform. It’s not clear to me how a low-volume headset is going to drive the need for cloud computing, but apparently Microsoft thinks so.

Similarly, Magic Leap seems to have found a gold mine in selling their AR as a driver of 5G cloud computing. In April, Magic Leap announced a $280M deal with NTT Docomo of Japan as a follow-on to deals with AT&T and SK Telecom. I hear that Magic Leap has raised about $1B (including NTT) with this strategy. Estimates are that Magic Leap has only sold about 5,000 to 7,000 units to date. But then, who needs to sell a product when there are gullible telecom executives? Magic Leap needs this kind of money as they have already burned through almost all of their original $2.4B and have a burn rate estimated to be between $600M and $800M per year. That $1B will buy Magic Leap a little more than one year to find another way to raise money.

Mostly Silence from Microsoft Since Announcing Hololens 2 (HL2) in February 2019

It is unusual for a large company to make a major announcement of a new product and then not have it ready to ship more than 7 months later (as of this writing). According to TechRadar on August 30th, 2019, “Microsoft exec confirms HoloLens 2 is coming in just a few weeks,” but that was over 5 weeks ago. It certainly feels like something is not going to plan.

HL2 Rumors of Mirror Problems

I have multiple sources that have said that Microsoft is having problems with the laser beam scanning mirror(s). While this is still just a rumor, it could explain why the HL2 has not yet shipped. There is speculation that a small and tightly controlled “shipment” may happen soon for face-saving rather than wide availability. Optics are almost always glued together to maintain precise alignments, and if a mirror failed due to lifetime issues, they would have to write off at least the whole optical unit for one eye. Shipping a lot of units with a lifetime issue can be disastrous, both financially and for the reputation of the product.

No Direct Love Again for Microvision, But Gives Them a Visual “Easter Egg”

Quoting the ETH Video (with my bold emphasis) at 11:37, “This thing we invented with Hololens 2. The technology behind it is called MEMs.” Similar to Kipman’s comments back in February, he is talking as if Microsoft invented laser beam scanning and simply changed its name to “MEMs.” As I pointed out back in February, all the evidence pointed to the HL2 using Microvision technology.

It appears to be a thing with Microsoft to change the name and then act as if they invented it. Thus they call Laser Beam Scanning “MEMs,” 3-D stereo images “Holograms,” and change FOV from a linear to an area measurement (so they can square the change in FOV).

In a thread on the Microvision stock subreddit on Reddit, user “nerdwithoutglasses” posted an image based on a Facebook picture identifying the Microvision name in two places. I created the figure below based on this information, but using a better image from a still frame of the ETH Video.

Even with a better picture, it is not 100% conclusive, but more like 99%. In the upper right corner, the larger “MicroVision” is partly covered up by a sticker with “REV: F” on it. Tantalizingly, with the prefix “Micro” visible, the rest of the word might be Microsoft or MicroVision, but the exposed top of the “V” points to Vision. Perhaps more conclusively, in the middle of the photo there is what appears to be a smaller “MicroVision” that is not covered up; the “V” is clearly visible, and the “ision” is readable.

As I wrote back in February, I don’t understand all the gamesmanship of Microsoft acting like they invented the laser scanning engine. It seems to be hiding in plain sight that they have some relationship with Microvision. My current best theory (no sources) is that Microsoft licensed the technology from Microvision and used Microvision’s electronics for the prototype, but then Microsoft redesigned all the electronics to get it to fit in the headset. Microsoft has also likely redesigned the laser scanning engine and taken control of its manufacturing.

For those that don’t know, Microvision is a 26-year-old “startup” that at last look was trading at about $0.70/share with a market cap of about $80M and cumulative losses of over $500M. Since Microsoft has not bought Microvision, they likely found a way to be contractually insulated from Microvision failing or being bought out. It’s also hard to see how the HL2 could be a significant revenue source for Microvision anytime soon, with the HL2 targeting low-volume enterprise applications. Like Magic Leap, Microvision has a history of having more success selling stock than selling products.

I should add that I have a “colorful” history with Microvision, having reported on them in this blog for nearly 8 years.

Faking It When Describing Laser Beam Scanning

There are at least 3 Kipman videos where he describes MEMs-based laser beam scanning, and listening to any of them is painful for me. I’m sorry, but while Mr. Kipman is probably very skilled at other things, he did not do his homework on Laser Beam Scanning (LBS). He calls Laser Beam Scanning systems “MEMs,” when MEMs means Microelectromechanical Systems. Both LBS and Texas Instruments’ DLP are display systems that use MEMs devices but work very differently. Once again, he is rewriting the dictionary.

In the latest ETH video, he says the fast mirror moves at “12,000 cycles per second” (which he said at two different times). 12,000 cps is ludicrously slow for the fast mirror at the resolution the HL2 is claiming, something anyone with any significant knowledge of scanning displays should know (old NTSC television, with only 262.5 scan lines per field, had a horizontal scanning frequency of 15,750 Hz). Below is a transcript from the ETH video.

12:20 Well, we shoot lasers out, red, green, and blue, one photon. And then we hit two mirrors.

He appears to sprinkle the word “photon” everywhere to sound scientific. I wonder how even Microsoft can control individual photons. Continuing:

12:24 First we hit a fast scanning mirror, and by fast scanning I mean it goes at 12,000 Hertz . . . The purpose of that mirror is to take those lasers, just one photon right, and spread them across the horizontal part of the display. And then those lasers hit another mirror; we call it the slow scanning mirror. It only goes at about 120 times per second, which will then squirt those same photons in the vertical. Thus creating a virtual display . . . We are going to build what that display looks like on the back of your eye, which is how you ultimately see a Hologram.

14:12  Mems based technology, three lasers, red, green, and blue. One hits a fast scanning mirror at 12 thousand times a second. It then hits another one at 120 times per second.

Microsoft told the press in February that the fast mirror moves at 54,000 cycles per second, and this number makes some sense. This frequency is still too slow to eliminate flicker (as I explained in detail in February) and support the claimed resolution, but at least it is not completely silly. Yet in Kipman’s April Video, he said the fast mirror oscillates at “24,000 cycles per second,” which is also silly, and he then totally gaffes by saying the slow mirror moves at “6,000 cycles per second” when it is more like 120 cycles per second. And in the ETH Video, he gives yet another number, 12,000 cycles per second, for the fast mirror.
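To see why 12,000 is not even in the right ballpark, below is a back-of-the-envelope sketch (in Python) of the required fast-mirror rate. The assumptions are mine, based on the publicly reported numbers rather than anything Microsoft has published: a bidirectional resonant scan that draws one line in each sweep direction, 120Hz interlaced operation so the full 1440-line frame refreshes at 60Hz, and roughly 80% of each mirror cycle being usable.

    # Back-of-the-envelope check of the required fast-mirror frequency.
    # Assumptions (mine, not Microsoft's): bidirectional scan, 120Hz interlaced.
    lines_per_frame = 1440    # claimed vertical resolution
    full_frame_rate = 60      # 120Hz interlaced = 60 full frames per second
    lines_per_cycle = 2       # a bidirectional scan draws a line in each sweep direction
    scan_efficiency = 0.8     # assumed usable fraction of each mirror cycle

    required_hz = (lines_per_frame / lines_per_cycle) * full_frame_rate / scan_efficiency
    print(f"Required fast-mirror rate: {required_hz:,.0f} Hz")  # ~54,000 Hz

The result lands right at the 54,000 cycles per second given to the press, which is why that number “makes some sense” while 12,000 and 24,000 do not.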

In the category of “fascinating physics,” I would put Kipman’s remarks at 11:45 in the ETH Video: “Lasers are like coherent light. Very digital for formating, it’s like on or off; it’s not a nice bell curve, which is what you get with LEDs.” Between this, the ridiculous numbers, and the single photons, my ears started to bleed. Maybe I’m being too picky, but it seemed like he was trying to bluff his way through the whole section on the display.

HL2 Image Quality

As I have written on this blog for many years, I don’t take reports by people without a background in displays as being credible. Well-crafted demos often fool casual observers and even some experts.  Companies usually choose to show only demos of content and situations that have been pretested to show the product at its best, and that may hide serious flaws. I have only gotten bits and pieces of reports on the image quality.

I have had reports of noticeable flicker, as I predicted back on February 24th, 2019. The scan mirror speeds are too slow for there not to be flicker, which is a human-factors problem that can be serious for many people.

I have heard multiple reports that the image uniformity on the HL2 is as poor as on the HL1 due to the diffractive waveguide. Interestingly, Hololens has been featuring changing-color text in some of their presentations, including the ETH video (see left). Note how the wavy color effect seems to mimic the picture I took of what was supposed to be a white background (lower left). As one person told me, “Since they couldn’t fix the image uniformity, they decided to feature it in their marketing.”

Note also that he is still using the “2x” FOV claim that he was criticized for back in February. For everyone else in the industry, FOV is an angular measurement, usually measured in degrees horizontally, vertically, or diagonally. To get to 2x, Microsoft changed the definition to mean area. The FOV increase from HL1 to HL2 was about 1.4x as everyone else measures it. It was a marketing fail in February, and Microsoft should drop it.
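The relationship between the two ways of stating the claim is just a square root, as this quick check shows:

    import math

    # "2x the FOV" as Microsoft states it is an *area* ratio; the conventional
    # linear (angular) increase is its square root.
    area_ratio = 2.0
    linear_ratio = math.sqrt(area_ratio)
    print(f"Linear FOV increase: {linear_ratio:.2f}x")  # ~1.41x, i.e., about 1.4x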

I have also had reports that it is impossible to read small text, suggesting the resolution is much less than what Hololens claims. Kipman, in his April Video, stated at 0:12:

“The number that we picked for HoloLens, which is the highest number of any headset out there is 47 pixels per degree of sight. Now, this number is important because this is the number that allows you to read 8-point font on a holographic website.”

First of all, there are several headsets with about the same as or higher than 47 pixels per degree, so the claim is wrong. Second of all, “reading an 8-point font on a holographic website” is meaningless, as one could zoom in by moving closer to a virtual screen. The font size should be expressed in degrees or other terms that lock in the size as an angle. For Microsoft’s 8-point Arial font, a letter “T” is 8 pixels high (plus spacing). So what we want to know is whether one can read the letter “T” when it is 8/47 ≈ 0.17 degrees tall.

As I wrote back in February, laser scanning proponents tend to talk in terms of scanning resolution, but the delivered effective resolution is typically about ½ that claimed due to the scanning process. Both my previous analysis of LBS systems and the reports I am receiving suggest this is the case.
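Putting the last two points together, here is the simple arithmetic for the angular size of that “T” and for the effective detail if, as I estimate, scanning roughly halves the delivered resolution (the ½ factor is my number from past LBS analysis, not a Microsoft figure):

    claimed_ppd = 47                   # Microsoft's claimed pixels per degree
    letter_height_px = 8               # capital letter height in 8-point Arial
    letter_height_deg = letter_height_px / claimed_ppd
    effective_ppd = claimed_ppd * 0.5  # my estimate: ~1/2 of claimed due to scanning

    print(f"Letter height: {letter_height_deg:.2f} degrees")          # ~0.17 degrees
    print(f"Effective detail: ~{effective_ppd:.1f} pixels per degree")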

No Reports of Laser Speckle

The one surprise so far is that I have not yet heard any reports of laser speckle issues, which would be expected. Several people that should know have reported that while the image uniformity was poor and the text was hard to read, they did not see noticeable speckle.

Interesting Human Factor and Thermal Management Information

At about 43 minutes into the ETH video, there is some interesting information on the headset design and human factors, in particular the way they show the variations in head size, forehead-to-eye distance, and skull shape (see figures on the left). These factors are a significant challenge for any headset designer. Additionally, there are wide variances in interpupillary distance (IPD) that don’t necessarily tie directly to head size (you can have a large head with a narrow IPD, for example).

Kipman points out that it was very challenging to make a single HL2 fit about 95% of all people. One area where the HL1 is impressive is its adaptability to various head sizes while giving good eye relief to allow people to wear most normal glasses. By comparison, the Magic Leap One has very little eye relief. Thus it is practical to have applications that involve sharing with the HL1 and HL2, whereas it is not possible with Magic Leap. While the image quality is poor on both Hololens and Magic Leap, Hololens is much better thought through as a “product.”

Other Things in the ETH Video

Kipman talks about the cameras, the HL2’s SLAM (with more heavy use of the word “photons”), and how it captures audio, and there is a “marketing message” about Azure cloud computing. Perhaps the most interesting part of the talk comes at about 49 minutes into the video, when he talks about thermal management, including a time-lapse thermal video.

Hololens Escaped From the Lab

I have always felt that Hololens is a classic example of something that has “escaped from the lab,” a concept I defined back in 2012 that seems to fit Hololens 1 and 2:

“Escaped from the lab” – This is the demonstration of a product concept that is highly impractical for any of a number of reasons including cost, lifetime/reliability, size, unrealistic setting (for example requires a special room that few could afford), and dangerous without skilled supervision.  Sometimes demos “escape from the lab” because a company’s management has sunk a lot of money into a project, and a public demo is an attempt to prove to management that the concepts will at least one day appeal to consumers.

Hololens 1 got off on the wrong foot with me by perverting the word “Hologram.” They didn’t improve matters with the HL2 by saying they had 2X the FOV when, by the conventional way FOV is specified, they only had about 1.4X improvement in FOV. Microsoft also made many beginner-type mistakes with the ergonomics and user interface on HL1 that they are looking to address with the HL2.

Disclosure

I want to disclose that I am working as Chief Science Officer for RAVN, a company working on AR headsets for military and first-responder applications. Since Microsoft Hololens secured a large contract with the U.S. Army, Microsoft could be considered a competitor to RAVN. I started covering Hololens several years before my involvement with RAVN. The views expressed in this article represent my own opinion and analysis and not those of RAVN.

I want to add that RAVN’s founder and CEO is a former Navy SEAL with several deployments in Iraq and Afghanistan. He comes with a sense of what a soldier in the field would wear, as opposed to what someone in an R&D lab theorizes they should wear.

Karl Guttag

28 Comments

  1. I did all the HoloLens 2 demos at the Barcelona MWC event in February. I also participated at the Redmond Microsoft Build event in May, which included a workshop on the HoloLens 2, and we were able to work on the HoloLens for a few hours. So I’ve tried in total about 9 different devices, and there was always an issue with the image for me when I compared it to the HoloLens 1 display. I’ve been using the HoloLens 1 on a daily basis for the last 3 years, so the difference was instant. I can only explain it as the image being not too crisp and some sort of flickering (like seeing black lines on an old TV). I don’t know about laser speckle. As for the release, I believe there will be an announcement on November 4-8 during the Microsoft Ignite event. The last we heard is that the delay was due to FCC approval, but who knows, it could be a production issue…

    • Thanks for the report on what you have seen.

      Based on the reported 54,000 cycles per second and the resolution, it seems obvious from the numbers I worked out back in February (https://www.kguttag.com/2019/02/27/hololens-2-first-impressions-good-ergonomics-but-the-lbs-resolution-math-fails/) that there will be flicker. Literally the day I put up my flicker analysis, I got someone reporting flicker. There is a wide variance between people in how sensitive they are to flicker. I know of people that cannot be in the same room with a laser scanning projector because the flicker drives them crazy. I’m actually a bit surprised they went with a display with flicker. Back in the mid-1990s, Europe put limits on the amount of flicker you could have from a computer monitor at 75Hz non-interlaced. Based on the available information on the HL2, it is 120Hz interlaced and will have a significant amount of 60Hz energy.

      I’m curious if you had any experience reading small text characters as you would see on a computer monitor or cell phone. The reports I have so far are that small text is unreadable.

  2. Thanks for your interpretations of Kipman’s talk relating to the applied LBS technology in HL2, Karl.

    As the former CTO of Syndiant, developing the competing LCoS technology, what is your opinion on why Microsoft put aside LCoS (HL1) and switched to LBS in HL2?

    Thanks and regards

    Timothy

    Reference:
    Alex Kipman (MSFT) @ ETH 3/10:
    11:45: “We moved away from Hololens1 where we had LEDs with an LCoS base system (…).”

    • For the record, I have been out of LCOS for almost 8 years now, and the system design I did was at Navdy, where I picked DLP based on the application requirements. My sense is that the long-term technology for displays will be MicroLEDs, but they are not ready yet.

      I don’t know why and can only speculate. I think I heard them say at one time that it was about brightness, but then Hololens 2 is not much brighter than Hololens 1 with LCOS.

      I suspect that it is a combination of “the grass is always greener on the other side” and maybe a bunch of researchers with too much money. The field sequential LCOS displays used in HL1 are far from perfect, and we will find out that the HL2 LBS display is far from perfect if and when the HL2 ships.

      I think the flicker and lack of resolution could be fatal flaws to HL2 even if and when they fix the reliability problems.

      I’m also concerned about the long-term exposure of laser beam scanning into the eye. Safety standards for lasers are based on accidentally hitting the eye with a scanning laser and not long-term exposure where the beam is continuously aimed into the eye. A scanning laser beam at any instant in time is going to be several million times brighter than an area display (all the light energy of the display is concentrated into an area smaller than one pixel). This is an honest concern, and I have not been able to find any studies pro or con on this issue.
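      To give a rough sense of where “several million times” comes from (my arithmetic, using the commonly reported 2160×1440 resolution): if all of the display’s light is concentrated into a spot about one pixel in size at any instant, the instantaneous intensity scales with the total pixel count.

        # Rough scaling argument (my arithmetic, not a safety analysis):
        # concentrating the display's light into a one-pixel spot raises the
        # instantaneous intensity by roughly the total number of pixels.
        pixels = 2160 * 1440
        print(f"Instantaneous concentration factor: ~{pixels:,}x")  # ~3.1 million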

      There have been many attempts at making LBS systems over the last 20 years (including big names like Samsung and Hitachi). LBS may seem new to some because past attempts failed so badly. LBS looks good on paper and sounds simple, but the devil is in the details.

      • So your expert opinion is you have no idea other than they have a lot of money and thought LBS looked better from a distance?

        At least MSFT knows why they went with LBS. Here is MSFT’s Zulfi Alam explanation in May:

      • Thanks for the reference. I was asked why they switched, which requires a lot of conjecture on my part as I don’t know their motives. I did miss the valid point of higher contrast. I know of about 25 years of history of various companies chasing the pot of gold at the end of the rainbow that is laser beam scanning. LBS has always worked better in theory than in practice.

        At least Zulfi Alam knew that the high-speed mirror moves at 54,000 cycles per second, something Alex Kipman has not been able to remember. Like Alex Kipman, he talks as if Microsoft invented it themselves, with no credit to Microvision.

        His claims for FOV, resolution, and brightness, at least so far, have proven to be bogus. What limits the FOV is the waveguide’s TIR; that is why they went with the “butterfly” waveguide. Companies have gotten much brighter than 1,000 nits with both LCOS and DLP with waveguides. There are multiple reports, including one from Reda in the comments below, that the resolution is not very good. I’m looking forward to testing it with 8-point fonts. As I tried to explain back in February, 54,000 Hz is too slow a fast-scan rate to support their claimed resolution (see: https://www.kguttag.com/2019/02/27/hololens-2-first-impressions-good-ergonomics-but-the-lbs-resolution-math-fails/).

        As I also pointed out based on the scanning frequency numbers in February, and as confirmed by others, the image flickers. This is a major flaw, as I think we will see. It’s not a simple matter of bending the mirror further and making it go faster; it is a microscopic electromechanical system beating at a resonant frequency, trying to scan the laser repeatedly at very high precision. To support the resolution they claim and without flicker, they need the mirror to go about 4X faster than it does, and that is not a simple task.

        There is validity to Zulfi Alam’s claim that laser scanning generally will have higher contrast than field sequential or color filter LCOS, as well as DLP, at least at the device level. Except for a black screen, it usually is not the “infinite contrast” that is claimed. You also have to consider other optical issues.

        Maybe I am being just a little harsh on Microsoft, and if I am, it is because they are fibbing about some things, which leads me to think that they are fibbing about other things.

        We will then have to see if Microsoft and its silent (gagged?) partners can make LBS in volume where others (no less than Samsung, Hitachi, and, way back, AT&T, among others) have failed thus far. If it were easy, one would be able to buy one today and test it, and a lot of money has been lost trying to slay the LBS dragon.

      • David, you shouldn’t take that guy seriously. Zulfi Alam was the executive responsible for the Microsoft Health Band, a product that was soooo bad it was discontinued by the company. Do you believe the same guy who was supposed to know about health and biometrics now is also an expert in optics?

  3. Excellent article Karl,
    I hope you don’t mind but I posted it to LinkedIn.
    I’ve been trying to get my hands on a HoloLens 2 for over a month now. Aside from filling in the order slip on their website, there’s been no other way of contacting anyone. I’ve managed to get through to a sales rep, who promised to get back to me within three days with some questions I asked. There’s been no sign of her again.
    I really hope that it’s a supply chain problem rather than a manufacturing one. HoloLens 2 is by no means perfect, but there do seem to be very few contenders on the market that we can turn to if it doesn’t work.

  4. Thanks Karl!! So great!! I’m confused about how the beam is expanded when it passes the second mirror. What are the two mirrors’ surfaces like? Glad to hear your answer~

    • I have uploaded a diagram of the Hololens 2 laser beam scanning (see link below). I have identified the fast and slow scanning mirrors. Due to the much higher speed (54,000 Hz), the fast scanning mirror is also smaller (the smaller the mass the faster it can move and change direction). The slow scanning mirror is larger so it can reflect the horizontally scanned beam. It is comparatively easy to build a larger mirror because it only has to move at 120 Hz. There are some other mirrors used as part of despeckling and other conditioning of the scanned beam and perhaps some image distortion correction.

      https://www.kguttag.com/wp-content/uploads/2019/10/Hololens-2-LBS-Projector-Diagram-001.jpg

  5. Excellent article Karl,

    1)about the fast scanning frequency:
    HL2 uses 2160×1440 @ 120fps. I guess the fast scanning mirror is actually physically 1080×1440 @ 60fps performance with a 0.8 scanning-pattern effect factor; in math that is 720×60/0.8 = 54,000 Hz.
    Here 720 is because both lines, in the left-right and right-left directions of a single scanning cycle, are used for display, so it comes to 1440 lines. I guess the officially quoted 120fps is not the true frame rate, but the binocular overlapped-area FPS. The frame frequency is still 60Hz… As for how the 1080p (or a little more for the bird waveguide stitching) turns into 2160, it is because they use polarized light or two different-wavelength RGB LDs projecting into the left and right sides of the waveguide…
    2) about the disappeared laser speckles:
    The resolution is 2160×1440 and the FOV is about 43 degrees. That means 43×60/2160 = 1.2 arcminutes per pixel, which is very close to the human eye’s resolving ability. We cannot even distinguish the pixel dots, so we also cannot distinguish the tiny speckles inside and mixed between pixels. But if we use larger-FOV (e.g., 80+ degrees) optics in front of the LBS, we can absolutely see obvious laser speckle… Actually, we have verified it before in our experiments, both using MEMS and our Fiber Scanning Laser Display.

    • Regarding 1), I explained how I think it works and the numbers behind it in https://www.kguttag.com/2019/02/27/hololens-2-first-impressions-good-ergonomics-but-the-lbs-resolution-math-fails/. It looks to me like Hololens is doing 120Hz interlaced, which means the whole screen is refreshed at only 60Hz. This was proven in the 1980s to cause flicker.

      Regarding 2), in my experience in dealing with speckle, I don’t believe the speckle can be made to disappear to the eye by being the same size as the pixel. I know that if you laser scan directly onto the eye, it does not cause speckle because there is no reflective/screen path with the interference that causes speckle. But as you introduce pupil expansion, one would expect to start seeing speckle. Alex Kipman of Microsoft has said that there are anti-speckle mirrors in the HL2 and that if they are “turned off” there will be speckle (I don’t know whether this is true or not). I’m very curious about the speckle issue because, based on my past experience, once a system has speckle it is extremely hard to eliminate. The only way I have seen that works with a projector is to vibrate the screen (this was done with the rear-screen-projected Laser TVs and with the Dolby Laser Theaters).

  6. question from a non-scientist.
    since there is flicker, why not install 2 sets of lasers and have them timed so that they are both flickering but at exactly opposite times, therefore creating a solid stream of light?
    ok. I may be a dumbass so go ahead, fire away

    • What you are suggesting would more than double the cost of the projector. Not only would you have two of everything, but then the two projectors would have to be perfectly aligned with each other.

      Some have suggested having multiple lines at a time, but once again you have to perfectly align two sets of lasers just one line apart.

  7. So Ram…the truck maker basically has 2 Ram trucks. The older model (pre-2019) just gets a little update, is called the “Classic,” but internally stays the same. Then they have the new updated version. Curious if Microsoft could switch back to the HL1 display but keep the new everything else? i.e., like Ram did, and sell 2 different display versions?

    I ask because the D365 platform and its Guides & Remote Assist software with spatial anchors (i.e., the cloud platform) is huge. My research has been in the use of Guides and whether techs would be interested, and the demand is massive. Over the last 6 months, I have demoed on HL1s with techs who would be spending their personal money to “buy the tool,” and I have a 100% buy rate. Literally, of the 115+ techs I have put it on and tested, every single one has said they would buy an HL1 if they could get the software. Which is why my company is testing. I guess, asked another way, is the old display technology too slow to run the rest of the new hardware?

    • I have not heard anything about Microsoft continuing to make the Hololens 1. I would think they would have said something if they were going to continue making them.

      Hololens is already an extremely low-volume product for Microsoft, so I doubt (but I have no direct information) that they would mix and match with the old hardware, whether it would work or not.

  8. Buenas noches.
    Excelente artículo. Pregunta.. MIcrosoft va a comprar a Microvisión o es un bulo? He leído que está utilizando su tecnología en la xbox, y lo van a presentar este jueves.. verdad?

    Muchas gracias.

    Google Translation:
    Good evening.
    Excellent article. A question: is Microsoft going to buy Microvision, or is it a hoax? I have read that they are using its technology in the Xbox, and they are going to present it this Thursday… right?

    Thank you very much.

    • I have said in the article what I know. Microsoft would be the most likely company to buy Microvision, but I have NO information that Microsoft has made an offer.

      It is a near certainty that Microsoft has a license to Microvision’s LBS technology. But Microsoft would have certainly required that, in the event that Microvision and its patents were acquired, Microsoft could continue to use the patents by paying a pre-agreed royalty. It is also a bit strange how Microsoft, particularly Alex Kipman, talks as if Microsoft invented the technology, with zero mention of Microvision.

      It is not clear that Microsoft’s volumes for laser beam scanning are going to be very large for a long time, if ever, and thus I would not expect the royalty value from the HL2 deal to be large.

  9. […] The Florida Independent is now saying they have confirmed that Microsoft is going to acquire Microvision. I would treat it as a rumor, but The Florida Independent claims it will be announced on May 12 (tomorrow). I found out about this news because their article cited this blog and used pictures from Hololens 2 Video with Microvision “Easter Egg” Plus Some Hololens and Magic Leap Rumors. […]

    • NOTE: This article is from May 20th, 2020, or over three years ago. It appears to be fake news that didn’t age well. Perhaps it was an analyst trying to pump the stock.

      I suspect that Microsoft negotiated for the rights to use the technology even if Microvision were acquired. They already got what they wanted and didn’t need to buy Microvision. The article cites: “MicroVision’s Financial Advisor Craig-Hallum expects the officially acquisition announcement sometime after hours on Tuesday May 12th [year 2020]. Microsoft’s acquisition of it is expected to be officially signed, sealed, and delivered by late August.”
