Intel AR “Fixer-Upper” For Sale? Only $350M ???

[Update Feb 5, 2018] – I have been informed that EPFL released a short video in 2012 (link here).

The video shows a large test rig with a person holding a mock-up of the glasses, talking about the wonderful things it will do someday. The video has about 5 seconds of a low-resolution, red-only image of a person in a dark room lit by dim green lighting. Based on the information in patent application US 2015/0362734, it was probably shot with a “contact lens” over the camera.



According to Bloomberg Technology, “Intel Is Said to Plan Sale of Majority Stake in AR Glasses Unit” (February 1, 2018). Apparently, Intel is trying to sell the unit off for $350 million. Intel put this unit together less than three years ago as EETimes reported in 2015, “Intel Buys Swiss Smartglasses Startups.” Brief excerpts from both these articles are shown with my markups (left).

Rent-A-Tech-Skeptic™ Before Throwing Away Money

Excuse me for putting in a plug for my services, what I am calling (tongue in cheek) Rent-A-Tech-Skeptic. Intel’s investments in VR and AR appear to be yet another example of the poor technical due diligence done when investing in AR (Magic Leap being the granddaddy of them all).

The whole concept, at a technical level, is fundamentally flawed. I have no beef with Intel; I would have been happy to have saved Intel hundreds of millions of dollars on their combined VR and AR investments if they had asked.

The Startups Intel Bought to Form the AR Unit

The Bloomberg article states (with my bold emphasis), “The spectacles will be able to display contextual information into the wearer’s field of view with a laser-based projector that reflects off the lens and onto the retina, the people said.” It also says, “Some former members of the Recon team are part of the division up for sale. It has about 200 employees in the U.S., Switzerland, and Israel.”

EETimes 2015 article states, “Lemoptix had been working with Composyt Light Labs SA when Intel acquired the latter at the end of 2014. Composyt was founded in 2014 to commercialize a see-through display architecture invented at EPFL [Ecole Polytechnique Fédérale de Lausanne]. The display integrates with normal eyewear and a smartglasses prototype developed by Composyt and Lemoptix includes direct retina projection and a patented holographic combiner [sic].” I think EETimes may have been confused, because Composyt/EPFL was the developer of the holographic combiner. Lemoptix developed the laser scanning mirror for their laser beam scanning (LBS) engine.

A clue that this is nowhere near as easy as it may sound is that they were publicly discussing it in 2014, yet there have been no public demonstrations in the 3+ years since. From what I can tell by looking through their patent applications, the whole concept is utterly impractical.

Intel’s Recent Forays Into VR and Non-See-Through “AR”

The reported sale of the AR unit comes not long after Intel disclosed in Sept. 2017 that “Intel’s Project Alloy VR headset is dead.” The closure came just about one year after “Intel shows off all-in-one Project Alloy virtual reality headset” (Aug 16, 2016). By July 2017, Intel had shut down Recon Instruments, which it had reportedly acquired for $175M just two years earlier. Not to put too fine a point on it, but Intel does not appear to have an excellent track record in judging the potential of near-eye display developments.

Diving Into The AR LBS Technology Intel Bought (How It Works)

The basic concept may seem simple, but in reality, it combines two bad ideas. The figure on the left, taken from EPFL patent application 2015/0362734 (‘734), shows the concept. A laser beam scanning projector (scanning mirror 107 shown) in the temple of the glasses projects light toward a reflective holographic film 111 that acts as a combiner, redirecting the projected light toward the eye while letting real-world light through.

The obvious question is why they are using a holographic film to redirect the light rather than a simple mirror like many other near-eye optical designs. With the projector on the temple, and to avoid being blocked by the user’s face, the projected light is going to hit the lens at a shallow angle. If they used a semi-mirrored combiner, the light would bounce off at an angle that misses the eye, as shown by the dashed red line labeled “Mirror Reflection.” Therefore they need this unique holographic film to redirect the light at a steeper angle.
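To make the geometry concrete, here is a small Python sketch of the miss. Every dimension below is my own illustrative guess, not a number from the patent: a flat lens at z = 0, an eye pupil about 18 mm behind it, and a projector out at the temple beside the eye. With plain specular reflection the ray lands far from the pupil, which is exactly why the hologram has to do the extra bending.

```python
import math

def normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def reflect(d, n):
    # Specular (law-of-reflection) bounce: r = d - 2*(d.n)*n
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2 * dot * b for a, b in zip(d, n))

# Hypothetical geometry in millimetres: lens in the plane z=0 with
# normal (0,0,1); the eye pupil sits 18 mm behind the lens centre,
# and the projector is out at the temple, beside the eye.
projector = (35.0, 0.0, 20.0)
hit_point = (0.0, 0.0, 0.0)          # where the ray strikes the lens
pupil     = (0.0, 0.0, 18.0)

incoming = normalize(tuple(h - p for h, p in zip(hit_point, projector)))
mirrored = reflect(incoming, (0.0, 0.0, 1.0))

# Follow the mirrored ray out to the pupil plane (z = 18 mm)
t = pupil[2] / mirrored[2]
landing = tuple(h + t * m for h, m in zip(hit_point, mirrored))
miss = math.hypot(landing[0] - pupil[0], landing[1] - pupil[1])
print(f"plain mirror lands {miss:.1f} mm from the pupil")  # ~31.5 mm: a clean miss

# The hologram instead has to steer the ray straight at the pupil:
needed = normalize(tuple(p - h for p, h in zip(pupil, hit_point)))
bend = math.degrees(math.acos(sum(a * b for a, b in zip(mirrored, needed))))
print(f"hologram must bend the ray by about {bend:.0f} degrees")
```

With these made-up dimensions the simple mirror misses by over 30 mm (an eye pupil is a few millimetres across), and the hologram has to bend the ray by roughly 60 degrees to compensate.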

At least, this is how it is supposed to work. In 2014, Lemoptix showed this concept (left), made in cooperation with Composyt/EPFL. Apparently, it is a mock-up, as there are no reports or videos of anyone using it.

Laser Beam Scanning – Lemoptix Technology

Based on patents and literature, Lemoptix’s laser beam scanning (LBS) technology is very similar to Microvision’s single-mirror LBS.

This blog has been debunking the myths about laser beam scanning (LBS) displays for over six years. For those that have not seen my prior articles, I would suggest reading my “Cynic’s Guide to CES — Measuring Resolution,” where I explain Microvision’s LBS scanning process and why the resolution is so poor. Other good back articles from this blog include Celluon Laser Beam Scanning Projector Technical Analysis – Part 1, Celluon Laser Beam Steering Analysis Part 2 – “Never In-Focus Technology,” and Celluon/Sony/Microvision Optical Path.

Still, the myths and lies about LBS persist. LBS has proven to be a useful technology for deceiving people because it looks so simple until you understand what is required to make it work. For those that want it short and sweet, here are some of the severe drawbacks of LBS:

  1. The resolution does not scale up – They can’t move the mirror fast enough or accurately enough. I have measured LBS technology time and again over the last eight years, and it has stayed far behind other technologies in resolution.
  2. Lasers are costly compared to LEDs by about a factor of 10X – There are also not the market volumes, particularly for green lasers to drive their cost down.
  3. The scanning process is a Lissajous pattern and not a rectangular raster – This means that all images have to be resampled onto the LBS’s non-linear scan, which causes an inherent Nyquist resampling loss.
  4. The pixel spacing is not uniform – This is inherent in the Lissajous scanning process and made worse by most companies using “interlaced” scanning.
  5. Flicker – Every laser projector I have seen has an issue with flicker. Most use 60Hz-interlaced scanning which causes the outsides of the image to be refreshed at only 30Hz. The low refresh rate combined with zero persistence causes significant 30Hz flicker.
  6. Small Exit Pupil/FOV – The laser light can only be seen over a minimal area, which means the position of the eye determines whether you see a bright image or none at all. The exit pupil is a severe issue with the Intel/EPFL design that I will expand on later.
  7. Shadows caused by “floaters” in the eye (in near-eye use) – The laser light has such a high f-number that anything in the eye, such as “floaters” that go unnoticed with “normal” light, casts sharp shadows.
  8. The scan of the laser beam is constantly accelerating and decelerating, crossing through zero velocity at each side of the horizontal and vertical scans. This also means the laser drive must drive the laser at a very low amplitude for the longer pixel times near the edges and very hard for the shorter pixel times at the center of the scan, where the beam is moving the fastest.
  9. Laser safety concerns, both real (and there are real ones) and as perceived by consumers.
  10. Laser Speckle – Lasers have coherent light that causes speckle which is extremely difficult to eliminate. Wider spectrum (line width) lasers can reduce speckle. Lemoptix has multiple patent applications directed at speckle reduction. No speckle reduction technique works perfectly. A vibrating screen works well for front projection, but that is not an option for near eye. All speckle reduction schemes come at the cost of image quality, size, and cost.
  11. Almost all other near-eye technologies are more power efficient – the efficiency losses with LBS are hidden in the drive of the lasers and the control of the mirror.
  12. Emitting technologies (OLED and Micro-LED) are much smaller, more compact, and with better power efficiency.
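Several of the drawbacks above (the non-rectangular scan, the non-uniform pixel spacing, and the accelerating/decelerating beam) are really faces of the same sinusoidal mirror motion. A tiny Python sketch, using illustrative numbers rather than any particular scanner’s specs, shows what happens when pixels are clocked out at a constant rate onto a sinusoidal sweep:

```python
import math

# A resonant scan mirror moves sinusoidally, so across one left-to-right
# sweep the beam position is x(t) = sin(2*pi*t) for t in [-0.25, 0.25]
# (normalized units). Clocking "pixels" out at a constant rate then gives
# non-uniform spacing: wide steps at the centre, tiny ones at the edges
# where the mirror reverses direction.
PIXELS = 20
positions = [math.sin(2 * math.pi * (-0.25 + 0.5 * i / (PIXELS - 1)))
             for i in range(PIXELS)]
steps = [b - a for a, b in zip(positions, positions[1:])]
ratio = max(steps) / min(steps)
print(f"widest step (centre):  {max(steps):.3f}")
print(f"narrowest step (edge): {min(steps):.3f}")
print(f"ratio: {ratio:.1f}x")
```

Even in this toy example there is roughly a 12× spread between the centre and edge step sizes; that spread is what forces the resampling loss and the wide-ranging laser drive described above.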

In searching the Lemoptix/LBS patents related to the Intel acquisition, I didn’t find anything particularly unusual compared to what Microvision has been doing. They have several different schemes for reducing speckle, but I doubt any of them work very well (and thus they keep filing new ideas).

In short, LBS does not solve near-eye displays; it just moves some of the problems around. It is a very high-cost way to achieve poor image quality. The one exception I might make is that for a select group of people with specific vision problems, it might be useful.

Holographic Mirror Combiner – Composyt/EPFL

As discussed previously, Composyt/EPFL developed a holographic film for bending the laser light toward the eye. Sounds simple, doesn’t it? Just sprinkle on some pixie dust, use the word “hologram,” and everything is solved, right? Wrong.

Just for a moment, I’m going to ignore what appears to be a major problem for this concept, namely the exit pupil, and just discuss the issues with a holographic combiner. The goal of a combiner is to combine light from the real-world with the display while minimally impacting the image quality of either source of light.

There is always a tradeoff with a combiner. A simple partial mirror combiner, for example, trades brightness of the real world for the brightness of the image.
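As a toy illustration of that tradeoff (the 5% loss figure and the reflectance values are made-up numbers, not measured coatings):

```python
# A simple partial-mirror combiner just splits the light budget:
# reflectance R serves the display path, and roughly T = 1 - R - loss
# of the real-world light reaches the eye. Illustrative numbers only.
LOSS = 0.05  # assumed absorption/scatter loss in the coating

def world_transmission(reflectance, loss=LOSS):
    """Fraction of real-world light that still reaches the eye."""
    return 1.0 - reflectance - loss

for r in (0.2, 0.5, 0.8):
    print(f"R={r:.0%}: display keeps {r:.0%}, "
          f"real world dimmed to {world_transmission(r):.0%}")
```

Make the display brighter and the real world dims, and vice versa; there is no free lunch with a simple partial mirror, which is part of the appeal of a wavelength-selective hologram, at the cost of the problems discussed next.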

In this case, the hologram is doing a lot of “work” to bend the light and, in some of the examples, diffuse it. There have to be negative effects on light coming at the holographic combiner from the real world, particularly light arriving at angles similar to that of the laser projector. One would also expect some scattering (blurring) and at least some color separation.

Simply put, the “pupil” is the area over which the light can be seen. When you look into a projector without a screen, all you see is either extremely bright light, if you are looking into the lens, or nothing. A projector screen diffuses the light and expands the pupil so you can see the image from different angles; the pupil is then very large. In the case of spatial light modulators going into a mirror, the light can be slightly diffused before it is focused and reflected to create a larger pupil. With diffractive waveguides, the pupil ends up being the size of the exit grating. You need a larger pupil so the image can be seen over a wider range of eye positions; a larger pupil makes fit less critical and is necessary for a wider field of view (FOV).

In this case, you can see how serious a problem pupil expansion is for the holographic combiner by the radical “solutions” proposed. Figures 3A and 3B below illustrate the exit pupil problem. If you follow the light ray lines with the filled-in/dark arrows related to laser light 206 in Figure 3A, you will see it enter the pupil of the eye, but if the eye moves as in 3B, the light misses the eye. To “solve” this problem, they project a second image, indicated by path 301 with hollow arrows, which misses the eye’s pupil in Figure 3A and enters the pupil in Figure 3B. The proposed solution is to track the eye and decide which image to project. But to do this, they then need three more lasers. As illustrated in Figure 6 (above right), they propose to keep adding lasers to continue expanding the pupil.
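The selection logic that Figures 3A and 3B imply can be sketched in a few lines. This is my own toy model, not code from the patent; the path names echo the figure labels (206, 301), and the millimetre values are invented for illustration:

```python
# Eye-tracked pupil switching: each replicated laser path delivers light
# to a different small exit-pupil position across the eye box, and the
# tracker picks whichever one currently overlaps the eye's pupil.
# Positions and sizes are made-up illustrative numbers (mm, 1-D eye box).
EXIT_PUPILS = {"path_206": -2.0, "path_301": +2.0}  # centre offsets
EXIT_PUPIL_RADIUS = 1.0
EYE_PUPIL_RADIUS = 1.5

def active_path(tracked_eye_centre):
    """Return the replicated path (if any) whose exit pupil overlaps
    the eye's pupil; otherwise the viewer sees nothing at all."""
    for name, centre in EXIT_PUPILS.items():
        if abs(centre - tracked_eye_centre) < EXIT_PUPIL_RADIUS + EYE_PUPIL_RADIUS:
            return name
    return None  # eye has moved outside every replicated exit pupil

print(active_path(-1.5))  # path_206
print(active_path(4.0))   # path_301
print(active_path(8.0))   # None
```

Note what the sketch makes obvious: every extra position in the eye box needs another replicated path (and, per the patent, more lasers), and any tracking error or lag means the viewer sees nothing.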

But it gets worse. Not only are lasers being replicated, but it is also optically complicated to get the light from the many lasers coaxial so that they can all hit the beam scanning mirror at the proper angle. They show a series of ways to try and make many lasers coaxial, and I have copied examples 9A, 9B, and 9D on the right. It is hard to imagine how this would ever work in practice, let alone at what cost. This is a desperate attempt at a solution.

In patent application 2015/0362734, they propose a different and less user-friendly alternative to pupil expansion: making the user wear a complex contact lens (see left). In this case, the holographic screen diffuses the light (similar to a projection screen), which expands the pupil.

The problem then becomes that it acts like a projection screen that is too near the eye and thus will be out of focus. To fix the focus problem, they propose the user wear a contact lens with a small central focusing lens (503) and a bandpass filter (505) to block real-world light from going through the middle. The outer part is then a lens to aid vision correction of the real world, but with a notch filter (504) to block the projector light while letting the real-world light through.

Ignoring for a second the complexity and cost of the contact lens required, this “solution” has a lot of problems. The most obvious is that it requires that the person want, and be able, to wear contacts. Secondly, the color filters can’t be perfect and are inevitably going to tint the real world. I also wonder how well the contact would work in bright sunlight, when the human pupil closes so that the central part of the contact covers a relatively large area, thus blurring and blocking the real world.
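The sunlight worry is easy to put rough numbers on. Assuming a central element about 1.5 mm across and typical human pupil diameters (all figures are my illustrative guesses, not values from the patent), the fraction of the eye’s pupil that the fixed central element occludes swings dramatically:

```python
# Fraction of the eye's pupil area blocked by the contact's fixed central
# element (lens 503 plus bandpass filter 505). Diameters in mm are
# illustrative guesses, not values from the patent.
CENTRE_ELEMENT_D = 1.5

def blocked_fraction(eye_pupil_d, centre_d=CENTRE_ELEMENT_D):
    """Area of the central element relative to the eye's pupil area."""
    return (centre_d / eye_pupil_d) ** 2

for pupil_d, condition in ((6.0, "dim room"), (2.0, "bright sunlight")):
    print(f"{condition}: central element covers "
          f"{blocked_fraction(pupil_d):.0%} of the eye's pupil")
```

With these assumed sizes, the central element covers only about 6% of a 6 mm pupil in dim light, but over half of a 2 mm pupil in bright sun, which is exactly the blurring/blocking concern raised above.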

The contact lens idea is not new to EPFL/Intel. Innovega has been trying to do something similar with contact lenses using conventional displays for at least as far back as 2013. Only the Intel/EPFL “solution” requires an even more complex lens with light filtering built in.

The two proposed ways to address pupil expansion demonstrate why you should never buy off on a simple mockup or figure. What starts out looking simple in a drawing can become very difficult by the time all the factors are taken into consideration.

It is not how much you have spent; it is how much it is worth

Have you ever seen a TV show where someone tries to sell off a customization project (say, a car or a house) on which they have spent crazy amounts of money to build something garish that only they would want? They are inevitably told, “It is not how much you have spent; it is how much it is worth.” This is a classic example of something that looks simple but is insanely impractical when you understand what is required to make it work.

BTW, I’m extremely pro-startup (I love startups), and I know how hard it is for startups to get funding. I understand big companies should take risks, but I like to see them invest in businesses and technologies that have real potential. Before you make a big investment in a new AR or VR company, you might want to Rent-A-Tech-Skeptic to check it out first.

Karl Guttag


    • Yes, Intel did buy an interest in Vuzix and later sold it [correction: they have not sold it].

      Interestingly, it is a completely impractical idea that they are trying to sell rather than shut down. All the things they shut down or got out of could at least be made.

      [Edit to correct my response. I have been informed that Intel has NOT sold shares. It was reported that Intel and Vuzix ended a partnership, but it did not say that any shares were sold. Source: ]

      • Thanks, and I apologize. I have corrected my response. There was some honest confusion over the report that Intel had ended a partnership with Vuzix, but this did not involve selling shares.

    • Thanks, I saw the article. I’m planning on a follow up later this week (hopefully by tomorrow, I’m busy today). Very briefly:

      They have about a 400×150 (no specs given; that seems to be the author’s impression), single-color (red) image with a tiny eye box and small FOV, which they turned into a “feature” of making the image disappear. They don’t let the author take the glasses outdoors, where he could see any issues of diffraction of the real-world view looking through the diffractive hologram.

      As the patents in the article show, expanding the eye box is either going to take a massive increase in complexity or the user wearing a special contact.

      Supporting color makes the whole thing infinitely more complex and messes more with the real world view as they can’t tune the hologram to a single wavelength.

      For reference, an iWatch has 272 by 340 pixels (38mm) or 312 by 390 pixels (42mm) with full color. So basically you are getting a watch-size, single-color (red) image that you have to look in a very specific place to see.

      I would assume that the Lemoptix engine is part of their design, but I don’t know.

      • Intel announced they stopped collaborating with Vuzix; however, Intel filed a 13G last week confirming they still own every share. Intel has wasted every penny they have invested in Recon and others. They could have (and should have) bought out Vuzix when they had the chance.

    • I apologize; I think there was confusion when the story that Intel ended their partnership with Vuzix got mistranslated into Intel selling their shares. Below is a link to the story.

      By REUTERS November 16, 2016
      Wearable display maker Vuzix said on Wednesday that Intel had decided to halt its collaboration with the company related to the development of Internet-connected headsets.

      Vuzix’s shares (VUZI, -5.92%) slumped as much as 37.9% to $5 in after-hours trading on Wednesday.

      Intel said in a letter last week that Vuzix’s technology did not fit its “strategic plans” and that it was considering alternatives for its investment in the company, according to a regulatory filing from Vuzix.

      Intel (INTC, -3.68%) declined to comment.

      Rochester, New York-based Vuzix develops computerized, Internet-connected glasses and other video eyewear aimed at consumers, businesses and entertainment.

      Intel, the world’s largest chipmaker, had invested $24.8 million in the company in January last year, buying nearly 5 million of Vuzix’s shares, as it looked to venture into new markets such as smartwatches and other Internet-connected wearable devices.

      Vuzix said that the Intel deal had not contributed significant revenue to the company over the last two years.

      • Thanks Karl; perhaps I stand corrected.
        BTW, what in the heck is happening with EMAN???

  1. Karl,
    What is your take on the new Vuzix Blade smart glasses? Were you at CES and did you get a chance to demo them or meet with Paul Travers?

    • Strictly image-quality-wise, the Vuzix Blade/3000 glasses have diffractive waveguide issues similar to Hololens. But then Vuzix is not selling them to watch movies with great image fidelity; Vuzix is trying to serve their market. They rotated the image to portrait mode (long side vertical) to make the location of the image less critical, basically trading resolution for ease of use. They also have relatively good transparency at over 80% and are fairly light. They are targeting hands-free “enterprise” applications that need a little bit of data snacking and maybe a little bit of video over an extended period of time, where the light weight and good transparency are important.
