[Update Feb 5, 2018] – I have been informed that EPFL released a short video in 2012 (link here).
The video shows a large test rig with a person holding a mock-up of the glasses talking about the wonderful things it will do someday. This video has about 5 seconds of a low-resolution red (only) image with a person in a dark room lit by dim green lighting. Based on the information in patent application US 2015/0362734, it was probably shot with a “contact lens” over the camera.
According to Bloomberg Technology, “Intel Is Said to Plan Sale of Majority Stake in AR Glasses Unit” (February 1, 2018). Apparently, Intel is trying to sell the unit off for $350 million. Intel put this unit together less than three years ago as EETimes reported in 2015, “Intel Buys Swiss Smartglasses Startups.” Brief excerpts from both these articles are shown with my markups (left).
Excuse me for putting in a plug for my services, what I am calling (tongue in cheek) Rent-A-Tech-Skeptic. Intel’s investments in VR and AR appear to be yet another example of the poor technical due diligence done when investing in AR (Magic Leap being the granddaddy of them all).
The whole concept, at a technical level, is fundamentally flawed. I have no beef with Intel; I would have been happy to have saved Intel hundreds of millions of dollars on their combined VR and AR investments if they had asked.
The Bloomberg article states (with my bold emphasis), “The spectacles will be able to display contextual information into the wearer’s field of view with a laser-based projector that reflects off the lens and onto the retina, the people said.” It also says, “Some former members of the Recon team are part of the division up for sale. It has about 200 employees in the U.S., Switzerland, and Israel.“
The EETimes 2015 article states, “Lemoptix had been working with Composyt Light Labs SA when Intel acquired the latter at the end of 2014. Composyt was founded in 2014 to commercialize a see-through display architecture invented at EPFL [Ecole Polytechnique Fédérale de Lausanne]. The display integrates with normal eyewear and a smartglasses prototype developed by Composyt and Lemoptix includes direct retina projection and a patented holographic combiner [sic].” I think EETimes may have been confused, because Composyt/EPFL was the developer of the holographic combiner. Lemoptix developed the laser scanning mirror for their laser beam scanning (LBS) engine.
A clue that this is nowhere near as easy as it may sound is that they publicly discussed it in 2014, yet there have been no public demonstrations in the 3+ years since. From what I can tell by looking through their patent applications, the whole concept is utterly impractical.
The reported sale of the AR unit comes not long after Intel disclosed in Sept. 2017 that “Intel’s Project Alloy VR headset is dead.” The closure came just about one year after “Intel shows off all-in-one Project Alloy virtual reality headset” (Aug 16, 2016). Intel had by July 2017 shut down Recon Instruments, which it reportedly acquired for $175M just two years earlier. Not to put too fine a point on it, but Intel does not appear to have an excellent track record in judging the potential of near-eye display developments.
The basic concept may seem simple, but in reality, it combines two bad ideas. The figure on the left, taken from EPFL patent application 2015/0362734 (‘734), shows the concept. A laser beam scanning projector (scanning mirror 107 shown) in the temple of the glasses projects light toward a reflective holographic film 111 that acts as a combiner, redirecting the projected light toward the eye while allowing real-world light through.
The obvious question is why they are using a holographic film to redirect the light rather than a simple mirror like many other near-eye optical designs. With a projector on the temple, and to avoid being blocked by the user’s face, the projected light is going to hit the lens at a shallow angle. If they used a semi-mirror combiner, the light would bounce at an angle that would miss the eye, as shown by the dashed red line labeled “Mirror Reflection.” Therefore, they must use this special holographic film to redirect the light at a steeper angle.
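To see why a plain mirror can’t work here, a rough sketch of the geometry helps. The 70-degree angle below is my own illustrative assumption, not a number from the EPFL patents; the point is only that the law of reflection sends a grazing beam back out at the same grazing angle, while a holographic combiner can be recorded to diffract it to whatever steeper angle the designer chooses:

```python
import numpy as np

def reflect(d, n):
    """Law-of-reflection: direction of ray d after bouncing off a surface with unit normal n."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    return d - 2 * np.dot(d, n) * n

# Illustrative 2-D geometry: the lens lies along the y-axis and its
# normal points back toward the eye (+x direction).
normal = np.array([1.0, 0.0])

# Light from a temple-mounted projector grazes the lens at a shallow
# angle, e.g. 70 degrees off the normal (an assumed, not actual, value).
incidence_deg = 70.0
theta = np.radians(incidence_deg)
incoming = np.array([-np.cos(theta), -np.sin(theta)])  # heading toward the lens

r = reflect(incoming, normal)
reflect_deg = np.degrees(np.arccos(np.dot(r, normal)))
print(f"mirror reflection leaves at {reflect_deg:.0f} deg from the normal")
# A plain mirror sends the beam back out at the same 70 deg on the far
# side of the normal, skimming forward past the eye.  A holographic
# combiner can instead redirect the same beam near 0 deg, into the pupil.
```

Running it confirms the reflected beam leaves at the same 70 degrees it arrived at, which is why the dashed “Mirror Reflection” line in the patent figure misses the eye.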
At least, this is how it is supposed to work. In 2014, Lemoptix showed this concept (left), made in cooperation with Composyt/EPFL. Apparently, it is a mock-up, as there are no reports or videos of anyone using it.
This blog has been debunking the myths about laser beam scanning (LBS) displays for over six years. For those that have not seen my prior articles, I would suggest reading my “Cynic’s Guide to CES — Measuring Resolution,” where I explain Microvision’s LBS scanning process and why the resolution is so poor. Other good back articles from this blog include Celluon Laser Beam Scanning Projector Technical Analysis – Part 1, Celluon Laser Beam Steering Analysis Part 2 – “Never In-Focus Technology,” and Celluon/Sony/Microvision Optical Path.
Still, the myths and lies about LBS persist. LBS has proven to be a useful technology for deceiving people because it looks so simple until you understand what is required to make it work. For those that want it short and sweet: LBS delivers poor effective resolution, suffers from laser speckle, and is a very high-cost way to get poor image quality (all covered in the articles above).
In searching the Lemoptix/LBS patents related to the Intel acquisition, I didn’t find anything particularly unusual compared to what Microvision has been doing. They have several different schemes for reducing speckle, but I doubt any of them work very well (and thus they keep filing new ideas).
In short, LBS does not solve near-eye displays; it just moves some of the problems around. It is a very high-cost way to achieve poor image quality. The one exception I might make is that for a select group of people with specific vision problems, it might be useful.
As discussed previously, Composyt/EPFL developed a holographic film for bending the laser light toward the eye. Sounds simple, doesn’t it? Just sprinkle some pixie dust and use the word “Hologram” and everything is solved, right? Wrong.
Just for a moment, I’m going to ignore what appears to be a major problem for this concept, namely the exit pupil, and just discuss the issues with a holographic combiner. The goal of a combiner is to combine light from the real-world with the display while minimally impacting the image quality of either source of light.
There is always a tradeoff with a combiner. A simple partial mirror combiner, for example, trades brightness of the real world for the brightness of the image.
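That tradeoff is easy to put in numbers. The sketch below models an idealized lossless partial mirror; the brightness values and reflectivity splits are my own illustrative assumptions, not measurements of any real combiner:

```python
def partial_mirror(display_nits, world_nits, reflectivity):
    """Idealized lossless partial mirror: reflects fraction R of the display
    light into the eye and transmits the remaining (1 - R) of real-world light."""
    assert 0.0 <= reflectivity <= 1.0
    seen_display = display_nits * reflectivity
    seen_world = world_nits * (1.0 - reflectivity)
    return seen_display, seen_world

# Raising R brightens the virtual image but dims the real world, and
# vice versa -- there is no free lunch with a simple combiner.
for r in (0.3, 0.5, 0.7):
    d, w = partial_mirror(display_nits=1000, world_nits=5000, reflectivity=r)
    print(f"R={r:.1f}: display {d:4.0f} nits seen, world {w:4.0f} nits seen")
```

A holographic combiner is wavelength- and angle-selective rather than a flat split, but it faces the same conservation constraint: light redirected to the eye or diffused is light taken from, or imposed on, the see-through view.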
In this case, the hologram is doing a lot of “work” to bend the light and, in some of the examples, to diffuse it. There are bound to be negative effects on light coming at the holographic combiner from the real world, particularly light arriving at angles similar to that of the laser projector. One would also expect some scattering (blurring) and at least some color separation.
Simply put, the “pupil” is the area over which the light can be seen. When you look into a projector without a screen, you see either extremely bright light, if you are looking into the lens, or nothing. A projection screen diffuses the light and expands the pupil so you can see the image from different angles; the pupil becomes very large. With spatial light modulators, the light going into a mirror can be slightly diffused before it is focused and reflected, creating a larger pupil. With diffractive waveguides, the pupil ends up being the size of the exit grating. You need a larger pupil so the image can be seen over a wider range of eye positions. A larger pupil makes fit less critical and is necessary for a wider field of view (FOV).
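A toy calculation shows how unforgiving a tiny exit pupil is. All the numbers below (eye rotation radius, pupil and beam diameters) are my own illustrative assumptions in a simplified one-dimensional model, not figures from the patents:

```python
import math

def beam_enters_pupil(beam_center_mm, beam_diameter_mm,
                      pupil_center_mm, pupil_diameter_mm):
    """True if the narrow beam spot lands fully inside the eye's pupil.
    Simplified 1-D model along the direction of eye rotation."""
    half_gap = (pupil_diameter_mm - beam_diameter_mm) / 2.0
    return abs(beam_center_mm - pupil_center_mm) <= half_gap

EYE_RADIUS_MM = 12.0   # approx. distance from the eye's rotation center to the pupil
PUPIL_MM = 4.0         # typical indoor eye-pupil diameter
BEAM_MM = 1.5          # assumed narrow laser exit pupil

for gaze_deg in (0, 5, 10, 15, 20):
    # How far the eye's pupil slides sideways as the gaze rotates away.
    shift = EYE_RADIUS_MM * math.sin(math.radians(gaze_deg))
    ok = beam_enters_pupil(0.0, BEAM_MM, shift, PUPIL_MM)
    print(f"gaze {gaze_deg:2d} deg: pupil shift {shift:4.1f} mm -> "
          f"{'image visible' if ok else 'image lost'}")
```

With these assumed numbers, the image disappears after only a few degrees of eye rotation, which is exactly the problem illustrated in figures 3A and 3B below.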
In this case, you can see how serious a problem pupil expansion is for the holographic combiner by the radical “solutions” proposed. Figures 3A and 3B below illustrate the exit pupil problem. If you follow the light ray lines with the filled-in/dark arrows related to laser light 206 in Figure 3A, you will see it enter the pupil of the eye, but if the eye moves, as in 3B, the light misses the eye. To “solve” this problem, they project a second image, indicated by path 301 with hollow arrows, that misses the eye’s pupil in figure 3A and enters the pupil in figure 3B. The proposed solution is to track the eye and decide which image to project. But to do this, they need three more lasers (one each for red, green, and blue). As illustrated in figure 6 (above right), they propose to keep adding lasers to continue expanding the pupil.
But it gets worse. Not only are lasers being replicated, but it is also optically complicated to get the light from the many lasers coaxial so that they all hit the beam-scanning mirror at the proper angle. They show a series of ways to try to make many lasers coaxial, and I have copied examples 9A, 9B, and 9D on the right. It is hard to imagine how this would ever work in practice, let alone the cost. This is a desperate attempt at a solution.
In patent application 2015/0362734, they propose a different and less user-friendly alternative to pupil expansion by making the user wear a complex contact lens (see left). In this case, the holographic film diffuses the light (similar to a projection screen), which expands the pupil.
The problem then becomes that it acts like a projection screen that is too near the eye and thus will be out of focus. To fix the focus problem, they propose that the user wear a contact lens with a small center focusing lens (503) that has a bandpass filter (505) to block real-world light from going through the middle. The outer part is a lens to aid vision correction of the real world, but with a notch filter (504) to block the projector light while allowing real-world light through.
Ignoring for a second the complexity and cost of the contact lens required, this “solution” has a lot of problems. The most obvious is that it requires that the person be willing and able to wear contacts. Secondly, the color filters can’t be perfect and are inevitably going to tint the real world. I also wonder how well the contact would work in bright sunlight, when the human pupil closes so that the central part of the contact covers a relatively large fraction of it, blurring and blocking the real world.
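The tinting effect of the outer notch filter is easy to estimate with a toy model. The laser wavelengths and the 20 nm notch width below are typical diode-laser assumptions of mine, not Intel/EPFL specifications, and a real filter curve would be smoother than this idealized on/off model:

```python
def notch_transmission(wavelength_nm, notches, width_nm=20.0):
    """Idealized notch filter: blocks light within width_nm/2 of any notch
    center and passes everything else (a crude model, not a real filter curve)."""
    for center in notches:
        if abs(wavelength_nm - center) <= width_nm / 2.0:
            return 0.0
    return 1.0

# Assumed RGB laser lines (typical diode wavelengths in nm).
LASER_LINES = (450.0, 520.0, 638.0)

# Fraction of a flat 400-700 nm spectrum the outer lens would pass:
samples = range(400, 701)
passed = sum(notch_transmission(w, LASER_LINES) for w in samples) / len(samples)
print(f"~{passed:.0%} of broadband real-world light transmitted")
# The blocked bands sit exactly at red, green, and blue wavelengths, so the
# lost light is not spectrally neutral -- that is what tints the view.
```

Even in this idealized model, the real world loses light precisely at three visible wavelengths, so the view is both dimmed and color-shifted; a practical filter with sloped edges would be worse.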
The contact lens idea is not new to EPFL/Intel. Innovega has been trying to do something similar with contact lenses using conventional displays for at least as far back as 2013. Only the Intel/EPFL “solution” requires an even more complex lens with light filtering built-in.
The two proposed ways to address pupil expansion demonstrate why you should never buy off on a simple mock-up or figure. What starts out looking simple in a drawing can become very difficult by the time all the factors are taken into consideration.
Have you ever seen a TV show where someone tries to sell off a customization project (say, a car or a house) on which they have spent crazy amounts of money to build something garish that only they would want? They are inevitably told, “It is not how much you have spent; it is how much it is worth.” This is a classic example of something that looks simple but is insanely impractical once you understand what is required to make it work.
BTW, I’m extremely pro-startup (I love startups), and I know how hard it is for startups to get funding. I understand big companies should take risks, but I like to see them invest in businesses and technologies that have real potential. Before you make a big investment in a new AR or VR company, you might want to Rent-A-Tech-Skeptic to check it out first.