As I discussed in my last article on Digilens, I was out in San Jose on a quick trip. The purpose of that trip was to see EyeWay Vision (hereafter EyeWay). EyeWay has an ambitious program to try to solve many of the important problems in AR. They are developing a dual laser beam scanning (LBS) based, direct retinal projection, foveated display. Based on papers and patents, no less than Apple, Facebook, and Microsoft have all put some serious effort into LBS foveated displays.
I knew in advance that I would be looking through a rather big prototype with limited functionality, as several key aspects were still under development. Most importantly, the current prototype is "fixed foveated," and eye tracking is in the process of being integrated. EyeWay had a separate demonstration of their eye tracking and plans to combine the retina tracking with foveated image movement later this year.
Shortly before I left on the trip, I asked EyeWay if I could take pictures through the optics, and they said I could, which impressed me. It turned out that a mechanical restriction, which would have required removing a part from the prototype, kept me from getting the best pictures possible. EyeWay offered to retake the pictures with their prototype, then in Israel, and send me unretouched pictures. I can confirm that these pictures are consistent with what I saw and with what I was able to capture with my own camera.
The purpose of the prototype is to demonstrate the potential of the technical approach. This is a bit of a peek behind the curtain of an R&D project. As with all my articles, most of the images can be clicked on to see a bigger, higher resolution version.
I want to show what EyeWay is doing in context and explore some of the key technical issues. Because EyeWay is attempting to address so many different AR issues all at once, it will require a multi-part series.
This first part will cover some of the basic concepts of foveated displays and how EyeWay’s approach differs from other developments. The next article will show more “through the optics” pictures. I plan on discussing the combiner optics and the issues with reducing the size of the system and wrap up the series with some conclusions and analysis (in one or two parts).
First, I want to point out that a foveated display, by definition, tricks the eye into seeing a high-resolution image by moving a "foveal display" to wherever the eye's fovea is aimed at any instant in time. When you take a picture of a foveated display, the camera captures the individual high-resolution foveal image, a transition/blending region, and the lower-resolution peripheral image.
I also want to say that I have seen many different LBS-generated images with my own eye, both with front projectors and near-eye headsets. The image quality of the foveal image is impressively good for an LBS-generated image and better than needed for most AR applications.
Below is a through-the-optics picture from EyeWay's prototype and a picture I took of the same image at the same scale on a Hololens 2 (LBS projector into a diffractive waveguide). In both cases, the left eye was centered in the frame.
Not only is EyeWay’s foveal part of the image much better in every way than the Hololens 2, but even EyeWay’s peripheral image also looks better. Note how you can see scan lines in the Hololens 2 image, whereas they are invisible in both regions of EyeWay’s prototype.
I will be discussing EyeWay’s image quality in future articles, but I wanted to give you a quick taste.
Before diving into many technical points, I also wanted to comment on EyeWay’s captivating dragon demo. The real-time dragon animation can’t be captured well with a still camera or video camera due to the scanning process of the lasers. Even though the foveated display is currently fixed, you could move the display around to see detail in any area that you chose.
Unfortunately, it is one of those demos you have to see with your own eyes. For the rest of this article, I will be showing photos through the optics of still frames.
I also want to point out that some of EyeWay's demos show images while looking out windows to prove that the display can be very bright when needed.
EyeWay is trying to address many different but interrelated issues in AR. In some ways, they are in a quest for the Holy Grail (as EyeWay has put it) to solve many/most of the major known problems with AR all at once.
EyeWay is currently IMO a long way from reaching their goal, but they are working on some important problems with interesting technical approaches. Their display approach, using laser scanning, forces them to solve perhaps the hardest problem, that of precise eye tracking, to have a usable display.
This blog has written about foveated displays many times (see the Appendix for links to other articles). Briefly, human vision does not work like a camera taking a single snapshot with an array of pixels all of the same size/resolution.
There is only a very small region of the eye with a high density of cone cells that support high resolution and distinguish color. The charts below show the center regions of the retina (left), the physical distribution of cones in the eye (center), and the relative visual acuity (right) in degrees from the center of the fovea. The fovea, with the highest visual acuity, covers only about 3 degrees of the eye's FOV. As the acuity chart shows, human vision falls off sharply from the center of the fovea.
The eye is continuously moving with a rapid series of jump-like motions known as saccades. The human vision system blanks out during saccades, a process known as saccadic masking (a 3-minute video on saccadic masking here). The human visual system then builds up what one sees by stitching together a series of "snapshots" at various resolutions from each fixation between saccades. It is an amazingly complex process that is imperceptible most of the time.
A true foveated display has a high angular resolution “foveal projector” that moves/tracks the eye’s saccades. The foveal projector always presents a high-resolution image centered on the fovea but with a small FOV. There is then at least one “peripheral projector” with a much wider FOV but a lower angular resolution to fill the periphery. If done correctly, the human eye should perceive a single, very high-resolution display.
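The motivation for this architecture is mostly arithmetic: matching the fovea's acuity everywhere in a wide FOV would take an enormous pixel budget, while a small high-resolution region plus a coarse periphery needs far fewer. A minimal sketch of that trade-off, using illustrative round numbers of my own (not EyeWay's specifications):

```python
# Rough pixel-budget comparison: uniform "retinal" resolution everywhere
# vs. a foveated display. All FOV and pixels-per-degree numbers below are
# illustrative assumptions, not measured values from any product.

def pixel_count(h_fov_deg, v_fov_deg, ppd):
    """Pixels needed to cover a rectangular FOV at a given pixels/degree."""
    return round(h_fov_deg * ppd) * round(v_fov_deg * ppd)

# Uniform display: 50 x 30 degrees, all at ~60 pixels/degree (1 arcmin/pixel)
uniform = pixel_count(50, 30, 60)            # 3000 x 1800 = 5.4M pixels

# Foveated: a 10 x 10 degree foveal region at 60 ppd, plus the full
# 50 x 30 degree periphery at 18 ppd (slightly double-covering the center)
foveated = pixel_count(10, 10, 60) + pixel_count(50, 30, 18)

print(uniform, foveated)                     # 5400000 846000
print(round(uniform / foveated, 1))          # ~6.4x fewer pixels needed
```

With these assumed numbers, the foveated arrangement needs roughly one-sixth the pixels, which is why so many companies find the approach attractive despite the eye-tracking burden it creates.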
EyeWay has dual LBS projectors, each with nominally the same SVGA (800×600 pixel) resolution. But while they both project onto the retina, the foveal projector is not only covering a smaller area with higher resolution, it supports variable focusing.
Classical laser beam scanning projectors are "focus-free" and don't require optics to see an image. Projectors that don't need optics for focusing are known as "Maxwellian Displays." For an excellent discussion of Maxwellian Displays, I would recommend downloading Accommodation-Free Head Mounted Display with Comfortable 3D Perception and an Enlarged Eye-box. While Maxwellian displays are usually associated with laser beam scanning, as the paper points out, other forms of displays can also be Maxwellian (see right).
I should note that while Hololens 2 uses an LBS projector, by the time it gets through the pupil expansion of the waveguide, it is no longer Maxwellian (including focus-free).
A Maxwellian display is focus-free, and its highly collimated light casts very sharp shadows if anything gets in the way of the beam.
Almost all people by age 30 have "floaters" (non-harmful material suspended in the eye's vitreous humor). They mostly go unnoticed unless someone shines a light in the eye. Unfortunately, these floaters become very obvious when a Maxwellian projector projects into the eye. I have seen this problem with every other direct retinal laser scanning display I have tried, including QD Laser's RETISSA, North's Focals, and Bosch's Laser AR glasses at CES 2020 (below).
While EyeWay’s peripheral display is Maxwellian, and I could see my eye’s floaters when I looked away from the center, the foveal display is not Maxwellian. I could not see any floaters in the foveal image.
The lack of floaters in the foveal image proves that EyeWay is doing something very different than other laser scanning projectors.
Additionally, EyeWay can control the apparent focal point of the foveal projector to support vergence/accommodation conflict mitigation.
A traditional problem with Maxwellian near-eye displays, and retinal laser scanning specifically, is that the projected image either goes into the pupil or misses it altogether. All other display types avoid this problem by generating a large eye box so that the image is visible wherever the eye moves relative to the glasses.
But generating a large eye box means that the vast majority of light is wasted as the only light that counts is that which goes into the pupil. In terms of social interaction, it is more than a little strange to look at a person with their eye area lit up. It is even more important for military uses where lighting up the eye at night would show the user’s location.
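The scale of that waste is easy to estimate: only the light landing on the pupil's area counts, and the pupil is a small fraction of a typical eye box. A back-of-the-envelope sketch, with sizes that are my own illustrative assumptions (a ~4 mm pupil, a 10 x 8 mm eye box):

```python
import math

# Back-of-the-envelope light efficiency of a large eye box.
# A ~4 mm pupil diameter and a 10 x 8 mm eye box are assumed, illustrative
# values; real pupils and eye boxes vary with lighting and design.

pupil_diameter_mm = 4.0
eyebox_w_mm, eyebox_h_mm = 10.0, 8.0

pupil_area = math.pi * (pupil_diameter_mm / 2) ** 2   # ~12.6 mm^2
eyebox_area = eyebox_w_mm * eyebox_h_mm               # 80 mm^2

captured = pupil_area / eyebox_area
print(f"~{captured:.0%} of the light enters the pupil; "
      f"~{1 - captured:.0%} lights up the eye area instead")
```

Under these assumptions, well over 80% of the light never reaches the retina, which is both an efficiency problem and the source of the lit-up-eye social and military issues above. Steering a pupil-sized beam, as EyeWay proposes, avoids this waste entirely, at the cost of requiring precise eye tracking.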
North Focals, which used direct retinal laser scanning, had this problem, which required each set of glasses to be custom fitted, and even then it was tricky to see an image. North included a 4-way pupil replicator, but this meant that while there were four positions where you could see the image, you might also see partial and double images.
The Accommodation-Free Head Mounted Display with Comfortable 3D Perception and an Enlarged Eye-box paper used 9-way replicating optics, compared to North Focals' four, via a bulky set of beam splitters. The pictures below are from a series of still frames in the related video (at the same link above) that show the effect of the alignment of the replicated images with the pupil.
EyeWay’s approach eliminates the need to generate an Eye Box bigger than the pupil by tracking the eye as demonstrated in the short video below taken from their presentation
While Microsoft’s Hololens 2 uses laser beam scanning, it is not Maxwellian after the waveguide. Their diffraction grating waveguide massively replicates the input pupil and creates a large eye box similar to the picture of the WaveOptics waveguide above.
The picture below is a crop from the center portion of EyeWay’s display showing the whole foveated region and the peripheral region. In this particular picture, the blanking roll bars (caught by the camera and not seen by a human) from the scanning process happen to be caught with both scan processes in the vertical part of the foveated region. The blanking lets you see the contribution from each display to the combined image.
The foveal image, including the transition region where both the foveal and peripheral images are evident, covers about 12° by 6.6° at about 60 pixels per degree (1 arcminute per pixel). The peripheral display then covers about 44° horizontally and 25° vertically (about 50° diagonally) at about 18 pixels per degree (~3.3 arcminutes/pixel).
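These angular resolutions follow directly from spreading the two SVGA scanners' 800 horizontal pixels over very different FOVs. A quick sanity check of the horizontal axis (the vertical figures are harder to check because of overscan and the transition blending):

```python
# Sanity-checking the quoted angular resolutions against the 800 x 600
# (SVGA) scanners. The FOV figures come from the article; only the
# horizontal axis is checked, since overscan and blending affect the rest.

H_PIXELS = 800

def pixels_per_degree(pixels, fov_deg):
    return pixels / fov_deg

foveal_ppd = pixels_per_degree(H_PIXELS, 12)      # ~66.7, quoted as "about 60"
peripheral_ppd = pixels_per_degree(H_PIXELS, 44)  # ~18.2, quoted as "about 18"

# Note: 60 pixels/degree corresponds to 1 arcminute per pixel
print(round(foveal_ppd, 1), round(peripheral_ppd, 1))
```

The peripheral figure matches the quoted ~18 ppd almost exactly; the foveal figure comes out slightly above the quoted ~60 ppd, plausibly because some of the 800 pixels are spent on the transition/blending region rather than the stated 12° core.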
EyeWay’s foveal image is about 1/2 the horizontal and vertical angular size of Varjo’s AWE 2019 fixed foveated passthrough AR (VR with camera passthrough) display. The foveal part of the Varjo display was about 21° (H) by 13° (V) including the transition blending area. But then the Vajro passthrough AR is “fixed foveated.”
Varjo talked about an eye-tracking foveated display in 2017 (left is my drawing of what Varjo described) but ended up going to market with a "fixed foveated" (non-moving high-resolution center region) display with passthrough AR, as shown in 2018. EyeWay's approach with optical AR has a smaller foveated display and depends on the foveated display tracking the retina to keep the user from noticing the foveal-to-peripheral boundary.
EyeWay requires eye tracking not only to make the foveated display work but also for the image to make it into the eye's pupil at all. EyeWay has no option but to solve precise eye tracking to have a usable display. Unlike Varjo, EyeWay can't fall back on a fixed foveated display.
The concept of using eye tracking in combination with a laser scanning retinal projector goes back at least as far as 1995, as shown in an (expired) patent by Tom Furness.
EyeWay demonstrated their retina tracking to me, but it was shown on laboratory instruments and not yet controlling the foveal projected image.
Most other eye-tracking efforts only consider the cornea and the iris. Facebook‘s Michael Abrash, Chief Scientist at Oculus, has an excellent 3-minute discussion (taken from a much longer 2016 Oculus Connect 3 presentation) of the issues with foveated display eye-tracking, the need for precision, and the problems with the cornea and iris tracking.
Facebook Reality Labs regularly discusses eye tracking and foveated displays. Doug Lanman, Director of Display Systems Research at Facebook Reality Labs, discusses eye tracking with laser scanning foveated displays, and issues related to pupil steering, in a video from the SPIE AR/VR/MR 2020 conference and in the 2019 paper he references in the video. A key point here is that foveated laser-scanning-based displays are getting a lot of attention at Facebook AR research. I should note that EyeWay has been working on this technology since 2014, when the company was founded.
EyeWay points out that they use two foveal tracking mirrors rather than the single one suggested by Varjo's comments back in 2017 and shown by Avegant (video here). EyeWay says that having two steering mirrors supports 4 degrees of freedom and will better track the eye.
EyeWay’s solution takes eye-tracking to the next level by tracking the retina to get the precision they feel is necessary for a foveated display. EyeWay’s eye-tracking uses multiple stages, with the most precise stage tracking the retina via the same path that moves the foveal image.
The short video on the left shows the eye-tracking path with IR illumination (top) path (purple), sharing the display path (red cones) and retina image return path (green) to a camera (blue).
Independent of the foveated laser scanning display, some of EyeWay's eye-tracking developments could be valuable in and of themselves.
Facebook Reality Labs regularly discusses, in papers and videos, issues related to pupil steering and projecting into the retina with a tiny eye box using eye tracking (example below).
The basic concept of a foveated display is fairly old, and many companies have worked on the concept at one time or another, for both VR and AR. Below is a collection of foveated display concepts from Microsoft, Avegant, Apple, and Varjo, to show just a few. As discussed above, Facebook has also been seriously working in the areas of foveated displays, eye tracking, and retinal laser scanning for many years.
I don’t have time to go into the details of each approach here, but I want to point out that foveated displays with and without laser scanning being considered by all the billion-dollar giants working in AR and a range of startups.
Interestingly, both Microsoft and Apple show laser scanning approaches above that would have to reflect off a semi-reflective or holographic mirror to work. Avegant is the only company showing a waveguide-based foveated display. Varjo has been concentrating on passthrough AR (VR with cameras to show the real world), which is optically vastly simpler.
While Varjo talked about tracking the eye back in 2017, I have yet to see one of their displays where the foveated region moves. The devices they ship and any demos I have seen all use a "fixed foveated display" where the high-resolution center does not move. The fixed display works most of the time because the eye usually stays aimed within about 30 degrees of center.
The next article in this series on EyeWay is going to show more pictures through the optics of the EyeWay prototype. After that article, I will go back to analyzing the pros, cons, and challenges of EyeWay's approach to AR.