Exclusive: EyeWay Vision Part 1: Foveated Laser Scanning Display

Introduction

As I discussed in my last article on Digilens, I was out in San Jose on a quick trip. The purpose of that trip was to see EyeWay Vision (hereafter EyeWay). EyeWay has an ambitious program to try to solve many of the important problems in AR. They are developing a dual laser beam scanning (LBS) based, direct retinal projection, foveated display. Based on papers and patents, no less than Apple, Facebook, and Microsoft have all put serious effort into LBS foveated displays.

EyeWay Prototype System

I knew in advance that I would be looking through a rather big prototype with limited functionality, as several key aspects were still under development. Most importantly, the current prototype is “fixed foveated,” and eye tracking is in the process of being integrated. EyeWay had a separate demonstration of their eye-tracking and plans to combine the retina-tracking with foveated image movement later this year.

Shortly before I left on the trip, I asked EyeWay if I could take pictures through the optics, and they said I could, which impressed me. It turned out that a mechanical restriction, which would have required removing a part from the prototype, kept me from getting the best pictures possible. EyeWay offered to retake the pictures with their prototype, by then back in Israel, and send me unretouched pictures. I can confirm that these pictures are consistent with what I saw and with what I was able to take with my own camera.

The purpose of the prototype is to demonstrate the potential of the technical approach. This is a bit of a peek behind the curtain of an R&D project. As with all my articles, most of the images can be clicked on to see a bigger, higher resolution version.

Series of Articles

I want to show what EyeWay is doing in context and explore some of the key technical issues. Because EyeWay is attempting to address so many different AR issues all at once, it will require a multi-part series.

This first part will cover some of the basic concepts of foveated displays and how EyeWay’s approach differs from other developments. The next article will show more “through the optics” pictures. I plan on discussing the combiner optics and the issues with reducing the size of the system and wrap up the series with some conclusions and analysis (in one or two parts).

Quick Comparison of EyeWay to Hololens 2

First, I want to point out that a foveated display, by definition, tricks the eye into seeing a high-resolution image by moving a “foveal display” to wherever the eye’s fovea is aimed at any instant in time. When you take a picture of a foveated display, the camera captures the individual high-resolution foveal image, a transition/blending region, and the lower resolution peripheral image.

I also want to say that I have seen many different LBS-generated images with my own eye, both with front projectors and near-eye headsets. The image quality of the foveal image is impressively good for an LBS-generated image and better than needed for most AR applications.

Below is a through-the-optics picture from EyeWay’s prototype and a picture I took of the same image at the same scale on a Hololens 2 (LBS projector into a diffractive waveguide). In both cases, the left eye was centered in the frame.

Not only is the foveal part of EyeWay’s image much better in every way than the Hololens 2’s, but even EyeWay’s peripheral image looks better. Note how you can see scan lines in the Hololens 2 image, whereas they are invisible in both regions of EyeWay’s prototype.

I will be discussing EyeWay’s image quality in future articles, but I wanted to give you a quick taste.

Intriguing Dragon Demo

Before diving into many technical points, I also wanted to comment on EyeWay’s captivating dragon demo. The real-time dragon animation can’t be captured well with a still camera or video camera due to the scanning process of the lasers. Even though the foveated display is currently fixed, you could move the display around to see detail in any area that you chose.

Unfortunately, it is one of those demos you have to see with your own eyes. For the rest of this article, I will be showing photos through the optics of still frames.

I also want to point out that some of EyeWay’s demos showed images while looking out windows to prove that the display can be very bright when needed.

Quest for the “Holy Grail”

EyeWay is trying to address many different but interrelated issues in AR. In some ways, they are on a quest for the Holy Grail (as EyeWay has put it): solving many or most of the major known problems with AR all at once:

  • Both high resolution and wide field of view (FOV)
  • Very bright (more than 10,000 nits) to support outdoor use, down to very dim (~1 nit) to support nighttime use
  • Projects only into the retina and not the surrounding eye (both an efficiency and a social issue)
  • Highly accurate tracking of the retina and not just the cornea
  • Highly efficient while supporting large eye movement
  • Low power while supporting high brightness
  • Address vergence/accommodation conflict (VAC)
  • Small and light while supporting all of the above

EyeWay is currently IMO a long way from reaching their goal, but they are working on some important problems with interesting technical approaches. Their display approach, using laser scanning, forces them to solve perhaps the hardest problem, that of precise eye tracking, to have a usable display.

Foveated Display Background

This blog has written about foveated displays many times (see the Appendix for links to other articles). Briefly, human vision does not work like a camera taking a single snapshot with an array of pixels all the same size/resolution.

There is only a very small region of the eye with a high density of cone cells that support high resolution and distinguish color. The charts below show the center regions of the retina (left), the physical distribution of cones in the eye (center), and the relative visual acuity (right) in degrees from the center of the fovea. The fovea, with the highest visual acuity, covers only about 3 degrees of the eye’s FOV. As the acuity chart shows, human vision falls off sharply from the center of the fovea.
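
For a feel of how steep this falloff is, a common first-order approximation from the vision and foveated-rendering literature (my illustration, not EyeWay’s data) models relative acuity as falling inversely with eccentricity:

```python
# Rough model of relative visual acuity vs. eccentricity (degrees from
# the center of the fovea). The 1/(1 + e/e2) form and e2 ~= 2.3 degrees
# are typical literature values, not EyeWay's numbers.
def relative_acuity(eccentricity_deg: float, e2: float = 2.3) -> float:
    return 1.0 / (1.0 + eccentricity_deg / e2)

for e in [0, 1.5, 3, 6, 12, 20]:
    print(f"{e:4.1f} deg -> {relative_acuity(e):4.0%} of peak acuity")
# At just 6 degrees out, acuity is already down to roughly a quarter of peak.
```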

The eye is continuously moving with a rapid series of jump-like motions known as saccades. The human vision system blanks out during saccades, a process known as saccadic masking (a 3-minute video on saccadic masking here). The human visual system then builds up what one sees by stitching together a series of “snapshots” taken at various resolutions between saccades. It is an amazingly complex process that is imperceptible most of the time.

A true foveated display has a high angular resolution “foveal projector” that moves/tracks the eye’s saccades. The foveal projector always presents a high-resolution image centered on the fovea but with a small FOV. There is then at least one “peripheral projector” with a much wider FOV but a lower angular resolution to fill the periphery. If done correctly, the human eye should perceive a single, very high-resolution display.
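
To make the blending idea concrete, below is a minimal sketch of cross-fading the foveal and peripheral images over a transition band around the tracked gaze direction. This is purely my illustration (EyeWay has not published their compositing method), and the radii are hypothetical numbers:

```python
# Minimal sketch of foveal/peripheral blending (illustrative only; the
# foveal radius and transition width are hypothetical, not EyeWay's).
def foveal_weight(angle_from_gaze_deg: float,
                  foveal_radius_deg: float = 4.0,
                  transition_deg: float = 2.0) -> float:
    """Weight of the foveal image at a given angle from the gaze point:
    1.0 inside the foveal region, ramping linearly to 0.0 across the
    transition band, and 0.0 in the periphery."""
    t = (angle_from_gaze_deg - foveal_radius_deg) / transition_deg
    return min(max(1.0 - t, 0.0), 1.0)

def blend_pixel(foveal: float, peripheral: float,
                angle_from_gaze_deg: float) -> float:
    w = foveal_weight(angle_from_gaze_deg)
    return w * foveal + (1.0 - w) * peripheral
```

If the eye tracking is accurate, the blend ramp lands on a low-acuity part of the retina and the seam goes unnoticed; if not, the user sees the boundary or the image appears to wiggle.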

Dual Laser Beam Scanning (LBS) Direct Retinal Projection

EyeWay has dual LBS projectors, each with nominally the same SVGA (800×600 pixel) resolution. But while both project onto the retina, the foveal projector not only covers a smaller area with higher resolution, it also supports variable focusing.

EyeWay’s Foveal Projector is Not “Maxwellian” – Different than other LBS Projectors

Classical laser beam scanning projectors are “focus-free” and don’t require optics to see an image. Projectors that don’t need optics for focusing are known as “Maxwellian Displays.” For an excellent discussion of Maxwellian Displays, I would recommend downloading Accommodation-Free Head Mounted Display with Comfortable 3D Perception and an Enlarged Eye-box. While Maxwellian displays are usually associated with laser beam scanning, as the paper points out, other forms of displays can also be Maxwellian (see right).

I should note that while Hololens 2 uses an LBS projector, by the time it gets through the pupil expansion of the waveguide, it is no longer Maxwellian (including focus-free).

A Maxwellian display is focus-free, uses highly collimated light, and casts very sharp shadows if anything gets in the way of the beam.

Almost all people by age 30 have “floaters” (non-harmful material suspended in the eye’s vitreous humor). They mostly go unnoticed unless someone shines a light in the eye. Unfortunately, these floaters become very obvious when a Maxwellian projector projects into the eye. I have seen this problem with every other direct retinal laser scanning display I have tried, including QD Laser’s RETISSA, North Focals, and Bosch’s Laser AR glasses at CES 2020 (below).

While EyeWay’s peripheral display is Maxwellian, and I could see my eye’s floaters when I looked away from the center, the foveal display is not Maxwellian. I could not see any floaters in the foveal image.

The lack of floaters in the foveal image proves that EyeWay is doing something very different from other laser scanning projectors.

Additionally, EyeWay can control the apparent focal point of the foveal projector to support vergence/accommodation conflict mitigation.

Eye Box Smaller than One’s Pupil

A traditional problem with Maxwellian near-eye displays, and specifically retinal laser scanning, is that the projected image either goes into the pupil or misses it altogether. All other display types avoid this problem by generating a large eye box so that the image is visible wherever the eye moves relative to the glasses.

But generating a large eye box means that the vast majority of light is wasted, as the only light that counts is the light that goes into the pupil. In terms of social interaction, it is more than a little strange to look at a person with their eye area lit up. The problem is even more serious for military uses, where lighting up the eye at night would give away the user’s location.
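
A quick back-of-the-envelope calculation (my numbers, purely illustrative) shows how severe the waste is: only the light that actually lands on the pupil is seen, so the usable fraction is bounded by the ratio of pupil area to eye box area:

```python
import math

# Crude upper bound on the optical efficiency of a large eye box: pupil
# area divided by eye box area, assuming roughly uniform illumination
# across the eye box. The dimensions below are illustrative and not
# taken from any specific product.
def usable_light_fraction(pupil_diameter_mm: float,
                          eyebox_w_mm: float, eyebox_h_mm: float) -> float:
    pupil_area = math.pi * (pupil_diameter_mm / 2.0) ** 2
    return pupil_area / (eyebox_w_mm * eyebox_h_mm)

# A ~4 mm pupil in a 10 mm x 8 mm eye box: ~16% usable, ~84% wasted.
print(f"{usable_light_fraction(4.0, 10.0, 8.0):.0%} of the light can enter the eye")
```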

North Focals, which used direct retinal laser scanning, had this problem; each set of glasses had to be custom fitted, and even then, it was tricky to see an image. North included a 4-way pupil replicator, but while this gave four orientations where you could see the image, it also meant that you might see partial and double images.

The Accommodation-Free Head Mounted Display with Comfortable 3D Perception and an Enlarged Eye-box paper used 9-way replicating optics, compared to North Focals’ four, by using a bulky set of beam splitters. The pictures below are from a series of still frames in the related video (at the same link above) that show the effect of the alignment of the replicated images to the pupil.

EyeWay’s approach eliminates the need to generate an eye box bigger than the pupil by tracking the eye, as demonstrated in the short video below, taken from their presentation.

EyeWay Eye Tracking and Light Efficiency Over Waveguides (Click to Start Video)

While Microsoft’s Hololens 2 uses laser beam scanning, it is not Maxwellian after the waveguide. Their diffraction grating waveguide massively replicates the input pupil and creates a large eye box similar to the picture of the WaveOptics waveguide above.

EyeWay’s Dual Scan Foveated Display

The picture below is a crop from the center portion of EyeWay’s display showing the whole foveated region and the peripheral region. In this particular picture, the blanking roll bars from the scanning process (captured by the camera but not seen by a human) happen to fall in the vertical part of the foveated region for both scan processes. The blanking lets you see the contribution each display makes to the combined image.

The foveal image, including the transition region where both the foveal and peripheral images are evident, covers about 12° by 6.6° at about 60 pixels per degree (1 arcminute per pixel). The peripheral display then covers about 44° horizontally and 25° vertically (about 50° diagonally) at about 18 pixels per degree (~3.3 arcminutes/pixel).
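
These angular figures are consistent with the nominally SVGA scanners; here is a quick sanity check of the arithmetic (mine, from the numbers above):

```python
# Sanity check: FOV (degrees) times angular resolution (pixels/degree)
# versus the nominal 800x600 (SVGA) resolution of each scanner.
specs = {
    "foveal":     (12.0,  6.6, 60),  # ~1 arcminute per pixel
    "peripheral": (44.0, 25.0, 18),  # ~3.3 arcminutes per pixel
}
for name, (h_deg, v_deg, ppd) in specs.items():
    print(f"{name:10s}: {h_deg * ppd:.0f} x {v_deg * ppd:.0f} pixels")
# foveal    : 720 x 396 pixels
# peripheral: 792 x 450 pixels -- both fit within an 800x600 scan.
```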

EyeWay’s foveal image is about half the horizontal and vertical angular size of Varjo’s AWE 2019 fixed foveated passthrough AR (VR with camera passthrough) display. The foveal part of the Varjo display was about 21° (H) by 13° (V), including the transition blending area. But then, the Varjo passthrough AR is “fixed foveated.”

Varjo talked about an eye-tracking foveated display in 2017 (left is my drawing of what Varjo described) but ended up going to market with a “fixed foveated” (non-moving high-resolution center region) display with passthrough AR, as shown in 2018. EyeWay’s approach with optical AR has a smaller foveated display and depends on the foveated display tracking the retina to keep the user from noticing the foveal-to-peripheral boundary.

EyeWay requires eye tracking not only to make the foveated display work but also for the image to make it into the eye’s pupil at all. EyeWay has no option but to solve precise eye-tracking to have any working solution. Unlike Varjo, EyeWay can’t fall back on a fixed foveated display.

Retina rather than Pupil Tracking

The concept of using eye tracking in combination with a laser scanning retinal projector goes back at least as far as 1995, as shown in an (expired) patent by Tom Furness.

EyeWay demonstrated their retina tracking to me, but it was shown on laboratory instruments and not yet controlling the foveal projected image.

Most other eye-tracking efforts only consider the cornea and the iris. Facebook’s Michael Abrash, Chief Scientist at Oculus, has an excellent 3-minute discussion (taken from a much longer 2016 Oculus Connect 3 presentation) of the issues with foveated display eye-tracking, the need for precision, and the problems with cornea and iris tracking.

Facebook Reality Labs regularly discusses eye-tracking and foveated displays. Doug Lanman, Director of Display Systems Research at Facebook Reality Labs, discusses eye tracking with laser scanning foveated displays, and issues related to pupil steering, in a video from the SPIE AR/VR/MR 2020 conference and in the 2019 paper he references in the video. A key point here is that foveated laser scanning-based displays are getting a lot of attention at Facebook AR Research. I should note that EyeWay has been working on this technology since 2014, when the company was founded.

EyeWay points out that they use two foveal tracking mirrors rather than a single one, as suggested by Varjo’s comments back in 2017 and as shown by Avegant (video here). EyeWay says that having two steering mirrors supports 4 degrees of freedom and will better track the eye.

EyeWay’s solution takes eye-tracking to the next level by tracking the retina to get the precision they feel is necessary for a foveated display. EyeWay’s eye-tracking uses multiple stages, with the most precise stage tracking the retina via the same path that moves the foveal image.
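
EyeWay has not published their control details, but a coarse-to-fine servo loop of the following general shape is one way to picture a multi-stage tracker. Everything below, including the error magnitudes and gain, is my illustrative sketch, not EyeWay’s design:

```python
# Hypothetical coarse-to-fine gaze tracking loop (my sketch only).
# A coarse stage (e.g., conventional pupil/glint tracking) points the
# steering mirrors near the gaze direction; a fine stage, imaging the
# retina back through the same steering path, servos out the residual.
def track(true_gaze_deg: float, coarse_error_deg: float = 0.5,
          fine_gain: float = 0.8, steps: int = 5) -> float:
    mirror_deg = true_gaze_deg + coarse_error_deg   # coarse stage result
    for _ in range(steps):
        residual = true_gaze_deg - mirror_deg       # seen by retina camera
        mirror_deg += fine_gain * residual          # closed-loop correction
    return mirror_deg

print(f"final pointing error: {abs(10.0 - track(10.0)):.5f} deg")
```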

The short video on the left shows the eye-tracking path: the IR illumination path (purple, top) shares the display path (red cones), with the retina image return path (green) going to a camera (blue).

Independent of the foveated laser scanning display, some of EyeWay’s eye-tracking developments could be valuable in and of themselves.

Facebook Reality Labs regularly discusses, in papers and videos, issues related to pupil steering and projecting into the retina with a tiny eye box using eye-tracking (example below).

Many Companies Are Working on Foveated Displays

The basic concept of a foveated display is fairly old, and many companies have done work at one time or another on the concept, both for VR and AR. Below is a collection of foveated display concepts by Microsoft, Avegant, Apple, and Varjo, to show just a few. As discussed above, Facebook has also been seriously working in the area of foveated displays, eye tracking, and retinal laser scanning for many years.

I don’t have time to go into the details of each approach here, but I want to point out that foveated displays, with and without laser scanning, are being considered by all the billion-dollar giants working in AR and a range of startups.

Interestingly, both Microsoft and Apple show laser scanning approaches above that would have to reflect off a semi-reflective or holographic mirror to work. Avegant is the only company showing a waveguide-based foveated display. Varjo has been concentrating on passthrough AR (VR with cameras to show the real world), which is optically vastly simpler.

While Varjo talked about tracking the eye back in 2017, I have yet to see a display where the foveated region moves. The devices they ship and all the demos I have seen use a “fixed foveated display” where the high-resolution foveated center does not move. The fixed display works most of the time because the center of the eye usually stays aimed within about 30 degrees.

Next Time – More Through The Lens Pictures

The next article in this series on EyeWay is going to show more pictures through the lens of the EyeWay prototype. After that article, I will go back to analyzing the pros, cons, and challenges of EyeWay’s approach to AR.


17 Comments

  1. Hi Karl,

    Thanks for your sharing and comments as always.
    Do you have more ideas about how the “Foveated LBS display” can support variable focus if it is not “Maxwellian”? Is it just like generating an intermediate image and then projecting it into the eye, with variable focus provided by additional mechanical parts?

    Best,

    CC

    • It is a great question how they get the foveal projector to be non-Maxwellian, and EyeWay did not say. I did confirm with my own eyes that the foveal image behaves very differently than the peripheral image. The easy way would be to generate an intermediate image with, in effect, a “very high gain rear projection screen,” but this might cause speckle (which I didn’t see). EyeWay also had a demo of the varying focus of the foveated image. They also did not say how they vary the focus; there are several known devices they could have used, and the prototype was so large that it could be almost anything.

      • Hi Karl

        Dr. Nikhil Balram gave a short talk hosted by SID today, and it sounds like EyeWay changes the beam size going into the eye to achieve the so-called “variable focus” function. Dr. Nikhil Balram said he can’t disclose more about this… just for your reference.

        CC

      • Thanks for the information. There is a lot of evidence visually that EyeWay is doing something different to the foveal projected image.

      • Hi Karl, you are correct, it is a true non-Maxwellian system with continuous varifocal projection, resulting in the ability to present virtual objects at any perceived distance from 20cm to infinity.
        As for how EyeWay is doing it, I would prefer to keep it as proprietary information for now; let’s just say that any intermediate image manipulation would be a very wasteful and bulky approach. EyeWay’s varifocal mechanism is efficient and compact, even at the current stage of prototypes.

  2. Hi Karl, two questions:

    1) Was the foveal portion retina resolution? If so, that’s a measly 13×10 degree FOV. I’m not aware of any eye-tracking technology that is so accurate it can keep such a small FOV portion covering the fovea most of the time. That’s why I believe you need a larger foveal FOV, even if it is dynamically adjusted based on eye tracking.

    2) Any ideas on frame persistence? The photos with the blanking hint that it is very long, ~90% frame persistence, and not closer to 1ms. Frame persistence is more of an issue with VR due to motion blur-induced nausea, but for this kind of AR specifically, it’s a big deal because with such tiny pixel density at the retina, any minimal amount of motion blur can eliminate the achieved retina resolution by different scanlines overlapping due to eye motion, and also produce a wobbly image.
    If the scanner could scan faster, I’m sure they would just increase the foveal FOV first.

    • 1) Regarding the foveal projector FOV and eye-tracking: for a foveated display with a moving foveal projector to work, the eye-tracking and display positioning have to be very accurate. Otherwise, it will be putting the image in the wrong place. I know Varjo’s “static foveated display” (non-moving high-resolution center) works moderately well, and they don’t move the image at all, while their foveal area is about 2x larger linearly in X and Y. The sharpest part of a person’s vision is only about 3 degrees, and the key requirement is to make sure the blend area between the foveal and peripheral displays is in a low enough part of the retina that the person can’t perceive it. Therefore, I am more concerned about whether the person will detect a positional error (the image will be seen to wiggle or be in the wrong place) than see the boundary.

      There may also be issues with how well the eye “blanks” during saccades. I have never seen or heard of a foveated display with a moving foveal projector working. Those working on them have not shown them, so I suspect they are not perfect. Thus, EyeWay demonstrating this capability would seem to me to be a major advancement if they can do it.

      2) With a laser scanning display there is zero persistence. The EyeWay system is just a prototype so whatever they are doing now is far from final. I would think that for the peripheral display where flicker would be more apparent, they would eventually want the refresh rate to be over 100Hz (non-interlaced) to eliminate flicker perception. You raise another interesting issue for trying to prevent having a wobbly image and motion blur issues. The basic concept of the foveated display is that they could have two fast displays since they are not trying to support such high resolution.

      • 1) I agree. What I meant instead is that low-accuracy tracking demands a higher foveal FOV, so that when the real eye fovea is off due to the accuracy of the tracking and positioning system, it is still within the digital foveal FOV. But I agree that a bigger issue may be “breakup” of the combined image due to the foveal and peripheral portions not being perfectly aligned.
        Also, don’t forget that if you aim to support only the central few degrees for the fovea, there’s still the relatively high resolution falloff outside of it. I don’t have the diagrams on hand, but I think if you want an FHD peripheral image to not be noticeable, you need the foveal portion to be at least 20 degrees; otherwise, you will notice where the peripheral portion begins even with perfect eye tracking, because your vision will detect the digital periphery being too blurry.

        2) Sorry, I didn’t mean to say persistence; I meant the time it takes for the whole image to be displayed (scanned). From my understanding, the source of motion blur from such scanned images wouldn’t be the pixels themselves trailing on the retina, but the scanlines ending up where the previous scanlines landed.
        This could maybe be compensated for by having a frame buffer and deforming the pixel content of each scanline depending on head motion. Didn’t VR OLEDs do something similar?

        I’m not sure if with LBS you can easily increase the refresh rate by decreasing resolution. You end up scanning less due to low vertical resolution but at the same time scan more due to repeating the scan two or more times.

      • 1) I would agree that they might be too small or at least at the margin. Based on the classic graphs, at 6 degrees on the temporal side, the visual acuity is about 40% of its peak, and at 6 degrees on the nasal side, it is about 25% of the peak. Then again, the blind spot starts at only 12 degrees on the nasal side. There are so many factors in the way the human visual system works that it is hard to know without seeing it. It is also possible that different people may have different sensitivities, as they do for flicker fusion and field sequential color breakup. I also agree that they might have some room to fine-tune the position “digitally” and not just with mirror positioning.

        2) There are definitely reasons to keep the image updated quickly and to track correctly. I don’t know what is implemented in AR, but I am familiar with the concept of modifying/warping the image in the output stream to keep a rendered image from being in the wrong place when responding to the user’s movement. I wrote a little about this issue in https://kguttag.com/2019/10/14/busting-sonys-ghostbusters-ar-display-and-a-little-sprite-history/

        3) Normally, what drives the speed limit of LBS is the horizontal (fast) scan frequency. The vertical scanning is much slower (by the number of scan lines plus blanking) and thus easier. So it is relatively easy to trade frame rate for resolution. Most of the time, LBS makes a bad set of compromises that end up with temporal artifacts due to being too slow and yet still have low resolution. The other major factor is the amount the mirror tilts. The wider it tilts, the harder it is to go faster. Some, such as Oqmented (https://oqmented.com/), are proposing Lissajous scanning, which should have an update-rate advantage (I have never seen one working, and the website seems to show simulations), but I wonder how well they can generate an image with the massively non-linear scan process that keeps crossing itself.
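
        To put rough numbers on trading frame rate for resolution, here is my back-of-envelope, assuming bidirectional scanning that draws two lines per fast-mirror cycle; the 27kHz figure is a typical MEMS fast-scan frequency, not any specific product’s spec:

        ```python
        # Back-of-envelope LBS frame rate: bidirectional scanning gives two
        # display lines per fast-mirror cycle. All numbers are illustrative.
        def max_frame_rate_hz(fast_scan_khz: float, lines: int,
                              blanking_fraction: float = 0.1) -> float:
            lines_per_second = 2.0 * fast_scan_khz * 1000.0
            return lines_per_second / (lines * (1.0 + blanking_fraction))

        print(f"{max_frame_rate_hz(27, 600):.0f} Hz at 600 lines")  # ~82 Hz
        print(f"{max_frame_rate_hz(27, 400):.0f} Hz at 400 lines")  # ~123 Hz
        ```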

    • Thanks. I will be reading the paper and watching the video. I might incorporate the information into the 3rd article in the series.

    • Thanks, it is an interesting article on the subject. I have heard from others that vision may not be totally blacked out. It is certainly difficult to verify. They basically come up with “tricks” and see if the user notices something.

      It will be interesting if EyeWay can solve all the issues and fool the eye/visual system. Note that EVERY display device at its core is an optical illusion, but that some work better than others.

  3. One should not underestimate how difficult it will be to miniaturize all of these features to a headset form factor — if it is possible at all. Creating a solution on a large optical bench is comparatively straight-forward, but optical designs and MEMS hardware do not simply scale. There is also the power consumption issue for a self-contained headset. Optimizing size, weight, power and cost are the truly difficult engineering challenges of any AR headset design.
    Engineering is the science and art of making trade-offs. I can show some really great performance in one area if I ignore all the other criteria.
    (I am an optical engineer recently retired from this field. )
