Part 1 of EyeWay Vision: Foveated Laser Scanning Display gave an overview of EyeWay’s technology. As I will discuss further in the next article (Part 3), EyeWay is one of the most technically interesting developments I have seen in Augmented Reality. While they are far from finished, they are working on multiple important issues.
In this part, I want to share some high-resolution photographs of EyeWay’s display. As discussed in Part 1, these photographs were provided by EyeWay directly out of the camera. I did a little cropping and converted them from Sony Camera RAW to JPEGs to make them easy to view. In addition to viewing the display with my own eyes, I took some similar pictures. Due to some mechanical interference from the prototype, the pictures from EyeWay came out a little better, so I am using them here.
In these camera pictures, you will see things you won’t see when viewing the display live. EyeWay is working on a foveal display where the high-resolution projected image will track the eye. As we will see in these pictures, the high-resolution foveal projected image behaves differently from the peripheral image.
Something I forgot to mention in Part 1 is that because EyeWay’s foveated eye-tracking moves both the foveal and peripheral projections, the eye-tracked movement of the display increases the effective/perceivable FOV. This should effectively add about 20 degrees to the perceived FOV.
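As a back-of-the-envelope sketch (with purely illustrative numbers, not EyeWay’s specifications): when the entire projected image steers with the eye, the eye-tracked steering range adds directly to the optical FOV.

```python
def perceived_fov(optical_fov_deg: float, steer_range_deg: float) -> float:
    """Perceived FOV when the projected image tracks the eye.

    Because the display's optical FOV moves with the eye, the total
    region of the scene the user can bring into view grows by the
    tracked steering range.
    """
    return optical_fov_deg + steer_range_deg

# Illustrative only: a 50-degree optical FOV plus ~20 degrees of
# eye-tracked movement gives a ~70-degree perceived FOV.
print(perceived_fov(50, 20))  # -> 70
```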
In Part 1, I discussed the size of EyeWay’s foveal projected image and questioned whether it was big enough to avoid detection of the transition region. EyeWay has since assured me that their experiments show that the foveal area is larger than necessary.
The EyeWay system is a prototype that has gone through many demos, much handling, and shipping. There is some dust and there are scratches on the unprotected optics. We are getting a peek at a work in progress.
I would also like to include below a comment on the subject of EyeWay’s eye tracking by Kenneth Holmqvist. It includes links to a study of an earlier version of EyeWay’s eye tracking.
You may be interested to know that we have tested an earlier version of the EyeWay eye tracker against the DPI and the Tobii Spectrum. It scores better in most tasks relevant for an eye tracker controlling the direction of the laser scanner. You can find our paper here: https://dl.acm.org/doi/pdf/10.1145/3379155.3391330
There is also a video of my presentation at ETRA of the same validation study: https://youtu.be/7Sa31F8esTo
Below is a picture from a Sony camera. It captured about 90% of the prototype’s horizontal and vertical FOV. I have marked two types of spots on the image. Some of the spots (in red ovals and numbered) are caused by dust on the camera lens and/or sensor. The camera spots stayed in the same place from photograph to photograph, whereas dust in EyeWay’s optics moved relative to the camera.
The key thing to note is that there are no spots of any kind (optics or camera) in the region of the foveal projected image. This proves that the foveal projected image is non-Maxwellian and has light rays that appear to come from a given distance. EyeWay can change this apparent focus distance to prevent vergence-accommodation conflict (VAC).
The foveal display is “non-Maxwellian” and does not cast shadows of any dust on the optics or the camera (or of floaters in your eye). With a foveated display, any such spots should be outside the eye’s ability to see them.
The image below is a crop of the full image, so you can see the foveated projected image clearly without opening the whole image. Note how there are no dust spots in the foveal display as outlined by the dotted line.
I have seen many laser-scanned images, both near-eye and on-screen, and EyeWay’s foveal image is the first laser-projected image that didn’t look like it was laser projected. It also does not look like it came from a fixed-pixel display that introduces jaggies/aliasing. EyeWay clearly uses a much better scanning process than I have seen with other laser projectors, in both the foveal and peripheral regions.
In terms of resolution, EyeWay claims they are generating about 60 pixels per degree in the foveal region, based on the number of scan lines. I would judge the effective resolution to be closer to 40 pixels per degree, which should still be more than enough for typical AR applications.
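To make the claimed-versus-effective distinction concrete, the figure comes down to a simple ratio of scan lines to angular size. A minimal sketch, where the roughly 6-degree foveal size is discussed in the comments below and the scan-line counts are my own illustrative assumptions:

```python
def ppd_from_scan_lines(scan_lines: int, fov_deg: float) -> float:
    """Angular resolution implied by counting scan lines across a FOV."""
    return scan_lines / fov_deg

# If the foveal region spans ~6 degrees and is drawn with ~360 scan
# lines, the claimed figure works out to 60 lines ("pixels") per degree:
print(ppd_from_scan_lines(360, 6.0))  # -> 60.0

# An effective resolution of 40 PPD corresponds to resolving detail as
# if only ~240 of those lines were fully independent:
print(ppd_from_scan_lines(240, 6.0))  # -> 40.0
```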
Below are a set of high-resolution images without my markings. As discussed previously, the prototype has been used quite a bit and has dust spots in the projection path. Maxwellian direct laser scanning (the peripheral display) casts shadows from anything in the laser path. With a fully working foveal display, even a large dust spot seen here would not be perceivable by the eye.
I find the picture of the woman below perhaps does the best job of demonstrating EyeWay’s potential. In particular, I like looking at the detail in the eye on the left below (foveal projection).
The next image shows the effect of the foveated display in the small image below (click on the image for the high-resolution picture). With the busyness of the picture, it is harder to see the transition from foveal to peripheral regions.
Below is another colorful picture.
The picture below demonstrates once again the detail given by the foveal display and the richness of the colors.
In Part 3, I plan to go through why I think EyeWay’s approach is technically very interesting. EyeWay is working on key problems that other AR design approaches probably cannot solve.
1) What do you mean when you say that the “effective” PPD will be closer to 40 than 60?
2) If that’s the case how is this any better than 2K x 2K LCoS without eye tracking covering 50×50 degree FOV?
3) If the foveal and peripheral parts use the same lasers, any ideas why there is brightness difference between the two sections? I’d imagine it would be fairly easy to calibrate even on a prototype.
EyeWay did not provide test charts of known resolution (such as the ones on this blog). I have looked at the same image on other devices with known resolutions. Scanning systems don’t have “pixels” per se, so you have to make more of a judgment comparison. The EyeWay display seems to have lower resolution than devices with 60 PPD; it looks closer to 40 PPD, but better than 30 PPD.
It is not in its current form, but it shows a lot of growth potential. With the foveated eye-tracking, they should get the effect of more than a 70-degree FOV, as the display’s FOV expands as the eye moves. I plan to get into these issues more in my next article.
The key issue for a foveated display is the transition region and the human visual system’s ability to sense it.
There are multiple reasons why the two laser scanning regions won’t match. First, lasers vary significantly in brightness and wavelength from device to device. Then there is the issue that the peripheral display covers a much larger area than the foveal display and thus its beam must be brighter at any given time. The two projectors have different optics that manipulate the beams to different diameters. Then you have the whole issue with laser scanning that much of the lasers’ control range is used up compensating for the scanning process (the scanning beam is constantly changing velocity, and thus, for the same brightness per unit area, the brightness of the beam must constantly change).
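A minimal sketch of that last point, assuming a sinusoidally driven resonant scan mirror (a common arrangement in laser-beam scanning; EyeWay’s actual drive scheme is not disclosed):

```python
import math

def drive_power(phase: float, nominal_power: float) -> float:
    """Laser power needed for uniform brightness per unit angle.

    With mirror angle theta(t) = A*sin(phase), the angular velocity
    scales with cos(phase): the beam sweeps fastest at the center of the
    scan and slows to zero at the edges.  To keep brightness per unit
    angle constant, laser power must track that velocity, which uses up
    much of the laser's controllable dynamic range.
    """
    velocity_factor = abs(math.cos(phase))  # 1.0 at center, 0.0 at turnaround
    return nominal_power * velocity_factor

print(drive_power(0.0, 1.0))                     # center of sweep: full power
print(round(drive_power(math.pi / 3, 1.0), 2))   # partway out: half power
```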
Regarding the FOV point, I don’t see how this has growth potential over other technologies. If you have two mirrors to steer the image to where you are looking at, the overall steerable FOV is (a) still dictated by the waveguide or other eyepiece optics involved and (b) not exclusive to LBS image source.
For reference the Tobii, SMI and Pupil Labs eye trackers used for foveated rendering usually have a much larger foveal section on the image. This is also true for the foveated rendering papers by Microsoft. We are talking about 20-30 degrees foveal section followed by a wide transition region, the near-peripheral section.
Making a much smaller but accurate and fast 5-6 degree foveal section is the holy grail for VR as well, as it will allow VR to run PC-class real-time 3D graphics on mobile chipsets without compromises. I have yet to see something promising here.
Even if the EyeWay eye tracker performs better than the above trackers, it remains to be seen whether it is good enough for a tiny 6-degree foveal section.
Foveated rendering, while in some ways similar, has significantly different precision requirements and trade-offs. Rendering can get away with much looser tracking of the eye. Simple pupil tracking is more than sufficient, and the only “penalty” for sloppier tracking is bigger foveal and transition regions.
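A rough way to see that trade-off: sloppier tracking can be absorbed by enlarging the high-resolution region. A sketch with illustrative numbers (the ~5-degree fovea figure is approximate, and the one-margin-per-side rule is my simplification):

```python
def foveal_region_size(fovea_deg: float, tracking_error_deg: float) -> float:
    """High-resolution region needed so the fovea stays covered.

    The rendered foveal region must extend beyond the fovea by the
    worst-case tracking error on every side, so the region grows by
    twice the error (one margin per side).
    """
    return fovea_deg + 2.0 * tracking_error_deg

# A ~5-degree fovea with +/-1 degree of tracking error needs a
# ~7-degree high-resolution region; +/-3 degrees pushes it to ~11.
print(foveal_region_size(5.0, 1.0))  # -> 7.0
print(foveal_region_size(5.0, 3.0))  # -> 11.0
```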
I doubt that just tracking the pupil will be sufficient for a foveated display. EyeWay’s tracking goes all the way to tracking the retina. I think this may be essential to get the foveal image to “lock” and not appear to be moving when it should not be.
Let’s say EyeWay is wrong, and the foveal image needs to be bigger to avoid noticing the transition region. There is nothing to say they could not make it bigger in the future. I am more interested in their ability to precisely track the eye and in how the eye perceives movement of the image in the foveal region.
Hey Karl, the test images you show appear to have black regions; am I to assume this is all shot on a very dark field? There’s nothing new under the sun (or on the retina) that would allow LBS systems in general or EyeWay in particular to generate imagery that’s darker than its background… right?
Impressive overview as always, thanks for making this knowledge public!
You are generally correct. The background is not absolutely dark but the ambient is not nearly as bright as the display. The eye has a much larger dynamic range than the camera.
There is no “magic” way with lasers to project black. Just like other displays, they can only add light.