Disney-Lenovo AR Headset – Part 1: Optics

Disney Announced Joint AR Development At D23

At its D23 Fan Convention in Anaheim on July 15th, 2017, Disney announced an Augmented Reality (AR) headset jointly developed with Lenovo. Below is a cropped, brightness-enhanced still frame captured from Disney’s “teaser” video.

Disney/Lenovo also released a video from an interview at the D23 convention which gave further details. As the interview showed (see right), the device is based on using a person’s cell phone as the display (similar to Google Cardboard and Samsung’s Gear VR).

Birdbath Optics

Based on analyzing the two videos plus some knowledge of optical systems, it is possible to figure out what they are doing in terms of the optical system. Below is a diagram of what I see them doing optically (you may want to open it in a separate window to view the figure alongside the discussion below).

All the visual evidence indicates that Disney/Lenovo are using a classical “birdbath” optical design (discussed in an article on March 3, 2017). The name “birdbath” comes from the use of a spherical semi-mirror with a beam splitter directing light into the mirror. Birdbath optics are used because they are relatively inexpensive, lightweight, support a wide field of view (FOV), and are “on axis” for minimal distortion and focusing issues.

The key element of the birdbath is the curved mirror, which is (usually) the only “power” (focus-changing) element. The beauty of mirror optics is that they have essentially zero chromatic aberrations, whereas it is difficult/expensive to reduce chromatic aberrations with lens optics.

The big drawbacks of birdbath optics are that they block a lot of light from both the display device and the real world, and that they produce double images from unwanted reflections of “waste” light. Both of these negative effects can be seen in the videos.

There would be no practical way (that I know of) to support a see-through display with a cell-phone-sized display using refractive (lens) optics such as those used with Google Cardboard or the Oculus Rift. The only practical ways I know of for supporting an AR/see-through display with a cell-phone-sized display all use curved combiners/mirrors.

Major Components

Beam Splitter – The design uses a roughly 50/50 semi-mirror beam splitter, which has a coating (typically an aluminum alloy, although it is often called “silver”) that lets about 50 percent of the light through while acting like a mirror for the other 50%. Polarizing beam splitters would be problematic with most phones and are much more expensive. Note that the beam splitter is arranged to kick the image from the phone toward the curved combiner and away from the person’s eyes; thus light from the display is first reflected and then makes a transmissive pass on its return.

Combiner – The combiner, a spherical semi-mirror, is the key to the optics and does multiple things. The combiner appears to also be about 50/50 transmissive/mirror. The curved mirror’s first job is to allow the user to focus on the phone’s display, which otherwise would be too close to a person’s eyes for comfortable focusing. The other job of the combiner is to combine the light/image from the “real world” with the display light; it does this with the semi-mirror letting light from the image reflect while light from the real world passes through toward the eye. The curved mirror has a significant optical power (focus) effect only on the reflected display light and causes very little distortion of the real world.
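
To make the combiner’s focusing job concrete, here is a minimal sketch using the thin-mirror equation (f = R/2, 1/do + 1/di = 1/f). The 200 mm radius and 90 mm folded display distance are made-up illustrative numbers, not measurements of the Disney-Lenovo optics:

```python
# Hedged sketch: how a spherical combiner pushes a nearby phone display out to a
# comfortable focus distance. All numbers below are illustrative assumptions.

def virtual_image_distance(radius_mm, object_mm):
    """Thin-mirror approximation: f = R/2 and 1/do + 1/di = 1/f.
    Returns the image distance; a negative value means a virtual image behind the mirror."""
    f = radius_mm / 2.0
    return 1.0 / (1.0 / f - 1.0 / object_mm)

# Assumed 200 mm radius combiner, display an effective 90 mm away via the folded path.
di = virtual_image_distance(200.0, 90.0)
print(f"virtual image at {di:.0f} mm")  # -900 mm => virtual image about 0.9 m from the mirror
```

With those assumed numbers, the virtual image lands roughly 0.9 meters away (with about 10X magnification), which is the kind of job the combiner has to do for a phone sitting only a few inches from the face.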

Clear Protective Shield

As best I can tell from the two videos, the shield is pretty much clear and serves no function other than to protect the rest of the optics.

Light Baffles Between Display Images

One thing visible in the picture at top is a set of stepped light baffles to keep down light cross-talk between the two eyes.

Light Loss (Follow the Red Path)

A huge downside of the birdbath design is the light loss, as illustrated in the diagram by the red arrow path, where the thickness of the arrows is roughly to scale with the relative amount of light. To keep things simple, I have assumed no other losses (there are typically 2% to 4% per surface).

Starting with 100% of the light leaving the phone display, about 50% of it goes through the beam splitter and is lost, while the other 50% is reflected toward the combiner. The combiner is also about 50% mirrored (a rough assumption), and thus 25% (0.5 x 0.5) of the display’s light has its focus changed and is reflected back toward the beam splitter. About 25% of the light also goes through the combiner and causes the image you can see in the picture on the left. The beam splitter in turn allows 50% of that 25%, or only about 12.5% of the light, to pass toward the eye. Allowing for some practical losses, less than 10% of the light from the phone makes it to the eye.
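
A minimal sketch of this light budget, assuming the 50/50 splits above plus my own rough guess of about 3% loss at each of eight additional air/glass surfaces:

```python
# Hedged sketch of the red-path light budget. The 50/50 splits are the article's
# rough assumptions; the per-surface loss and surface count are my own guesses.

beamsplitter_reflect = 0.50   # first pass: light kicked toward the combiner
combiner_reflect     = 0.50   # spherical semi-mirror reflectivity (rough assumption)
beamsplitter_pass    = 0.50   # second pass: light transmitted toward the eye
per_surface          = 0.97   # ~3% loss per surface (assumption)

ideal     = beamsplitter_reflect * combiner_reflect * beamsplitter_pass
practical = ideal * per_surface ** 8   # assumed eight extra surfaces along the path

print(f"ideal throughput:    {ideal:.1%}")      # 12.5%
print(f"with surface losses: {practical:.1%}")  # just under 10%
```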

Double Images and Contrast Loss (Follow the Green Dash Path)

Another major problem with birdbath optics is that the lost light will bounce around and cause double images and losses in contrast. If you follow the green path, as with the red path about 50% of the light will be reflected and 50% will pass through the beamsplitter (not shown on the green path). Unfortunately, a small percentage of the light that is supposed to pass through will be reflected at the glass/plastic-to-air interface as it tries to exit the beamsplitter, as indicated by the green and red dashed lines (part of the red dashed line is obscured). This dashed path will end up causing a faint/ghost image that is offset by the thickness of the beamsplitter tilted at 45 degrees. Depending on coatings, this ghost image could be from 1% to 5% of the brightness of the original image.
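
For a rough sense of where that 1% to 5% comes from, here is a sketch using the normal-incidence Fresnel formula for an uncoated plastic- or glass-to-air interface; the refractive indices are generic illustrative values, not known properties of this headset:

```python
# Hedged sketch: estimating the ghost reflection at an uncoated exit surface.

def fresnel_reflectance(n1, n2):
    """Normal-incidence Fresnel reflectance at an interface between indices n1 and n2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

print(f"uncoated acrylic/air interface: {fresnel_reflectance(1.49, 1.0):.1%}")  # ~3.9%
print(f"uncoated glass/air interface:   {fresnel_reflectance(1.52, 1.0):.1%}")  # ~4.3%
# An anti-reflective coating can pull this down toward ~1%, which brackets the
# 1% to 5% ghost-brightness range discussed above.
```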

The image on the left is a crop from a still frame from the video Disney showed at the D23 conference, with red arrows I added pointing to double/ghost images (click here for the uncropped image). The demo Disney gave was on a light background, and these double images would be even more noticeable on a dark background. The same type of vertically offset double image could be seen in the Osterhout Design Group (ODG) R8 and R9 headsets, which also use a birdbath optical path (see figure on the right).

A general problem with the birdbath design is that there is so much light “rattling around” in the optical wedge formed by the display surface (in this case the phone), the beamsplitter, and the combiner mirror. Note in the diagram that about 12.5% of the light returning from the combiner mirror is reflected off the beam splitter and heads back toward the phone. This light is eventually going to hit the front glass of the phone, and while much of it will be absorbed by the phone, some of it is going to reflect back, hit the beam splitter, and eventually make it to the eye.
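
A rough sketch of that stray path, assuming an uncoated-glass-like ~4% reflection off the phone’s face and that the bounced light then reuses the same 12.5% splitter/combiner/splitter path (both assumptions on my part):

```python
# Hedged sketch of the "rattling around" contribution to ghosting/contrast loss.
# The phone-glass reflectance and the reuse of the 12.5% path are assumptions.

returning_light     = 0.125  # fraction of display light headed back at the phone
phone_glass_reflect = 0.04   # assumed uncoated-glass-like reflection off the phone face
birdbath_throughput = 0.125  # same splitter -> combiner -> splitter path as the main image

stray = returning_light * phone_glass_reflect * birdbath_throughput
print(f"stray image vs. main image: {stray / birdbath_throughput:.2%}")  # ~0.5%
```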

About 80% of the Real World Light Is Blocked

In several frames of the D23 interview video, it was possible to see through the optics and make measurements of the relative brightness looking through versus around the optics. This measurement is only rough, and it helped to take it from several different images. The result was about a 4.5 to 5X difference in brightness looking through the optics.

Looking back at the blue/center line in the optical diagram, about 50% of the light is blocked by the partial-mirror combiner and then 50% of that light is blocked by the beam splitter, for a net of 25%. With other practical losses, including the shield, this comes close to roughly 80% (4/5ths) of the light being blocked.
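
A minimal sketch of the see-through path, assuming the two ~50% elements above plus a made-up ~8% for the shield and other practical losses:

```python
# Hedged sketch of the real-world (blue/center line) light path.

combiner_pass     = 0.50   # partial-mirror combiner transmission (rough assumption)
beamsplitter_pass = 0.50   # beam splitter transmission (rough assumption)
shield_and_other  = 0.92   # assumed shield plus miscellaneous practical losses

see_through = combiner_pass * beamsplitter_pass * shield_and_other
print(f"real-world light reaching the eye: {see_through:.0%}")          # ~23%, i.e. ~77-80% blocked
print(f"through vs. around the optics:     {1 / see_through:.1f}x dimmer")  # ~4.3x, near the measured 4.5-5x
```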

Is A Cell Phone Bright Enough?

For movies in a dark room, the ANSI/SMPTE 196M spec recommends about 55 nits. A cell phone typically has 500 to 800 peak nits (see DisplayMate’s shootouts for objective measurements), but after about a 90% optical loss the image would be down to between roughly 50 and 80 nits, which could be acceptably bright in a moderately dark room. But if the room lights are on, this will be at best marginal, even after allowing for the headset blocking about 75 to 80% of the room light between the combiner and the beam splitter.
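
The same arithmetic as a short sketch, using the ~10% throughput estimated earlier and typical phone peak-luminance figures:

```python
# Hedged sketch of the brightness math: typical phone peak luminance times the
# roughly 10% birdbath throughput, compared against the ~55 nit dark-room target.

smpte_dark_room_nits = 55    # ANSI/SMPTE 196M recommendation for a dark room
optical_throughput   = 0.10  # roughly 90% loss through the birdbath (estimate above)

for phone_peak_nits in (500, 800):
    at_eye = phone_peak_nits * optical_throughput
    verdict = "meets" if at_eye >= smpte_dark_room_nits else "falls short of"
    print(f"{phone_peak_nits} nit phone -> ~{at_eye:.0f} nits at the eye "
          f"({verdict} the ~55 nit dark-room target)")
```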

With AR you are not just looking at a blank wall. To make something look “solid” and non-transparent, the display image needs to “dominate” by being at least 2X brighter than anything behind it. It becomes even more questionable whether there is enough brightness unless there is not a lot of ambient light, everything in the background is dark colored, or the room lights are very dim.
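
A sketch of what that 2X “dominance” rule implies, assuming the roughly 20-25% see-through transmission estimated above (all numbers are rough):

```python
# Hedged sketch: how bright can the real-world scene behind a ~50-80 nit virtual
# image be before the image stops looking solid? Transmission value is an estimate.

see_through_transmission = 0.22  # assumed fraction of real-world light reaching the eye
dominance_ratio          = 2.0   # display should be at least 2x the background behind it

for display_at_eye_nits in (50, 80):
    max_scene = display_at_eye_nits / dominance_ratio / see_through_transmission
    print(f"{display_at_eye_nits} nits of display -> scene behind it must stay "
          f"under ~{max_scene:.0f} nits")
```

With these made-up numbers, the scene directly behind a virtual object would need to stay in the low hundreds of nits as seen directly, which again points to dim rooms or dark backgrounds.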

Note that LCOS- or DLP-based see-through AR systems can start with about 10 to 30 times or more the brightness (nits) of a cell phone. They do this so they can work in a variety of lighting conditions after all the other light losses in the system.

Alternative Optical Solution – Meta-2 “Type”

Using a large display like a cell phone rather than a microdisplay severely limits the optical choices for a see-through display. Refractive (lens) optics, for example, would be huge and expensive, and Fresnel optics come with their own optical issues.

Meta-2 “Bug-Eye” Combiners

The most obvious alternative to the birdbath would be to go with dual large combiners such as the Meta-2 approach (see left). When I first saw the Disney-Lenovo design, I even thought it might be using the Meta-2 approach (disproven on closer inspection). With Meta-2, the beam splitter is eliminated and two much larger semi-circular combiners (giving a “bug-eye” look) have a direct path to the display. Still, the bug-eye combiners are not that much larger than the shield on the Disney-Lenovo system. Immediately, you should notice how the user’s eyes are visible, which shows how much more light is getting through.

Because there is no beamsplitter, the Meta-2 design is much more optically efficient. Rough measurements from pictures suggest the Meta-2’s combiners pass about 60% of the light and thus reflect about 40%. This means that with the same display, the Meta-2 approach would make the display appear 3 to 4 times brighter while letting about 2.5X as much real-world light through compared to the Disney-Lenovo birdbath design.
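
A quick sketch of the comparison, using the rough numbers above (Meta-2 combiner ~40% reflective / ~60% transmissive versus the birdbath’s ~12.5% display path and ~25% see-through path):

```python
# Hedged sketch comparing the two architectures with the article's rough estimates.

meta2_display_to_eye    = 0.40   # one reflection off the Meta-2-style combiner
meta2_world_to_eye      = 0.60   # one pass through the combiner
birdbath_display_to_eye = 0.125  # splitter -> combiner -> splitter
birdbath_world_to_eye   = 0.25   # combiner then splitter

print(f"display brightness advantage:    {meta2_display_to_eye / birdbath_display_to_eye:.1f}x")  # ~3.2x
print(f"real-world brightness advantage: {meta2_world_to_eye / birdbath_world_to_eye:.1f}x")      # ~2.4x
```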

I have not tested a Meta-2, nor have I read any serious technical evaluation (just the usual “ooh-wow” articles), and I have some concerns with the Meta design. The Meta-2 is “off-axis” in that the display is not perfectly perpendicular to the combiner. One of the virtues of the birdbath is that it results in a straightforward on-axis design. With the off-axis design, I wonder how well the focus distance is controlled across the FOV.

Also, the Meta-2 combiners are so far from the eyes that a person’s two eyes would have optical cross-talk (there is nothing to keep one eye from seeing what the other eye is seeing, such as the baffles in the Disney-Lenovo design). I don’t know how this would affect things in stereo use, but I would be concerned.

In terms of simple image quality, I would think the advantage goes to the single bug-eye style combiner. There are no secondary reflections caused by the beamsplitter, and both the display and the real world would be significantly brighter. In terms of cost, I see pros and cons for each design and overall not a huge difference, assuming both designs start with a cell phone display. In terms of weight, I don’t see much of a difference either.

Conclusions

To begin with, I would not expect even good image quality out of a phone-as-a-display AR headset. Even totally purpose-built AR displays have their problems. Making a device “see-through” generally makes everything more difficult/expensive.

The optical design has to be compromised right from the start to support both LCD and OLED phones that could have different sizes. Making matters worse is the birdbath design with its huge light losses. Add to this the inherent reflections in the birdbath design and I don’t have high hopes for the image quality.

It seems to me a very heavy “lift” even for the Disney and Star Wars brands. We don’t have any details as to the image tracking and room tracking, but I would expect that, like the optics, it will be done on the cheap. I have no inside knowledge, but it almost looks to me like the solution was designed around supporting the Jedi light saber shown in the teaser video (right). They need the see-through aspect so the user can see the light saber. But making the headset see-through is a long way to go to support the saber.

BTW, I’m a big Disney fan from way back (I have been to the Disney parks around the world multiple times, attended D23 conventions, eaten at Club 33, was a member of the “Advisory Council” in 1999-2000, own over 100 books on Disney, and own one of the largest 1960s-era Disneyland Schuco monorail collections in the world). I have an understanding and appreciation of Disney fandom, so this is not a knock on Disney in general.

Karl Guttag

4 Comments

    • Thanks. This looks like another “birdbath” design, but unlike the Disney-Lenovo one that uses a phone, this one appears to be using microdisplays and has “only” a 40-degree FOV. It is currently axiomatic that if a design has a wide (>60 degree) FOV it is using a larger flat-panel technology, and if it has less than a 60-degree FOV it is using a microdisplay. At 40 degrees, I’m guessing it is using around a 720p device per eye, most likely LCOS if they want to keep it a “consumer cost” device.

      This looks in many ways like a lower cost version of the ODG R8/R9.

  1. I have purchased it, and much of what you suspect is confirmed. Your diagram looks very accurate to my uneducated eye.

    All the issues you mention are present: a slight double image, lots of errant reflections (it isn’t just visual; some of them affect the content, as you’ll see reflections of enemies on the peripheral edges that aren’t there and confuse them with ones that are), and the darkness.

    One thing I do notice is that the darkness does help with the occlusion issues you get from mobile-screen AR; you don’t notice much that the characters are behind walls or furniture because your eyes just focus on the characters and the rest is dark. I tried it in a small room and barely noticed most of the characters were behind a wall.

    It actually feels closer to a VR experience than an AR one; there is a certain immersive “shut out the world” feeling the device invites. The ‘blinder’ effect of not seeing what is outside the FOV removes the constant reminder that it is all fake, which makes it much better for a cool demo to the uninitiated than the HoloLens in this regard, where seeing the edges of what is AR vs. what is not constantly takes you out of the experience.

    Owning both the Mira and this, I can see why they went more with this style for an entertainment product (vs. how you’d use a see-through display for productivity or daily enhancements); the objects feel more 3D and solid vs. being a reflection. The characters have a decent sense of presence.

    There is a lot to improve with better tracking (I had to throw a sheet over my shiny hardwood floors or it jumped all over), and the controller has to be “re-centered” for calibration every 30 seconds, etc., but it is impressive for a $200 introduction to an AR entertainment headset. In many ways, it is the best ‘self-contained retail VR’ experience I’ve had; a pretty good first taste for consumers to try.
