I just discovered something I have not seen reported anywhere else about what might be in Meta’s (aka Facebook) next-generation VR headset, Cambria, which was discussed in last month’s Meta Connect 2021 video.
While writing the next article on AWE ’21, which will primarily discuss Lynx’s passthrough AR, I started thinking about other “thin” VR-type optics. One of the main claims for Lynx’s optics was their use of a catadioptric optical module (a fancy name for a combination of refractive (lens) optics and mirrors) to make a thinner passthrough AR (VR with cameras for the real world) headset. I knew that Kopin had developed “pancake optics” to make the very small VR prototype shown by Panasonic at both CES 2020 and CES 2021. A quick search on “pancake optics” led me to a still frame image from the Meta Connect 2021 video.
It was fairly obvious that the optics used a mixture of polarizers and quarter waveplates. But to better understand what they were doing, I did a quick patent application search on “Facebook” and “pancake optics” and turned up about 30 applications, of which about ten seemed relevant. A few closely related applications (with the same figures and descriptions) had a figure that looked very similar to the one from Connect 2021. But these applications had something extra in the figure: a variable focus LC element (130).
QWPs are commonly used in polarized optics, and most AR optical systems have them someplace to help steer the light. Pancake optics leverage QWPs, so it is good to know some basic properties of QWPs to understand the optics diagrams.
QWPs are usually made of thin plastic films/sheets and are very transparent. When oriented correctly, a QWP retards one polarization component by a quarter-wavelength, converting linearly polarized light into circularly polarized light (either left- or right-handed) and vice versa. Passing light through a quarter-wave retardation twice in the same direction is effectively a half-wave retardation, which rotates linearly polarized light by 90 degrees. Light reflected off a mirror will flip circular polarization from right- to left-handed or vice versa (see the figure from Edmund Optics on the right). But note that linearly polarized light reflected off a mirror stays in the same linear polarization. By convention, one direction of linearly polarized light is called “s” polarization, and the perpendicular polarization is called “p.” Also by convention, an up or up/down arrow indicates “p” polarization, and a circle with a dot (indicating an arrow pointing in/out of the page) indicates “s” polarization.
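The two QWP properties above (one pass makes circular light, two passes rotate linear light by 90 degrees) can be checked with a small Jones-calculus sketch. This is just an illustration using NumPy; the matrix conventions and handedness signs are a common textbook choice, not anything from the Meta application:

```python
import numpy as np

# Jones vectors: [Ex, Ey]; horizontal linear polarization as the input
p_linear = np.array([1, 0], dtype=complex)

def qwp(theta):
    """Jones matrix of a quarter waveplate with its fast axis at angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    retarder = np.diag([1, 1j])  # quarter-wave (90 degree) phase lag between the axes
    return rot @ retarder @ rot.T

Q45 = qwp(np.pi / 4)  # QWP oriented at 45 degrees to the input polarization

one_pass = Q45 @ p_linear          # linear -> circular
two_pass = Q45 @ Q45 @ p_linear    # two quarter waves = half wave: linear rotated 90 deg

# Circular light has equal |Ex| and |Ey| with a 90-degree phase difference
print(np.round(np.abs(one_pass), 3))   # [0.707 0.707]
print(np.round(np.abs(two_pass), 3))   # [0. 1.] -> all the energy is now vertical
```

The double pass is exactly what happens in pancake optics when light crosses the same QWP before and after a mirror reflection.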
Kopin developed and has been promoting pancake optics with their OLED microdisplays for at least the last two years. The figure below was taken from the article, Commentary on All-Plastic Pancake Optics from Kopin by Chris Chinnock. Panasonic has shown Kopin’s OLED microdisplay with pancake optics at CES 2020 and 2021 (see below right).
Pancake optics save space by having a “folded path” where the light bounces back and forth through and/or off the same elements. The lens nearest the display device has a half-mirror coating and thus acts both as a refractive lens on the transmissive first pass and as a curved mirror with optical power when the light is reflected back.
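The fold comes at a well-known brightness cost: the light has to both transmit through and reflect off the same 50/50 coating, so simple arithmetic caps the best-case throughput at 25% before any polarizer or other losses. A trivial sketch:

```python
# Best-case throughput of a folded "pancake" path with a 50/50 half mirror:
# the light transmits through the coating once and reflects off it once.
transmit = 0.5   # first, transmissive pass through the 50/50 coating
reflect = 0.5    # second encounter, now acting as a curved mirror
throughput = transmit * reflect
print(throughput)  # 0.25 -> at least a 4x brightness penalty versus a simple lens
```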
Figures 1 and 7 from Facebook’s patent application 2020/0348528 below show the same basic structure as Kopin’s design. The lens nearest the display has a half-mirror coating on one side, so it also acts as both a lens and a mirror.
The above right figure in the Facebook application shows the basic path through the pancake optics. The display (110) could be an OLED with linear polarizers followed by QWP or an LC-type display emitting circularly polarized light (or something similar). Light from the display is left circularly polarized, and 50% will pass through a lens 120 with a 50/50 partial mirror coating 122. The light will be refracted by lens 120 and then pass through a QWP 124 as it exits the lens. The QWP will change the light from left-circular to s-linear-polarization. The light then goes through the variable LC lens 130 (more on this later).
The s-polarized light is then reflected off a polarizing beam splitter on the surface of lens 140 and travels back through variable lens 130 and on to QWP 124 on the first lens 120, which changes the s-polarization back to left-circular. The light is then reflected off the 50/50 mirrored surface 122, which also flips the left-circular light to right-circular. Since the 50/50 mirror is curved, it also bends the light. So element 120 acts as a lens on light going in one direction and a curved mirror on light going in the other direction.
Light exiting lens/mirror 120 goes through QWP 124, becomes linearly p-polarized, and passes through variable lens 130. The p-polarized light can then pass through the polarizing beam splitter 142 and be refracted by lens 140 as it exits toward the eye. The folded path makes the optics more compact, and element 120 does the work of two different elements.
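The whole folded path above can be condensed into a short trace. The element numbers are from the application; the polarization labels follow the article’s conventions and are a deliberate simplification, not a rigorous Jones-calculus model:

```python
# Polarization state after each element in the folded pancake path
# (element numbers from Facebook application 2020/0348528).
steps = [
    ("display 110 emits",               "left-circular"),
    ("50% passes half mirror 122",      "left-circular"),
    ("QWP 124, 1st pass",               "s-linear"),
    ("variable LC lens 130, 1st pass",  "s-linear"),
    ("PBS 142 reflects s",              "s-linear"),
    ("variable LC lens 130, 2nd pass",  "s-linear"),
    ("QWP 124, 2nd pass",               "left-circular"),
    ("half mirror 122 reflects",        "right-circular"),  # mirror flips handedness
    ("QWP 124, 3rd pass",               "p-linear"),
    ("PBS 142 transmits p",             "p-linear"),
    ("lens 140 refracts toward eye",    "p-linear"),
]
for element, polarization in steps:
    print(f"{element:32s} -> {polarization}")
```

The key point is that only light that has completed all three QWP passes arrives at the beam splitter as p-polarized and is allowed out toward the eye.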
There is one very big difference in the Meta patent application: a variable liquid crystal (LC) lens 130. Applying a voltage across the LC makes it act as a variable focus Fresnel lens, as shown in the figures below.
You will notice in figures 5 & 6 (above right) that the Meta application also discusses using multiple layers of LC lenses. Multiple thinner lenses would switch faster and give more focus options than a single thicker lens. An LC’s switching time is roughly proportional to the square of its thickness, so if the layer were ten times thinner, as discussed as a possibility in the application, it would switch about 100 times faster.
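The scaling above is just the square law applied to the thickness ratio; a one-line sketch makes the 100x figure concrete:

```python
# LC switching time scales roughly with the square of the layer thickness (t ~ d^2).
def relative_switching_time(thickness_ratio):
    """Switching time relative to the original layer, given d_new / d_old."""
    return thickness_ratio ** 2

print(relative_switching_time(1 / 10))  # ~0.01 -> about 100x faster, as in the application
```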
The fact that liquid crystals can make variable focus lenses is well known in the industry. I first saw a working device from DeepOptics used with a Lumus waveguide at CES 2018 (see: CES 2018 Part 1 – AR Overview). DeepOptics has been developing LC-controlled lenses for about ten years and has started selling 32°N polarized sunglasses with electrically controlled focus (shown below left and demonstrated in a short video).
Conceptually, it looks like Meta could be combining a Kopin-like pancake with the DeepOptics-like liquid lens technology.
Also, as I pointed out in Magic Leap 2 (Pt. 2): Possible Answers from Patent Applications, Magic Leap has shown that they have considered adding variable focus LC lenses to their designs. But as I wrote in that article, I tend to doubt that, in the end, Magic Leap 2 could afford the additional light loss from LC lenses.
The reason for the variable focusing is Vergence Accommodation Conflict (VAC). Quoting directly from the Meta application (with the figure on the right from Journal of Vision 2008):
“Further, current VR/AR/MR HMDs are often having the so-called vergence-accommodation conflict, where a stereoscopic image pair drives the vergence state of a user’s human visual system to arbitrary distances, but the accommodation or focusing state of the user’s eyes is optically driven towards a fixed distance. The vergence-accommodation conflict causes eye strain or headaches during prolonged VR/AR/MR sessions, significantly degrading the visual experience of the users. The disclosed pancake lens assembly and optical system thereof are directed to solve one or more problems set forth above and other problems.”
Addressing VAC is nothing new and a much-discussed problem in the AR and VR design communities. Perhaps most famously, it was at the core of Magic Leap’s origins (see here and here for my discussion of Magic Leap’s attempt at addressing VAC that turned out not to work well).
I have no “sources” telling me that Meta’s Cambria will have variable focus. The only evidence I have is several patent applications that otherwise look like what Meta showed at Connect 2021 but with the addition of the variable focus element. But as Cambria is supposed to be a higher-end product and VAC is considered a major issue with VR headsets, it does seem to add up. Someone else may have made the connection first, but I did not find it in my internet searching.
The Meta application’s approach is to track the eyes and adjust the focus distance the eyes perceive to agree with the vergence. From what I have read, if done well, it should largely address the headache and nausea issue. It’s not a “perfect” solution in terms of realism: not only will the point the eyes are aimed at agree in focus with the vergence, but everything else in the image, regardless of its virtual 3-D distance, will also appear at the same focus distance. Making it work better would require significantly more complex approaches, including focal planes (ex. Lightspace 3D), lightfields (ex. Creal), focal surfaces (ex. Oculus Research), or going all the way to (true) holograms (ex. Microsoft SIGGRAPH 2017).
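To show why eye tracking is enough to drive such a varifocal lens, here is a hypothetical sketch of the underlying trigonometry: the distance where the two gaze rays cross follows from the interpupillary distance and the convergence angle, and its reciprocal gives the diopter target for the variable lens. The function name and the numbers are illustrative, not from the application:

```python
import math

def vergence_distance_m(ipd_m, convergence_angle_rad):
    """Distance at which the two gaze rays cross, from the total convergence angle."""
    return (ipd_m / 2) / math.tan(convergence_angle_rad / 2)

ipd = 0.063                    # 63 mm interpupillary distance (illustrative)
angle = math.radians(2.0)      # total convergence angle from eye tracking (illustrative)
d = vergence_distance_m(ipd, angle)
print(f"vergence distance = {d:.2f} m, lens target = {1 / d:.2f} diopters")
```

A 2-degree convergence angle with a 63 mm IPD works out to a vergence distance of roughly 1.8 m, i.e., a lens target of a bit over half a diopter.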