Meta (aka Facebook) Cambria Electrically Controllable LC Lens for VAC?

Introduction – Surprise Finding on Meta Cambria

I just discovered something I have not seen reported anywhere else about what might be in Meta’s (aka Facebook) next-generation VR headset Cambria, discussed in last month’s Meta Connect 2021 video.

While writing the next article on AWE’21, which will primarily discuss Lynx passthrough-AR, I started thinking about other “thin” VR-type optics. One of the main claims for Lynx’s optics was their use of a catadioptric optical module (a fancy name for a combination of refractive (lens) optics and mirrors) to make a thinner passthrough-AR (VR with cameras for the real world) headset. I knew that Kopin had developed “pancake optics” to make possible a very small VR prototype shown by Panasonic at both CES 2020 and CES 2021. A quick search on “pancake optics” led me to a still frame image from the Meta Connect 2021 video.

It was fairly obvious that the optics used a mixture of polarizers and quarter waveplates. But to better understand what they were doing, I did a quick patent application search on Facebook and pancake optics and turned up about 30 patents, of which about ten seemed relevant. A few closely related applications (with the same figures and descriptions) had a figure that looked very similar to the one from Connect 2021. But these applications had something extra in the figure, a variable focus LC element (130).

Quarter Waveplate (QWP) Retarders (Quick Background)

QWPs are commonly used in polarized optics, and most AR optical systems have them someplace to help steer the light. Pancake optics leverage QWPs, so it is good to know some basic properties of QWPs to understand the optics diagrams.

Figure 3: A reflective surface switches the direction of the polarization of light and is commonly used in optical isolation techniques.
From Edmund Optics “Polymer Polarizers and Retarders”

QWPs are usually made of thin plastic films/sheets and are very transparent. When oriented correctly, a QWP converts linearly polarized light into circularly polarized light (and vice versa) by retarding its phase by a quarter wavelength, either left- or right-handed. If light passes through a QWP in the same direction twice, effectively a “half-wave” of total retardation, its linear polarization will be rotated by 90 degrees. Light reflected off a mirror will change from right- to left-circular polarization or vice versa (see figure from Edmund Optics on the right). But note that linearly polarized light reflected off a mirror stays in the same linear polarization. By convention, one direction of linearly polarized light is called “s” polarization, and the perpendicular polarization is called “p.” Also by convention, an up/down arrow indicates “p” polarization, and a circle with a dot (indicating an arrow pointing in/out of the page) indicates “s” polarization.
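
The QWP behavior described above can be checked with a quick Jones-calculus sketch. This is an idealized model (lossless components, perfect quarter-wave retardation), not anything specific to Meta’s optics; it just demonstrates that one pass makes circular polarization and two passes rotate linear polarization by 90 degrees.

```python
import numpy as np

# Jones vectors (column vectors) for linear polarization states
H = np.array([1, 0], dtype=complex)   # horizontal linear
V = np.array([0, 1], dtype=complex)   # vertical linear

# Jones matrix of an ideal quarter waveplate with its fast axis at 45 degrees
QWP45 = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                     [1j, 1]])

def is_circular(j):
    """Circular = equal amplitudes with a +/-90 degree phase difference."""
    a, b = j
    return bool(np.isclose(abs(a), abs(b)) and
                np.isclose(abs(np.angle(b / a)), np.pi / 2))

# One pass: linear light becomes circular
once = QWP45 @ H
print(is_circular(once))                          # True

# Two passes (a "half wave" total): linear polarization rotated 90 degrees
twice = QWP45 @ QWP45 @ H
print(np.isclose(abs(np.vdot(twice, V)), 1.0))    # True (vertical, up to phase)
```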

Pancake Optics Kopin/Panasonic and then Meta

Kopin has been developing and promoting pancake optics with their OLED microdisplays for at least the last two years. The figure below was taken from an article, Commentary on All-Plastic Pancake Optics from Kopin by Chris Chinnock. Panasonic has shown Kopin’s OLED microdisplay with pancake optics at CES 2020 and 2021 (see below right).

Pancake optics save space by having a “folded path” where the light bounces back and forth through and/or off the same elements. The lens nearest the display device has a half-mirror coating and thus acts both as a refractive lens on the transmissive first pass and as a curved mirror with optical power when the light is reflected.

Figures 1 and 7 from Facebook’s patent application 2020/0348528 below show the same basic structure as Kopin’s design. The lens nearest the display has a half-mirror coating on one side, so it also acts as both a lens and a mirror.

The above right figure in the Facebook application shows the basic path through the pancake optics. The display (110) could be an OLED with a linear polarizer followed by a QWP, or an LC-type display emitting circularly polarized light (or something similar). Light from the display is left circularly polarized, and 50% will pass through lens 120 with a 50/50 partial mirror coating 122. The light will be refracted by lens 120 and then pass through a QWP 124 as it exits the lens. The QWP will change the light from left-circular to s-linear polarization. The light then goes through the variable LC lens 130 (more on this later).

The s-polarized light is then reflected off a polarizing beam splitter on the surface of lens 140, back through variable lens 130, and on to QWP 124 on the first lens 120, which changes the s-polarization back to left-circular. The light will then be reflected off the 50/50 mirrored surface 122, which will also cause the left-circular light to become right-circular. Since the 50/50 mirror is curved, it will also bend the light. So element 120 acts as a lens on light going in one direction and a curved mirror on light going in the other direction.

Light exiting lens/mirror 120 goes through QWP 124, becomes linearly p-polarized, and passes through variable lens 130. The p-polarized light can then pass through the polarizing beam splitter 142 and be refracted by lens 140 as it exits toward the eye. The folded path makes the optics more compact, and element 120 works as two different elements.
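
The polarization sequence through the folded path can be written as a small symbolic trace. The state transitions follow the article’s description of the application’s figure; this is an idealized model that ignores the 50% losses at the half mirror and treats the LC lens 130 as polarization-neutral.

```python
# Symbolic trace of the folded "pancake" path (element numbers from
# Facebook application 2020/0348528 as described in the text).
# States: 'L'/'R' = left/right circular, 's'/'p' = linear polarization.

def qwp(state):
    """Quarter waveplate 124: converts circular <-> linear."""
    return {'L': 's', 's': 'L', 'R': 'p', 'p': 'R'}[state]

def mirror(state):
    """Reflection off the 50/50 mirror 122 flips circular handedness."""
    return {'L': 'R', 'R': 'L'}[state]

def reflective_polarizer(state):
    """Polarizing beam splitter 142: reflects 's', transmits 'p'."""
    return ('reflected', state) if state == 's' else ('transmitted', state)

state = 'L'                                   # display 110 emits left-circular
state = qwp(state)                            # first pass through QWP 124 -> 's'
action, state = reflective_polarizer(state)   # 's' is reflected back
assert action == 'reflected'
state = qwp(state)                            # back through QWP 124 -> 'L'
state = mirror(state)                         # off 50/50 mirror 122 -> 'R'
state = qwp(state)                            # through QWP 124 again -> 'p'
action, state = reflective_polarizer(state)   # 'p' passes through to the eye
print(action, state)                          # transmitted p
```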

Meta’s Segmented Phase Profile (SPP) Variable Liquid Crystal Lens (Also DeepOptics and Magic Leap)

There is one very big difference in the Meta patent application: a variable liquid crystal (LC) lens 130. With a voltage applied across it, the LC acts as a variable-focus Fresnel lens, as shown in the figures below.

You will notice that in Figures 5 & 6 (above right), the Meta application also discusses the use of multiple layers of LC lenses. Multiple thinner lenses would switch faster and give more focus options than a single thicker lens. LC switching speed is roughly proportional to the square of the LC thickness, so if a layer were ten times thinner, as discussed as a possibility in the application, it would switch about 100 times faster.
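
The thickness-squared scaling works out as follows. The 20 ms reference value below is purely illustrative (not a number from the application); only the scaling rule comes from the text.

```python
# The article's scaling rule: LC switching time is roughly proportional
# to the square of the LC layer thickness.
def scaled_switch_time(t_ref_ms, thickness_ratio):
    """Switching time for a layer thickness_ratio times the reference thickness."""
    return t_ref_ms * thickness_ratio ** 2

# Hypothetical 20 ms reference layer; a layer ten times thinner:
print(scaled_switch_time(20.0, 1 / 10))   # 0.2 ms, i.e., ~100x faster
```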

DeepOptics LC Switchable Lens

The fact that liquid crystals can make variable-focus lenses is well known in the industry. I first saw a working device from DeepOptics used with a Lumus waveguide at CES 2018 (see: CES 2018 Part 1 – AR Overview). DeepOptics has been developing LC-controlled lenses for about ten years and has started selling 32°N polarized sunglasses with electrically controlled focus (shown below left and demonstrated in a short video).

Conceptually, it looks like Meta could be combining Kopin-like pancake optics with DeepOptics-like LC lens technology.

Also, as I pointed out in Magic Leap 2 (Pt. 2): Possible Answers from Patent Applications, Magic Leap has also shown that they have considered adding variable-focus LC lenses to their designs. But as I wrote in that article, I tend to doubt that in the end Magic Leap 2 could afford the additional light loss from LC lenses.

Vergence Accommodation Conflict (VAC)

The vergence-accommodation conflict.

The reason for the variable focusing is Vergence Accommodation Conflict (VAC). Quoting directly from the Meta application (with the figure on the right from Journal of Vision 2008):

“Further, current VR/AR/MR HMDs are often having the so-called vergence-accommodation conflict, where a stereoscopic image pair drives the vergence state of a user’s human visual system to arbitrary distances, but the accommodation or focusing state of the user’s eyes is optically driven towards a fixed distance. The vergence-accommodation conflict causes eye strain or headaches during prolonged VR/AR/MR sessions, significantly degrading the visual experience of the users. The disclosed pancake lens assembly and optical system thereof are directed to solve one or more problems set forth above and other problems.”

Addressing VAC is nothing new, and it is a much-discussed problem in the AR and VR design communities. Perhaps most famously, it was at the core of Magic Leap’s origins (see here and here for my discussion of Magic Leap’s attempt at addressing VAC that turned out not to work well).


I have no “sources” telling me that Meta’s Cambria will have variable focus. The only evidence I have are several patent applications that otherwise look like what Meta showed at Connect 2021 but with the addition of the variable focus element. But as Cambria is supposed to be a higher-end product and VAC is considered a major issue with VR headsets, it does seem to add up. Someone else may have made the connection first, but I did not find it in my internet searching.

The Meta application’s approach is to track the eyes and adjust the focus distance the eyes perceive to agree with the vergence. From what I have read, if done well, it should largely address the headache and nausea issues. It’s not a “perfect” solution in terms of realism. While the focus at the point the eyes are aimed at will agree with the vergence, everything else in the image, regardless of its virtual 3-D distance, will also appear to be at that same focus distance. Making it work better would require significantly more complex approaches, including focus planes (ex. Lightspace 3D), Lightfields (ex. Creal), focus surfaces (ex. Oculus Research), or going all the way to (true) holograms (ex. Microsoft Siggraph 2017).
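
A minimal sketch of the geometry involved (not Meta’s actual algorithm, and the angle and IPD values are hypothetical): given an eye-tracked vergence angle and an assumed interpupillary distance, compute the distance where the two gaze rays cross and the lens power, in diopters, that would place the focus there.

```python
import math

# Interpupillary distance, ~63 mm (a typical adult value, assumed here)
IPD_M = 0.063

def vergence_distance_m(ipd_m, vergence_angle_deg):
    """Distance at which the two gaze rays cross, from the total vergence angle."""
    half = math.radians(vergence_angle_deg) / 2
    return (ipd_m / 2) / math.tan(half)

def lens_diopters(distance_m):
    """Optical power that places the focal distance at the vergence distance."""
    return 1.0 / distance_m

# Hypothetical eye-tracker reading: 3.6 degrees of total convergence
d = vergence_distance_m(IPD_M, 3.6)
print(round(d, 3), "m ->", round(lens_diopters(d), 2), "diopters")
```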

Karl Guttag


  1. Hi Karl, great article. One question here: as we all know, AR has three main technology paths the industry has been exploring in the last decade: waveguide, birdbath, and freeform combiners. I wonder if you will describe each main technical path’s latest progress, including its latest status and existing challenges? By organizing information by technical path and its latest art, it would be easier for readers to obtain an up-to-date panoramic view of AR technology.

    i.e., is the freeform combiner dead already, or is it quietly being developed under the surface?


    • Optical combiners can be categorized as waveguide and free-space. Waveguides took off because that is the best way towards glasses-like aesthetics – flat and thin. Unfortunately, with waveguides, at least currently, it is impossible to transmit, for example, several focal planes – thus these remain single-focal-plane devices prone to vergence-accommodation conflict. Solutions are not elegant – Magic Leap had to stack two sets of waveguides to extract 2 focal planes. It might change in the future (not near) with adaptive metasurfaces and other bold things – but not likely within 10 years (realistically – more).

      As for free-space combiners – whether freeform, birdbath, or a flat beamsplitter with a refractive lens – all work. With these you can support mechanisms for mitigating vergence-accommodation conflict – whether it is a type of holography, multi-focality, or lightfield. BUT – these will never provide sleek aesthetics – they occupy volume and will be bulky. The difference between freeform and birdbath is just in the projection – either it is on-axis or off-axis. On-axis is good for image quality – low distortions, great control. Off-axis (freeform combiners) means distortions and degrading image quality away from the center. To avoid this – you have to use “weak” free-forms – with low optical power – meaning that you need a rather large image source to achieve a good field of view – so these classical free-forms tend to be very large, but sometimes this may prove worthwhile. Some improvements can be expected with holographic optical elements used as optical combiners – but still – they are quite plagued by multiple problems. Birdbath is a straightforward approach – it produces good image quality (but sacrifices image brightness/see-through ability) and – without mitigation via coatings – emits the image outwards (bright visor). You can spin everything around – but you will always have to sacrifice something… either VAC mitigation, bulkiness, image quality, or brightness… it is always something.

    • Yes, at least to a degree, and I forgot to mention the concept of blurring based on the vergence. As they point out, it is more complex than it would seem to fool the eye well.

  2. This would definitely be curious because at Connect 2021 Carmack seemed to say that their eye-tracking doesn’t perform well enough across wide populations for this and it would be something to consider after they’ve released their first headset with eye-tracking (and have collected a ton of data on much larger populations).

    I do wonder how tolerable eye-tracking errors for this kind of thing will be and what percent of the population it needs to “just work” for in order for companies like Facebook/Apple to justify its inclusion.

    And although this seems a little absurd to me as someone that can’t be bothered to use VR very much due to what seems to be VAC related eyestrain, I’ve also been seeing more skepticism lately about just how much value addressing VAC actually provides and that we should just tolerate the fixed focus.

    • Thanks, that video is excellent and very timely. I’m working on a follow-up article discussing Half Dome 3.
