Snap Spectacles at CES, AR/VR/MR, & AWE 2025, Consumer Product in 2026

Introduction

Snap has been making the rounds in 2025 in preparation for the launch of its consumer AR glasses in 2026. Snap introduced the Snap Spectacles 5 (I will abbreviate as S5) developer kit model in September 2024.

While I was invited to Snap’s unveiling of the S5 in September, I was unable to attend. This blog recorded a roundtable discussion of the S5 roughly a month after its introduction (see: Snap Spectacles 5 and Meta Orion Roundtable Video Part 1).

At CES in January 2025, Snap not only allowed me to try on the S5, but they also let me take pictures through the optics and supported me by uploading test patterns I sent them. Snap was remarkably open compared to the other large companies that typically show nearly scripted demos, primarily to select media people with little technical knowledge of AR displays.

At SPIE’s AR/VR/MR conference, Snap presented more information about the S5, and I will include information from that presentation in this article.

At AWE 2025, they took over the indoor balcony that was the only pathway to the Main Stage. It was not so much a booth as a “land” with multiple booths, including a demo room (see the set of pictures above right). Everything from the escalators to posts to guard rails was wrapped in Snap yellow. The demo, which consisted of a game that utilized hand tracking, was open for anyone to try.

Attending and Speaking at AR/VR Connect in Eindhoven, Netherlands, September 24th – 25th (and Discount Code)

In a little over a month, I will be speaking at the combined ARVR and MicroLED Connect conference in Eindhoven, Netherlands, from September 24th to 25th. If you plan to attend and would like to meet with me, please send an email to meet@kgontech.com.

The Conference is offering my readers a €150 discount if they use the discount code KarlARVR. This discount code is valid for both the “Virtual Pass” and “Hybrid Pass” for one year, applicable to both virtual and physical conferences. The discount code (or mentioning this blog) is also good for a 10% discount on exhibition packages.

Background On Snap’s AR Acquisitions

I’ve been writing about Snap’s AR developments in several articles over the years. Before that, I wrote about WaveOptics, which was acquired by Snap in 2021. I was also very familiar with LCOS and MicroLED backplane developer Compound Photonics, which was bought by Snap in 2022 and which I have known about since its founding in 2007. Snap also hired some key people, including Daniel Wagner, former CTO of AR helmet maker Daqri, who joined Snap after Daqri folded in 2019. I was very familiar with Daqri’s headset and had met with the company through the years. In an additional Daqri connection, David Hayes, former COO of Daqri, was CEO of WaveOptics from 2017 until Snap acquired it.

WaveOptics

In 2016, I first met with WaveOptics co-founder and CEO Sumanta Talukdar. In 2019, I met with WaveOptics’ then-CEO, David Hayes, at WaveOptics’ UK headquarters. Also in 2019, I acquired a DLP-based WaveOptics development kit. I had also written about Raontech’s use of WaveOptics’ waveguides at CES 2018 and CES 2019.

While most other diffractive waveguides use three 1-D gratings (entrance, expansion, and exit), WaveOptics uses a two-grating structure (an entrance grating plus a combined 2-D expansion/exit grating), which results in a distinct triangular pattern in its eye glow (right). This pattern led to my writing Exclusive: Snap Spectacles Appears to Be Using WaveOptics and an LCOS Display in May 2021 (the display later turned out to be a DLP). About one hour after I published that article, Snap announced that it had bought WaveOptics.

It is also interesting that 3-D-printed prescription lens maker Luxexcel was bought by Meta in late 2022, after Snap’s acquisitions of WaveOptics and Compound Photonics.

Meta Orion Using Snap/WaveOptics Type Diffractive Waveguide

While WaveOptics’/Snap’s 2-D expansion structure used to be unique and distinctive, it appears that other companies are now using similar diffractive waveguide structures. I wrote about Meta’s Orion using a 2-D expanding waveguide in Meta Orion AR (Pt. 2 Orion vs Wave Optics/Snap and Magic Leap Waveguides) and Meta Orion AR (Pt. 2B Corrections On Two-Sided Waveguides), both in October 2024 (see the articles and figures below).

Compound Photonics (CP) – LCOS and MicroLEDs

I have a longer history with Compound Photonics (CP) than this blog has existed. CP bought the LCOS design and manufacturing assets of Syntax-Brillian (formerly Three-Five Brillian), and Three-Five had previously bought the LCOS assets of Colorado Microdisplay (CMD). CMD was founded in 1996 to make LCOS-based near-eye display headsets and had just changed its name to Zight in 2002 before being acquired by Three-Five. All this history is important because the only volume product I know of CP ever shipping was one of CMD’s LCOS devices (which Brillian had acquired and continued to make).

CP was founded to make a radically different product: large-screen projectors in which laser-illuminated LCOS optically addressed compound photonic crystal devices (thus the name Compound Photonics – right), avoiding the etendue issues of arc-lamp projectors.

CP then pivoted to focus on LCOS devices, developing both CMOS backplanes and LCOS assembly (combining the CMOS wafers with cover glass and liquid crystal), as well as making LCOS projector engines.

Snap Owns the Optical Path from Display to Waveguide

CP was sampling but never went to volume manufacturing. They produced one of the best LCOS devices, with a 3-micron pixel, good contrast, and a high field-sequential rate; it was used in Lumus’s 2K by 2K Maximus prototype glasses (see: Exclusive: Lumus Maximus 2K x 2K Per Eye, >3000 Nits, 50° FOV with Through-the-Optics Pictures).

CP was also using its CMOS backplane and assembly expertise to make a MicroLED CMOS backplane and develop MicroLED assembly with Plessey’s MicroLED wafers before Meta bought Plessey’s MicroLED capacity. Some industry insiders believe CP’s MicroLED assembly capability drove Snap’s acquisition of CP as much as its LCOS did. See also: Exclusive: Snap Buying Compound Photonics (LCOS and MicroLED)

The CP acquisition gave Snap control over its LCOS, including LCOS assembly and projection optics, as well as access to MicroLED microdisplay CMOS backplanes and MicroLED assembly (flip-chipping and bonding MicroLEDs onto the CMOS control backplane). WaveOptics gives them waveguides. I have speculated that at least part of the reason Snap bought WaveOptics and CP was defensive, to keep them from being bought by other companies, particularly Meta. Snap wouldn’t want to develop an AR headset product knowing that companies making key components could be bought out from under them. Note also that in the same time frame, Google bought MicroLED developer Raxium. As far as I am aware, Snap is the only AR company to own the whole optical path from display to engine to waveguide.

Spectacles 5 (S5) is a Developer Device, Not A Consumer Product

I want to emphasize that the S5 is not a consumer product; it is a device for developers. This might explain why the design appears somewhat dated and bulky. In many ways, the S5’s optical path seems to be based on the 2021 WaveOptics design, and the design focus of the S5 seemed to be more on other hardware features, such as tracking/SLAM. Snap’s eventual consumer product in 2026 will likely benefit from five years of development and could be completely redesigned.

I had a brief, amicable discussion at AWE with Scott Myers, Snap’s VP of Hardware Engineering, where he (to be expected) politely declined to share any information on how the consumer device would be different from the S5. His response to my questions on the waveguide and the LCOS device was that “Snap was very pleased with how its acquisition of WaveOptics and Compound Photonics was progressing.”

Snap Spectacles 5 – Old Design but With LCOS

The 2021 Snap Spectacles 4 look like Snap took a circa-2019 WaveOptics development kit using DLP displays and rotated the waveguides 90 degrees (see: https://kguttag.com/2021/05/21/exclusive-snap-spectacles-appears-to-be-using-waveoptics-and-an-lcos-display/). Before Snap acquired them, WaveOptics was openly saying that they were transitioning from DLP to LCOS as their main display device to improve resolution, reduce cost, and lower power. It took until the S5 for Snap to make the switch to LCOS as the display. Compare the WaveOptics development kit (above right) to the S5 (below right).

As I discussed in Disparity Correction Has Become Important All of a Sudden – Plus some on Binocular Color Difference (Snap & JBD), rotating the waveguides by 90 degrees improves the binocular color uniformity somewhat, as the color variation with diffractive waveguides tends to get worse across the waveguide.

A mechanical issue with the current rotated WaveOptics-type waveguide is the distance from the entrance grating to the combined expansion and exit grating (see left). This is likely not an isolated issue, as Meta has demonstrated by using a similar 2-D expander in their Orion (lower left). The current S5 design results in very wide glasses (right).

The temples on the S5 are also extremely large compared to most other AR glasses I have seen to date. They seem to be hiding a rather large light engine for the LCOS device. In looking at Snap’s recent patent applications, I found the recently filed US 20250237877 with a diagram of an LCOS light engine (right). The configuration of this engine, with its red/blue and green LED combiner and a polarizing beam splitter, looks similar to LCOS engine designs I saw nearly 20 years ago. I don’t know if this design is in the S5, but something like it would help explain the larger temples (and I didn’t see any smaller designs in a quick search of Snap’s patent applications). I recorded a video (see here) after AWE 2022 discussing some more innovative/smaller LCOS engines.

Spectacles 5 Bulky Design

Between the waveguide’s long distance from entrance to exit grating and what I think is a large LCOS projector engine, the S5 is forced into an overall bulky-looking form factor (below). I have annotated in red some of the key elements I’ve identified in the design.

I’m not certain of my labeling, as it is based on a 3-D model and not a teardown. Compounding my uncertainty, a slide presented by Snap at AR/VR/MR 2025 (right) has a yellow dotted line indicating the waveguide is in a different location. Still, that location does not make sense to me, as the waveguide is usually the layer that directly connects to the display engine output, so I think it is an error on Snap’s slide. In my labeling below, I tried to identify the push and pull lenses, the dimmer, and the waveguide; the order I labeled them in is the only one that makes sense to me.

Cameras, 6DOF, and Reprojection to Reduce LCOS Field Sequential Color Breakup

The S5 has four cameras. The top two, on the left and right, can be used for picture-taking, including making stereo images (I would assume). Two cameras face down at an angle, likely dedicated to gesture recognition and 6DOF.

As seen in the slide taken from Snap’s AR/VR/MR 2025 presentation (right), the four cameras, combined with IMU information, are used for 6DOF tracking.

Something else I glean from Snap’s slide is that they are likely using motion detection and prediction to “reproject” the displayed image at the 360 Hz LCOS color-field rate. This is a late-stage re-rendering/warping that keeps the color fields, which occur sequentially in time over 1/60th of a second, aligned as the head moves. Re-rendering at the color-field rate is known to significantly reduce the perceived field-sequential color breakup in AR headsets, and many companies using field-sequential color have employed it.
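To illustrate the general technique (this is my sketch, not Snap’s actual pipeline), the code below warps each of the six color fields of a 60 Hz frame with the latest predicted head pose just before it goes to the panel. The pose-prediction and warp functions are simplified stand-ins; a real system would apply a full 3-D reprojection rather than a horizontal pixel shift.

```python
import numpy as np

FRAME_RATE = 60         # rendered frames per second
FIELDS_PER_FRAME = 6    # e.g., R,G,B,R,G,B per frame -> 360 Hz field rate
FIELD_PERIOD = 1.0 / (FRAME_RATE * FIELDS_PER_FRAME)
PIXELS_PER_DEGREE = 37  # the S5's angular resolution

def predicted_yaw_deg(t):
    """Stand-in for IMU/6DOF pose prediction: head yaw (degrees) at time t.
    Assumes a constant 30 deg/s head turn purely for illustration."""
    return 30.0 * t

def reproject(field, yaw_error_deg):
    """Shift one color field horizontally to compensate for head rotation
    since the frame was rendered (a 1-D stand-in for a full 3-D warp)."""
    shift_px = int(round(yaw_error_deg * PIXELS_PER_DEGREE))
    return np.roll(field, -shift_px, axis=1)

def send_frame(rgb_frame, t_render):
    """Emit the six color fields of one frame, each warped with the
    freshest pose prediction just before it is displayed."""
    fields = [rgb_frame[..., c] for c in (0, 1, 2)] * 2  # R,G,B,R,G,B
    for i, field in enumerate(fields):
        t_field = t_render + i * FIELD_PERIOD
        yaw_error = predicted_yaw_deg(t_field) - predicted_yaw_deg(t_render)
        corrected = reproject(field, yaw_error)
        # ...send `corrected` to the LCOS panel here...
```

Without this per-field correction, the R, G, and B fields of a moving image land at slightly different places on the retina, which is exactly the color-breakup artifact described above.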

Push-Pull Lenses

Snap said at AR/VR/MR 2025 (left) that the S5 has push-pull lenses. Typically, waveguides output light focused at infinity, and the pull lens (between the eye and the waveguide) moves the focus from infinity to about 2 meters using about a -1/2 diopter correction. The push lens (on the world side of the waveguide) is there to cancel out the effect of the pull lens on the real-world view, so the real world is not distorted or out of focus.

More integrated AR glasses are incorporating prescriptions into the “pull” lens so it corrects for both the waveguide’s virtual image and the real-world view. Currently, the S5 requires prescription inserts for those who need vision correction.

While putting push-pull lenses around the waveguide addresses the basic Vergence Accommodation Conflict (VAC – see: Vergence Accommodation Conflict or Variable Focus) associated with waveguides outputting light focused at infinity, because the correction is fixed, it does not address the dynamic issue of virtual objects at different distances in 3-D space. The biggest common case it does not address is when the user is working on something with their hands, where it is desirable for the focus/accommodation distance to appear to be about half a meter (about one and a half feet).
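For those who want to check the numbers above, this is simple thin-lens arithmetic (nothing specific to Snap’s optics): the power in diopters needed to refocus collimated light is the negative reciprocal of the desired focus distance in meters.

```python
def lens_power_diopters(focus_distance_m):
    """Thin-lens power needed to refocus collimated (infinity-focused)
    waveguide output so it appears at focus_distance_m."""
    return -1.0 / focus_distance_m

print(lens_power_diopters(2.0))  # -0.5 D: the S5's fixed pull-lens correction
print(lens_power_diopters(0.5))  # -2.0 D: what close-up hand work would want
```

The gap between the fixed -0.5 diopter correction and the roughly -2 diopters that half-meter hand work would want is the residual VAC problem described above.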

Skull Grabber Temple Tips

The ends of the temples (temple tips) of the S5 have what I have dubbed “skull grabbers.” The human skull is moderately round and smooth. There is no place for the temple tips to gain purchase (grip) unless they are extended long enough to start wrapping around the occipital part of the skull, which bulges out a bit. When this is done, companies often put the batteries in the skull grabbers to better balance the weight of the glasses. In my experience, skull-grabbing temples are only moderately effective at keeping AR glasses comfortably secure without transferring too much weight to the nose. The advantage over an open helmet/headband approach is that the skull-grabber design allows the glasses to be put on from the front, is compatible with more hairstyles, and still looks similar to a traditional glasses form factor (see: Skull-Gripping “Glasses” vs. Headband or Open Helmet).

Global Dimming

The S5 has global dimming to support use in outdoor daylight (see right, from Snap slide). I don’t know whether they are using a polarizing or a non-polarizing dimming technology (I forgot to check — I would hope non-polarizing). Polarizing dimming typically blocks 60% or more of the light in the most transmissive state and causes problems when looking at LCD-based TVs and monitors. Non-polarizing dimming can block as little as 20% of light in its most transmissive state.

Light Capture Artifacts

A common issue with waveguides is capturing light from the outside world and directing it into the eyes. This issue with diffractive waveguides is commonly called a “rainbow effect.” Meta made eliminating rainbow artifacts a major goal for their Orion prototype (see: Bosworth says “Nearly Artifact Free” and with Low “Rainbow” capture).

Forward Projection (Eye Glow)

WaveOptics waveguides have always had substantial forward projection, commonly referred to as “eye glow,” with a distinct pattern of a triangular expansion area next to the rectangular exit grating. I should note that Meta’s Orion, which uses a similar waveguide structure, also has significant eye glow (see: Meta Orion (Pt. 3 Response to Meta CTO on Eye Glow and Transparency)).

Eye glow is seen as a social problem. Several diffractive waveguide companies, including Digilens, Vuzix, and Dispelix, have designed waveguides to reduce front projection and direct the projected light downward, where most people won’t see it. But all of the companies controlling front projection use three 1-D expanding gratings, whereas WaveOptics/Snap and Meta’s Orion use 2-D expanding diffraction gratings. It makes me wonder whether it is possible to control front projection with this type of waveguide architecture.

Resolution, FOV, and Pixels Per Degree (PPD)

The S5 has 1008 horizontal x 1398 vertical visible pixels (yes, taller than wide), or about a 3:4 aspect ratio. Likely, there are about 200 pixels held in reserve horizontally for dealing with IPD and otherwise positioning the image relative to the eye. It has a 46-degree FOV and supports 37 pixels per degree (PPD). I think it would be better for the aspect ratio to be 4:3 (wider than tall).
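The quoted 37 PPD is consistent with the pixel counts if the 46-degree FOV is measured diagonally (my assumption; the axis was not specified):

```python
import math

h_px, v_px = 1008, 1398  # S5 visible pixels per eye
fov_deg = 46             # assuming the 46-degree FOV is the diagonal

diag_px = math.hypot(h_px, v_px)  # ~1723 pixels along the diagonal
print(f"{diag_px:.0f} px / {fov_deg} deg = {diag_px / fov_deg:.1f} PPD")
```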

The PPD, I think, is nearly ideal for an AR headset. More than 40 PPD has rapidly diminishing returns, while much less than 40 PPD makes text chunky and slower to read, and requires blocking too much of the field of view for a given amount of text; the S5’s 37 PPD is very near the sweet spot. My one concern, as will be discussed in a later section, is that the current LCOS thins out white/light text.

FOV Impact on Glasses Form Factor

I have found that a 35-degree FOV is pretty much the limit for something approximating ordinary eyeglasses with available technology. Once a design crosses the 35-degree threshold, it starts a design spiral of added complexity. A bigger FOV needs more total light to fill the larger area and more pixels to maintain reasonable angular resolution. More pixels mean more processing and data bandwidth. These bigger-FOV headsets then want 6DOF tracking and SLAM, with more cameras and sensors. Then comes the need for more battery to power everything, and even bigger problems dissipating heat in a form factor that lacks much surface area to radiate it.

Snap Spectacles 5 Image Quality

Very Good Blacks, Maybe Too Good

The S5 has very good black (=transparent) levels. I saw no discernible “picture frame” effect that some associate with AR glasses using LCOS microdisplays. Lines and text are crisp and clear, such that the full resolution of the virtual image can be seen. I did notice a bias toward black/clear: as will be shown in a bit, white lines/pixels come out a little thinner than black ones.

Waveguide Color Shifting Across the FOV

In the center ~50% of the image, the colors look reasonably good. Still, the waveguides exhibit a considerable color shift on the far left and right sides, with the side nearest the entrance grating appearing cyan (lacking red) and the far side having a green tint (lacking red and blue). I doubt Snap is doing anything yet to pre-correct the image for waveguide non-uniformity, but they might in the future. For more on waveguide pre-correction, see: Jade Bird Display’s MicroLED Compensation, which discusses Jade Bird’s compensation for waveguide variation as well as their MicroLEDs.
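For reference, pre-correction of the kind discussed in the Jade Bird article amounts to multiplying the source image by the inverse of a measured per-channel transmission map. A minimal sketch (the function and calibration data below are hypothetical, for illustration only):

```python
import numpy as np

def precorrect(image, transmission):
    """Compensate an image for measured waveguide color non-uniformity.

    image:        (H, W, 3) floats in [0, 1], the intended image.
    transmission: (H, W, 3) floats in (0, 1], per-channel waveguide
                  efficiency across the FOV (calibration data).
    """
    gain = 1.0 / np.clip(transmission, 0.05, None)  # avoid near-zero division
    gain /= gain.max()  # a display can only dim pixels, not exceed full drive
    return np.clip(image * gain, 0.0, 1.0)
```

Note the trade-off: normalizing by the largest gain means overall brightness drops to what the dimmest channel/region can support, which is one reason compensation is not free.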

How Good of an Image is Required?

The S5’s overall image quality is much better than that of most AR headsets I have seen. We also have to consider that in the intended use of putting virtual images in the real world, people will not be able to discern much about absolute image quality. As I often say, “With Optical AR, what you see if the display is off is your black.” Contrast and subtle color differences will be lost against a real-world background.

What will be noticeable with AR glasses is a significant picture frame effect around the display area, which the S5 does not have. The waveguides’ color shift with the S5 is a bit of a concern to me. In user interfaces, typically green = good, red = bad, and yellow = caution. On the S5 waveguides, red is significantly reduced on the two sides, making red hard to see and making an intended yellow color appear green.

Through the Waveguide Pictures

First, I want to point out that the S5 has a 46-degree FOV, and I have yet to see a diffractive waveguide that does not have color uniformity issues at that width. Many of the recent products, particularly those aimed at AI glasses with a display, have FOVs of 35 degrees or less. The color uniformity of most diffractive waveguides seems to fall off dramatically past about 25 to 30 degrees.

The pictures below were taken through the S5’s waveguides against a black background. On the left is the view through the left waveguide, in the middle is the right waveguide’s view, and on the right, the left and right were 50/50 merged in Photoshop. I used the same method I used in the Jade Bird compensation article linked to above (see that article for why “fusion” both works and doesn’t work with human vision). Because both the left and right sides of the waveguide are red deficient, the simple fusion does not improve the color uniformity much.
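The 50/50 merge is just a per-pixel average of the two aligned photographs; the equivalent of the Photoshop step in a few lines (the file names are placeholders):

```python
from PIL import Image

# Placeholder file names; the two photos must be aligned and the same size
left = Image.open("s5_left_waveguide.jpg")
right = Image.open("s5_right_waveguide.jpg")

fused = Image.blend(left, right, alpha=0.5)  # 50/50 per-pixel average
fused.save("s5_fused.jpg")
```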

Below is a close-in crop from the center of the image for both white text on a black background (left) and black text on a white background (right). Something I immediately noticed with my eyes was that the black text looked bolder/wider than it should have (I have been using these test patterns for nearly 20 years, so my eye is “trained” on them). If you look at the left (white text on black) image, you will see that the lines are thinner and some of the dots on the letter “i” (inside red circles) are missing. In the black-on-white image (right), the text looks bolder/wider, and there is almost no gap between the black lines. I noticed this issue without having to look at anything blown up (i.e., it was visible to the naked eye).

Interestingly, I had in my archives pictures I took of the Lumus Maximus in 2021, which was using Compound Photonics’ LCOS before Snap bought the company (see: Exclusive: Lumus Maximus 2K x 2K Per Eye, >3000 Nits, 50° FOV with Through-the-Optics Pictures). The Maximus used a higher-resolution LCOS panel with only a slightly larger FOV, so its pixels are smaller (47 PPD versus the S5’s 37 PPD). The LCOS panel in the Maximus has very little black-versus-white thickness difference; if anything, the white lines are thicker. I will do a deeper dive into the likely cause of the thicker black lines/pixels in the Appendix: LCOS Lateral Fields and Tailing.

Below, the test pattern has been reversed so that it has black text on a white background. In this case, there is no black backdrop like there was for the earlier pictures, so you can see through into the hotel room where this was shot. Note that human vision has a much wider dynamic range than the camera, so the real world would appear brighter to the human eye than in this picture.

Why the Thinning of White Text May be an Issue

With an AR display, most would advise using light text on a clear background to avoid blocking too much of the real world. At 37 PPD, the S5 should be able to put up reasonably small but readable text (without blocking too much of the real world) with one-pixel-wide strokes and dots. But with the LCOS’s bias toward thinning the white, small text will be thinner and dimmer, perhaps too dim to be readable.

The LCOS’s bias toward black can also cause dark areas to be slightly darker than they should be. It’s hard to compensate for this without analyzing the image and adjusting based on the surrounding content, but this is, if anything, a very minor issue.

Conclusions

While Meta, Apple, and Google are much bigger than Snap and likely spending much more, AR glasses are closer to being a core product for Snap. Snap seems more focused on building a product rather than just R&D concept devices. Snap is also refreshingly open in working with both developers and the media, and they seem very developer-friendly. The S5 is part of their evolution toward what they hope will be a volume consumer device.

The overall image quality is good, particularly the black/clear levels, but it is hurt by the waveguides’ color non-uniformity. I don’t think any optical AR headset is going to be good for watching movies, so, arguably, the image quality is better than it needs to be. As discussed above, I’m a little concerned as a practical matter about how much the LCOS thins white/bright text, but this is something Snap could likely address in the LC formulation and LC alignment.

The 46-degree FOV and feature set seem to suggest that the eventual consumer device is going to be more of a headset than glasses. While there is much that Snap could do to slim down the Spectacles 5, I don’t see it being anything near the form factor of ordinary eyeglasses. So it is in a different category than the numerous AI/AR glasses either in or soon to be on the market. Clearly, the Spectacles are not an “all-day wearable” type of product.

The question becomes: how big is the potential market for a Spectacles-like product? In many ways, it seems to be going after the market the Magic Leap One sought in 2018. At least in 2025 and 2026, there seems to be a wave of more stripped-down AR/AI glasses with FOVs of 35 degrees or less, trying to add just a basic display to existing audio-only AI glasses, such as the Meta Ray-Ban Wayfarers.

Appendix: LCOS Lateral Fields and Tailing

Not to make too big a deal out of this, but it appears that Snap is using a significantly different liquid crystal in 2025 than Compound Photonics was using in 2021. Many variables in the LC design could cause the difference in black versus white width, including the LC formula, the LC alignment, and the cell gap between the glass and mirrors.

Shown below is a diagram illustrating how electric field lines run between the ITO-coated (electrically conductive) glass cover and the individual mirror electrodes that form the pixels. The cell gap (mirror to glass), mirror width, and spacing between mirrors are very roughly to scale for a typical small-pixel LCOS device.

The LC behaves best when the electric field is on (white), with straight field lines between the mirror and the glass/ITO, or when there is no field (likely black). But as shown, when an “on” pixel is next to an “off” pixel, some of the field lines, rather than going to the glass, find a shorter path and curve over to the adjacent “off” pixel. This phenomenon is known as “lateral fields.” The LC does not behave well in the regions with lateral fields, and different LC formulas with different LC alignments behave differently. In the case of the S5, it appears these lateral-field areas become darkened. Consider that a single-pixel dot that is on or off will have four adjacent pixels of the opposite state, all causing lateral fields.
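As a toy illustration (not a physical simulation), the code below flags every pixel of a binary on/off image that borders an opposite-state pixel, which is where lateral fields arise. Note that every pixel of a one-pixel-wide stroke gets flagged, which is why thin text suffers the most:

```python
import numpy as np

def lateral_field_zones(pixels):
    """Return a mask of pixels with at least one 4-neighbor in the
    opposite state, i.e., pixel edges that will see lateral fields.
    (np.roll wraps at the image borders; fine for a toy example.)"""
    p = pixels.astype(bool)
    affected = np.zeros_like(p)
    for axis, shift in [(0, 1), (0, -1), (1, 1), (1, -1)]:
        affected |= (p != np.roll(p, shift, axis=axis))
    return affected

# A one-pixel-wide white stroke on black: the whole stroke is affected.
img = np.zeros((5, 5), dtype=int)
img[2, :] = 1
print(lateral_field_zones(img).astype(int))
```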

On the right is a microscopic picture of an LCOS device that I worked on at Syndiant in 2010. This device had a cell gap of about 1.2 microns, a mirror (pixel) pitch of about 5.4 microns, and a space between mirrors of about 0.4 microns. I expect that Snap’s LCOS has scaled all of these down by about the same factor, so most things are in about the same ratio. In this picture, there is black text on a white background, and you can see how the black pixels B and C encroach on the white pixels A and D. How pixels behave is also a function of how the LC is “aligned” relative to the lateral field.

I have marked some of the pixels with the numbers 1 through 6, where you can see a light line in the black pixel. You may notice that these lines are a function of the surrounding pixels, where the lateral fields act on more than one side. I think these are “reverse tilt” lines, where the LC is leaning in the wrong direction due to the strong lateral field. There are similar issues with the white pixels surrounded by black, but the lines are not visible. Depending on many factors, these pixels may (or may not) have a hard time returning to their original state when the fields change, getting temporarily stuck. This problem is known as “tailing” because when something moves on the screen, it may leave a tail that fades out.

Karl Guttag
