Physical Address
304 North Cardinal St.
Dorchester Center, MA 02124

[2025-03-09 – Added MITRE’s laser scanning waveguide to the list of laser scanning companies]
With CES and SPIE’s AR/VR/MR conferences, January is always a very busy month for me. This year, I either met with or saw presentations from over 60 companies. I discovered enough to write articles for more than a year. So I have to pick and choose what I find most interesting or what will have a big impact. Many of my articles on a single technology can take me weeks to develop, including studying the technology, taking and editing pictures, and writing the article.
I started to summarize all the companies in one article, but with about 50 to write about, it becomes unwieldy to write even short summaries for each. So I decided to break them into groups. This first group will be the giant companies Google (Android XR) and Meta, both of which participated in AR/VR/MR 2026. Toward the end of this article, I will briefly list the companies I saw in January and hope to discuss in future articles.
One of the reasons I wanted to do the YouTube series “Display Skeptics” with Radu Reit was to have an outlet to discuss new findings more quickly. Our videos are available on Radu’s Display Training Center. We offer free short highlight videos and longer, more in-depth videos available via Patreon for $15 per video or $20/month for all content, including Radu’s Display Training Center teardowns. Radu and I are putting out a news video about once a month, plus at least one long technical discussion. The video format enables me to get information out sooner than I can get to it on this blog, and, believe it or not, often in a more opinionated way.
My partner on Display Skeptics, Radu Reit’s Display Training Center, has started releasing detailed teardown reports, including photographs, component identification, and cost analysis on AR glasses. His first report is on the recent Rokid Green MicroLED glasses (https://lnkd.in/gC7KABbX). Radu has agreed to offer my followers a 10% discount with the code KARL10.

Display Skeptics, Radu Reit and Karl Guttag, will have an on-stage debate and open forum at AWE (June 15-18, 2026) on Thursday, June 18, at 11:20. We will discuss hardware, software, and the overall social implications of AR and AI glasses developments.
I’m hosting a panel and conducting a Master Class at MicroLED and AR/VR Connect in Eindhoven, the Netherlands, on September 15-17, 2026. The first day, September 15, is devoted to Master Classes and tours of some local labs. The conference and exhibition proper runs on September 16-17, and I’m speaking at 5:20 PM on the 16th.
The conference is offering my readers a €150 discount with the code KarlARVR. The code is valid for both the “Virtual Pass” and the “Hybrid Pass,” so the conference can be attended remotely or in person (this blog receives remuneration for the use of this code).

At AR/VR/MR 2026, Hugo Swart, Android XR Ecosystem Lead, presented “Android XR: the future is bright!” He began by discussing Google’s Gemini AI assistant, then covered Google’s XR collaboration with Samsung and Qualcomm. He also mentioned that they are working with sensor, display, battery, chip, camera, and memory companies, without naming any. Swart called Samsung’s Galaxy XR “the powerhouse of our ecosystem.” He asked those who had tried the Galaxy XR to raise their hands, said, “Only a few?” and then implored the audience to try it (live questions are always risky). The Galaxy XR is a VR headset with camera passthrough and is very similar to the Apple Vision Pro (AVP – see my many articles on the AVP) in both look and function, but at about half the price at $1,799 (and without the AVP’s silly googly eyes on the front).

Swart stated that the same Android XR software will work across a range of products, from XR (camera passthrough) headsets, to wired XR glasses (e.g., Xreal Aura), to wireless XR glasses (similar to the Meta Ray-Ban Display but in development by Google and partners), to AI glasses (audio and cameras but no display).

He then briefly discussed AI (Google’s Gemini) glasses, with and without a display. Quoting Swart (with my bold emphasis), “At the other end of the spectrum, we have AI glasses designed for all-day wear. And we are working on two distinct types of glasses. One is an AI glasses that have cameras, microphones, and speakers. Also working with display AI glasses, which have a smaller FOV, waveguide, and are often monocular, for glanceable content.” In short, it sounds like Google and its partners (particularly Samsung) will be doing the same thing Meta has been doing with its Ray-Ban AI glasses and the Meta Ray-Ban Display. Samsung has confirmed that it is working on AI (no-display) glasses that will appear in 2026, but has not confirmed whether there will be a version with a display. My understanding is that the Raxium MicroLED-based AI display glasses shown at Google I/O (see: Google XR Glasses Using Google’s Raxium MicroLEDs While Waveguide Lab Sold to Vuzix) are still just an R&D effort and many years away from being the basis of a product.
Swart spent nearly half the presentation discussing the joint development of wired XR glasses, Project Aura, with Xreal and its associated optics, which surprised me. In this configuration, the glasses have microphones, speakers, an IMU, and visual and hand-tracking cameras, with a separate computer or a special Android XR compute puck (essentially a smartphone without a display) providing all the processing, display generation, communications, and power/battery. This general concept has been around for several years: Sightful’s Spacetop used Xreal glasses with both a custom display-less model and, later, a conventional laptop via USB-C, and the Xreal Beam Pro compute puck (which runs Android with a built-in display and is essentially a phone-less smartphone) has been available since 2024. Apparently, the Aura compute puck will have more processing power and be better integrated with Google Gemini; without a display of its own, it has more processing and battery capacity available for the glasses.

Swart discussed the Aura being Optical See-Through (OST), but it should be noted that the Aura will be about 20% transparent/80% light-blocking at its most transparent, with an LCD-based global dimmer. In terms of the optics (there may be different sensors, cameras, mics, and speakers), the glasses are at best slightly improved compared to the Xreal One Pro (for how the optics work, see: Xreal One Pro Optics and Its Connections to Ant-Reality and Google).
Swart spent considerable time on the Compound LightGuide (CLG). Swart said, “We came up with the idea of a hybrid optical system” (aka the Compound LightGuide). In fact, the optical design came from Google’s 2024 acquisition of Ant Reality (aka AntVR).


Swart described the CLG as a cross between birdbath and pancake optics. I think of it more as a variation on birdbath optics that uses solid optics with a total-internal-reflection bounce to make the optics thinner than a typical birdbath; both use a combination of polarizing mirrors, quarter waveplates, and non-polarizing curved semi-mirrors. It appears the relationship between Google and Xreal extends to sharing optics, even if they have different names for it.
Interestingly, this is the same optical design that Xreal introduced in their One Pro as a “Flat Prism” (see: Xreal One Pro Optics and Its Connections to Ant-Reality and Google, and the figure below). I assume there is some undisclosed licensing deal between Google and Xreal to share the optics.

Whatever the design is called, it has pros and cons relative to a waveguide.
I view the CLG as a much-improved birdbath, but not a competitor in the all-day-wearable glasses form factor (at least for a mass audience). The thickness of the optics means that while it may look like glasses from the front, it sticks out about 15mm further than a waveguide-based design.
As important to me, both birdbath and CLG designs inherently block more than 75% of real-world light and are like wearing moderately dark sunglasses. Waveguide designs, by contrast, typically block 10-20% of real-world light. While CLG-type optics might be important to a smaller company, I don’t understand why Google made such a big deal about it. Given the CLG’s issues, the potential market is not in the multiple tens of millions of units, which is not the scale that should interest a company of Google’s size. It might, however, be a nice-sized market for some of Google’s partners, such as Xreal.
Xreal’s birdbath and “flat prism” designs, I would tend to categorize as portable VR, where the “see-through” supports the VR, letting the user see their surroundings, albeit somewhat dimly. These glasses are often used with handheld video game systems (e.g., Steam Deck) and for portable movie watching. My point here is that it is a very different class of XR than, say, smart glasses.
Swart also showed a “Dual Engine CLG,” which is essentially Ant Reality’s Crossfire, as shown at AWE 2022. I discussed this concept in a video with Brad Lynch back in June 2022.


What puzzled me was why Google was making such a big deal about it at this time. It was kind of like, “Well, we have to talk about something technical and not just the marketing of Android XR partners.”
While I think Google talked too much about CLG optics, it had nothing to say about its MicroLED developments based on the 2022 acquisition of Raxium (see: Google XR Glasses Using Google’s Raxium MicroLEDs While Waveguide Lab Sold to Vuzix). Raxium’s MicroLEDs were reportedly used in the XR glasses shown at Google I/O 2025. Additionally, there are rumors that Google is working with Magic Leap on diffractive waveguide development. I don’t understand Google’s scattered technology acquisitions, including Raxium MicroLEDs and Ant Reality, while at the same time partnering with system companies like Samsung and Xreal.

Before I get to Jason Hartlove’s AR/VR/MR 2026 presentation, I want to show a slide he presented a couple of months earlier at the Bay Area SID in November 2025. This is almost the same chart I discussed in May 2025 (see: Meta’s and Google’s Roadmaps for LCOS versus MicroLED or LBS), but the new chart now includes possible dates on the time axis. Most notably, it shows MicroLED becoming mainstream in about 2034, and laser displays (Laser Beam Scanning or laser-illuminated LCOS) starting in about 2034 and not becoming mainstream until about 2043. While green-only MicroLED AR glasses are available today, full-color MicroLEDs always seem to be just a few more years away. Laser displays are pushed out into the distant future, starting maybe eight years from now and maturing over nearly 18 years (and out of most people’s scope of care; call me skeptical). I’m planning a panel discussion on the pros and cons of the various display technologies (LCOS, MicroLED, and laser) at MicroLED and AR/VR Connect 2026 in September in Eindhoven.
According to Hartlove, the Meta Ray-Ban Display (MRBD) “exceeded sales [expectations] by over 300%.” He mentioned the cost and battery life issues with the MRBD that need to be addressed in future products.
Hartlove’s AR/VR/MR 2026 presentation discussed how Meta Ray Ban AI glasses (camera and audio with no display) have sold millions of units (reportedly about 7 million) and how this is setting the stage for AR/AI glasses with a display. He went on to discuss how AI is enabling applications and that a display would improve the capabilities.


Next, he discussed the challenges of bringing display glasses to mass production, primarily the cost, which in part is driven by the complexity of the optical stack/eyepiece, what Hartlove calls “The Impossible Sandwich.”


The main thrust of his talk was what it would take to reach 100 million units a year, or less than 10% of the roughly 1.25 billion smartphones sold annually. Hartlove worked through the rough numbers to determine the glass wafer capacity needed per year for waveguides, and it came to about 6 times the semiconductor wafer volume processed by TSMC (below left) if smart glasses reach smartphone-type volumes. While waveguides are nowhere near as complex and difficult to produce as semiconductor devices, it still gives an idea of the task ahead.
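Hartlove’s slide did the wafer math for him; for readers who want to see how such an estimate hangs together, here is a back-of-the-envelope sketch. All of the parameter values below are my illustrative assumptions (picked to land near his ~6×-TSMC ballpark), not numbers from Meta’s slide:

```python
# Back-of-the-envelope estimate of waveguide wafer demand at smartphone-class
# volumes. All parameter values are illustrative assumptions, not Meta's numbers.

units_per_year = 100e6        # target: 100 million glasses per year
eyes_per_unit = 2             # one waveguide stack per eye
plates_per_waveguide = 2      # assumed glass plates (layers) per eyepiece
dies_per_wafer = 8            # assumed eyepieces patterned per glass wafer
yield_fraction = 0.5          # assumed end-to-end waveguide yield

plates_needed = units_per_year * eyes_per_unit * plates_per_waveguide
wafers_needed = plates_needed / (dies_per_wafer * yield_fraction)

tsmc_wafers_per_year = 16e6   # rough public estimate of TSMC's annual 12"-equivalent output
print(f"{wafers_needed / 1e6:.0f}M glass wafers/year, "
      f"~{wafers_needed / tsmc_wafers_per_year:.1f}x TSMC's semiconductor wafer volume")
```

Doubling the dies per wafer or the yield halves the requirement, which shows how sensitive the headline multiple is to manufacturing assumptions.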
He then discussed market elasticity versus price, using prescription (RX) models as a guide (below right) based on Meta’s experience with non-display AI glasses.


Meta’s models suggested a premium of $325 ± $80 over normal Rx glasses, or $200 over non-display AI glasses, leading to a display budget (optics, display device, and associated electronics) of $100 per eye. These numbers appear to be the price premium, not the cost premium. Worse yet, the $100 has to be split between the waveguide, display optics, display device, electronics, and the various other optics in The Impossible Sandwich (above). There is not much money for each component that would justify the R&D investment.
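The arithmetic behind the $100-per-eye figure is simple enough to spell out. The $325 and $200 premiums are from Hartlove’s slide; the component split at the end is purely my invented illustration of how little each layer of the stack would get:

```python
# Working back from the price-elasticity numbers to a per-eye display budget.
# The $325 and $200 premiums are from Hartlove's slide; the split below is an
# invented illustration, not Meta's breakdown.

premium_over_rx = 325            # tolerated premium over plain Rx glasses (+/- $80)
premium_over_ai_glasses = 200    # tolerated premium over non-display AI glasses
eyes = 2

display_budget_per_eye = premium_over_ai_glasses / eyes
print(f"Display budget: ${display_budget_per_eye:.0f} per eye")   # Display budget: $100 per eye

# Hypothetical split of that $100 across the "Impossible Sandwich" layers:
split = {"waveguide": 35, "display device": 30, "projection optics": 20, "electronics": 15}
assert sum(split.values()) == display_budget_per_eye
```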
There appears to be an implied assumption that the price premium will be added at cost, such that any profit will have to come from “secondary revenue streams”: selling software, services, and data. Maybe I missed something, but I am trying to make sense of the numbers presented and what would have to happen.
This led him to conclude that reaching 100 million units would require the display system (display device, waveguide, and electronics) to drop to $100 per eye, or what he dubbed 100/4/100 (100MU for $100).


I disagree with the premise of the “100/4/100” concept, as it puts the completely wrong emphasis on what is holding back AR glasses. For new high-tech products, usefulness is vastly more important than cost. If something is useful, some people will buy it at a high initial cost, and as volume increases, costs come down (the classic technology learning curve). If cost is put first as the primary driver, then you tend to get useless junk.
As I have stated many times, including in Apple Vision Pro (Part 1) – What Apple Got Right Compared to The Meta Quest Pro, I disagree with people saying that the Vision Pro costs too much. The problem with the Vision Pro is that it didn’t do what people needed. The Apple II, released in 1977 with a base price of $1,298, would cost over $5,000 in today’s money. At the time of the first iPhone in 2007, most cell phone makers focused on making smaller, less expensive phones to grow the market, and a phone had to easily fit in a small shirt pocket; it turned out that people found the iPhone to be so useful that they would spend more and accept a larger device. You are not going to make a great new technological product by chirping like a bird (cheap, cheap, cheap).

Hartlove then proposed display capabilities at the $100-per-eye price point. The minimum requirement is pretty close to the Meta Ray-Ban Display (other than the cost).
Supporting only ±4 diopters (with no astigmatism?) seems very limiting as the basic requirement. In the “premium” category, full Rx lenses are supported, but progressive lenses are not, except as a “Future Trend.” Presbyopia (the age-related loss of near focus) starts for most people at around age 40. I know from my own experience that it is a mess to wear AR glasses without progressive lenses while wearing contacts. It becomes a bit of a numbers game, with prescription issues cutting into the percentage of people who can use AR glasses. I hate to think that progressive lenses are as far off as solving VAC (vergence-accommodation conflict) or holography.
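To illustrate the “numbers game,” here is a toy calculation of how Rx restrictions multiply together to shrink the addressable audience. Every fraction below is an invented placeholder, not real optometry data; the point is only that the cuts compound:

```python
# Toy "numbers game": successive Rx restrictions multiply to shrink the share
# of people who can use the glasses. All fractions are invented placeholders,
# not real optometry statistics.

share_within_4_diopters = 0.85   # assumed: sphere correction within +/-4 D
share_ok_astigmatism = 0.60      # assumed: astigmatism mild enough to ignore
share_no_progressives = 0.55     # assumed: not needing progressive lenses

addressable = share_within_4_diopters * share_ok_astigmatism * share_no_progressives
print(f"Addressable share of wearers: {addressable:.0%}")   # Addressable share of wearers: 28%
```

Even with each individual restriction looking tolerable, the product quickly falls below a third of potential wearers.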
The “Ambient Contrast Ratio” (display nits versus ambient-light reflection nits) seems ambiguous: what constitutes ambient, and does this include dimming? Why not just specify brightness in nits? I also don’t think it makes much difference whether the color space is 85% or 124% of Rec. 709 when you are talking contrast ratios of less than 10:1.
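To show why the definition of “ambient” matters so much, here is a quick sketch of the commonly used ambient-contrast-ratio formula with assumed values (1,000 nits delivered to the eye through 85%-transmissive optics; all numbers illustrative):

```python
# Ambient contrast ratio (ACR) for see-through glasses, using the common
# definition ACR = (display + see-through background) / background.
# Display luminance and transmission are assumed values for illustration.

def acr(display_nits: float, ambient_nits: float, transmission: float) -> float:
    background = ambient_nits * transmission   # real world seen through the optics
    return (display_nits + background) / background

# 1,000 nits to the eye through 85%-transmissive optics:
for ambient_nits, label in [(100, "dim indoor"), (3000, "overcast outdoors"), (10000, "sunny outdoors")]:
    print(f"{label:17s} ACR = {acr(1000, ambient_nits, 0.85):.1f}:1")
```

The same display goes from comfortably readable indoors to nearly washed out in sunlight, which is why the spec is meaningless without stating the ambient level and whether a dimmer is assumed.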

Hartlove closed with a pitch for the AR Alliance. STMicroelectronics originally founded what has become the AR Alliance, then known as the LaSAR Alliance, in 2020 to promote laser-scanning AR. In 2024, the Alliance was broadened to include all of AR and rebranded as the AR Alliance, with Meta becoming a founding board member. In November 2025, the AR Alliance became part of SPIE. Barry Silverstein, a proponent of laser displays for AR who was formerly with Meta and is now Director of the Center for eXtended Reality (CXR) at the University of Rochester, held a session on the laser display working group within the AR Alliance.
Oguzhan Avci, an Optical Scientist at Meta, presented a PIC (Photonic Integrated Circuit) for the generation and routing of multiple sets of RGB laser beams on a chip. He also showed the laser-combining PIC integrated with a PZT (lead zirconate titanate, a type of piezoelectric) cantilever 1-D scanner. Unfortunately, there was no information about the overall system in this presentation. He did say that they had a working prototype with a resonant frequency between 18 and 36 kHz.


While he didn’t give a system example, this device would seem to be related to the “Coherent Lissajous Scanning” concept presented by Meta in “Advances in MEMS-based laser scanning displays for AR glasses” at AR/VR/MR 2025 (see figures below). The concept sweeps multiple parallel RGB laser sets to support higher resolution without impossibly high horizontal sweep rates. Based on my observation of the progress in laser scanning displays since 1998, I’m skeptical about the ability to correct for all the inevitable uniformity problems when multiple lasers are swept with either raster or Lissajous scanning. I’m still doubtful it will be practical by 2043 (the date on Hartlove’s chart), but anything could happen in the next 17 years. I would prefer to see technology that might happen in 5 years.
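The appeal of sweeping multiple parallel laser sets is easy to quantify: displayed lines per frame scale with the number of beams, so the fast-axis scanner rate stays feasible. A small sketch with my illustrative numbers (a 24 kHz fast axis sits within the 18-36 kHz resonance range Avci reported):

```python
# Why parallel laser sets help: lines per frame scale with the number of beams,
# keeping the fast-axis scan rate feasible. Illustrative numbers only.

def lines_per_frame(scan_hz: float, beams: int, fps: float, bidirectional: bool = True) -> float:
    # A bidirectional scanner draws two lines per mechanical period.
    lines_per_second = scan_hz * (2 if bidirectional else 1) * beams
    return lines_per_second / fps

print(lines_per_frame(24_000, beams=1, fps=60))   # 800.0 lines: too few for high resolution
print(lines_per_frame(24_000, beams=4, fps=60))   # 3200.0 lines with four parallel RGB sets
```

Of course, as noted above, every added beam is another laser set whose brightness and alignment must be matched, which is where my skepticism about uniformity comes in.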


I wanted to mention most of the over 60 companies I either saw present or met with in January 2026. I’m grouping them by category, with a very brief description of what they presented. I plan to follow up with articles about many of them in the coming weeks.
Headsets and Optics – Avegant (small LCOS engines and reference designs), Cybersight (green MicroLED with waveguide), Engo (small form factor OLED glasses), Even Realities (green MicroLED, lightweight AR/AI glasses), Everysight (small form factor OLED glasses), Goeroptics/Goertek (optical engines, diffractive waveguides, silicon carbide waveguides, and headset manufacturing), Innovega (glasses and contacts for the vision impaired), Lynx (camera passthrough XR), Rokid (green MicroLED with waveguide), INMO (Air3 with 1-D reflective waveguides and OLED), TCL (full color X-Cube MicroLED with waveguide), Tozo (OLED with birdbath), Vuzix (headsets of various sizes for consumer, industrial, and military applications, also a supplier of diffractive waveguides), Xreal (OLED with birdbath and Flat Prism and association with Android XR).
Dimming (non-polarized) – Flexenable (Guest-Host LC, >0.1 second switching, biaxially curved), Goeroptics (electrochromic, >2 second switching)
Innovative Technology – A-Star (1 Micron LCOS with metamaterial nano-antennas), Gixel (mini-mirrors for steerable eye box), Kaist (see-through micro-OLED), Smith Display (laser with a non-linear waveguide).
Laser Display Technology – Appotronics (Lissajous scanning LBS engine), Brilliance (laser combining and drive integration), Indi EXALOS Lasers, Meta (On-chip laser beam scanner based on SiN PIC integrated with PZT MEMS cantilever for AR), Nanyang Technological University (Fiber Scanning Display – seems like a repeat of Magic Leap’s FSD – see: Magic Leap Fiber Scanning Display (FSD) – “The Big Con” at the “Core”), Sony (laser array and mini-laser), VitreaLab (laser routing guide or PIC for LCOS illumination), MITRE (a single or array of scanning laser waveguides for high resolution in a small form factor).
LCOS – Creal (FLCOS, laser illumination, and time sequential pixel replication), Kopin (small pixel FLCOS), Raontech (LCOS, MicroLED, and OLED – Working with Int Tech).
OLED – Int Tech (first 100K OLED MicroDisplay – uses RaonTech backplane)
MicroLED – Mojovision (MicroLED with Quantum Dot full color conversion), Plessey (native color MicroLEDs and quantum dot color conversion), Porotech (MicroLED with porous GaN)
Prescription Lenses – Addoptics (resin plastic lenses from rubber molds and supporting electronics integration), Tobii (glass molds for resin plastics and supporting electronics integration)
Reflective Waveguides – Lumus (70-degree FOV, lower cost, and higher efficiency 30-degree FOV), Envision Photonics (1-D reflective waveguide with OLED microdisplays, 2-D reflective waveguide with ~30-degree FOVs, 2-D hybrid reflective and diffractive waveguide, and 60-degree diffractive waveguide), INMO (1-D reflective waveguides), Optinvent (2-D reflective waveguides in molded plastic), Schott (reflective waveguide glass and manufacturing and diffractive waveguide glass wafers).
Diffractive Waveguides – Appotronics (diffractive waveguides, including dual waveguides from a single LCOS projector), Goeroptics (silicon carbide and glass diffractive waveguides), Envision Photonics (2-D hybrid reflective and diffractive waveguide and 60-degree diffractive waveguide), Dispelix (diffractive waveguide designs with various FOVs and characteristics), Mitsui Chemicals (plastic wafers, up to 12″, for diffractive waveguides), Mojie (plastic waveguides), Morphonics (large-scale diffractive waveguide manufacturing), Magic Leap (diffractive waveguide design and manufacturing).
Obviously, something cannot be so expensive that almost no one can afford it, but as stated above, I think Meta’s 100/4/100 concept is flawed and could steer technology in the wrong direction, at least with the current state of development. I would like to see something that could be built for, say, $2,000 or even $3,000 that demonstrates a highly useful product; then let volume and technology do their thing to reduce cost. Historically (for example, with the automobile, the computer, and the cell phone), the democratization of technology comes from creating something new and useful that only a few can afford, then letting volume and technological improvements drive the cost down.
I don’t understand why Google/Android XR spent nearly half the presentation on wired XR and their Compound Light Guide. This is not a big enough market to excite Google. It may be a way to get started in the space, but this is more what I expect a startup to be talking about than what I expect from a company the size of Google.
As someone who has worked for both large and small companies, I find both Google’s and Meta’s approach to AR baffling. Today, we see either feeble AR devices trying to meet a “consumer cost point” or lab projects that, if all R&D expenses were included, would cost many tens if not hundreds of thousands of dollars (e.g., Meta Orion and the Raxium MicroLED glasses at Google I/O).
There is still a lot of hand-waving about AI being the solution that will drive the AR market. The AI applications I see demonstrated seem trivial or narrowly focused. While things like translation and teleprompters are given as examples, they are not necessities for most people. If someone is going to wear AR glasses every day, the glasses need applications that many people will want every day. I would like to see big companies like Google and Meta show more than the same old applications everyone else is showing and demonstrate how AR/AI could be truly compelling and why it might be worth more than $100 per eye.
No appearance of anything LetinAR related?
I have visited and seen LetinAR many times over the years, but I didn’t see them at CES or AR/VR/MR this year. Looking back at the CES website, I see LetinAR had a private suite, but I didn’t know about it at the time of CES. I don’t think they had a booth or presented at AR/VR/MR.