As I often say, few will volunteer information, but many will correct you. This is the case with my supposition about Google’s XR glasses prototype. In “Meta Hypernova and Google AR/AI Glasses – Lumus & Avegant Inside, Both Using LCOS MicroDisplays,” I mistakenly concluded that Google’s new AR glasses were using Avegant’s LCOS engines, based on a comparison between the Avegant and Google AR glasses prototypes and the information that both were using Applied Materials waveguides. It turns out that while Google “evaluated” Avegant’s reference design, they designed in a Raxium monolithic full-color MicroLED device and projection optics.
The use of a monolithic full-color MicroLED indicates to me that this is just another R&D demo like Meta’s Orion. From what I’ve heard, combined with my knowledge of the state of MicroLEDs, the Raxium MicroLED’s yields are very low, possibly less than 1%, making it infeasible to turn the Raxium-based XR glasses into a product. As this article will discuss, it does not appear that Raxium was very far along in its MicroLED development when Google acquired it.
Since I was surprised to find out that Raxium’s LEDs are inside Google XR glasses, I decided to look more closely into Raxium. Raxium was a bit of an enigma to me. It seemed strange that Raxium, later known as a MicroLED company, had Gordon Wetzstein, a professor at Stanford best known for his work in light field and similar near-eye displays (see here, for example), as a co-founder and Chief Scientist for several years. Also, shortly before Google acquired Raxium, Michael Klug, who was VP of Advanced Photonics at Magic Leap and previously founded the light field company Zebra Imaging, joined Raxium. I was left wondering why a MicroLED startup had prominent individuals with experience in light fields and optics. So, I looked at Raxium’s patent filings to get some answers.
Although I was mistaken about Google’s XR glasses using Avegant’s LCOS engine, I received further confirmation that Meta is likely utilizing Lumus’s waveguide and LCOS engine in their Hypernova glasses.
Shahram Izadi, VP of Android XR, in his TED talk, held up a device he described as a “Full color display” without saying what technology it was using. I went back and forth with some people before writing the article as to whether it was a MicroLED or LCOS. The accompanying pre-recorded video (still frame below left) implied that it was showing the device. The entire setup looked nothing like a display device, which is typically 10-20% larger than the image produced. The inset image looked to my eye more like something that had been added in post-processing. The inset image didn’t even appear to have been produced by a MicroLED display. Normally, MicroLEDs produce a much more contrasty and color-saturated image. Additionally, it bore no resemblance to what Izadi was holding up. The video made me suspicious of everything else that was being said (what was real information and what was fudged).

Another thing that clouded my perception is that I was thinking, or at least hopeful, that it would be a product, perhaps not sold by Google, but possibly one of Android XR’s partners. I’m always thinking in terms of what I would do to create a product, not just a lab demo.
This thinking pattern tripped me up before when, on a Reddit forum, I commented that HoloLens 2 would use LCOS and not laser beam scanning (LBS). I simply could not believe that Microsoft would be so dumb as to use LBS in an AR product. But I underestimated the depth of corporate politics and the fact that it was still an R&D project that escaped the lab.
Monolithic full-color MicroLEDs are not ready for a production product. The word is that Raxium MicroLEDs could have yields of less than 1%, and even that is likely with making generous assumptions about what constitutes a “good” device (i.e., how many dead pixels/subpixels are allowed). Raxium was founded in 2017, and its original focus was not on creating microdisplays for AR, but rather on direct-view light field displays, such as Looking Glass (more on this later).
It appears that Google’s XR glasses are just as much a lab demo as Meta’s Orion. The fact that the Google XR demo glasses were monocular and there were only a few such glasses might give some idea of the yields of the MicroLED.
It’s interesting to contrast Google’s XR glasses with Meta’s Orion lab demo and its Hypernova product. Meta’s Orion lab demo used their internally developed Silicon Carbide diffractive waveguides with (perhaps a customized version of) Jade Bird Display’s MicroLEDs. However, when going to market with a product, Meta’s Hypernova utilizes Lumus reflective waveguides and an LCOS microdisplay.
I expect that if something similar to Google’s XR glasses does go to market, perhaps with one of their partners, it will switch to using LCOS, a green-only MicroLED, or perhaps a three-chip (RGB) MicroLED design with either an X-Cube (as in TCL’s full-color MicroLED glasses) or the diffractive waveguide itself acting as the combiner.
It might be worthwhile to level set on the state of native (non-QD) full-color monolithic MicroLEDs. In a native full-color MicroLED display device, each color of LED has a different crystal structure. The various crystals must be grown in separate processes, and each of these steps can have a negative impact on the other colors. Then there are the issues of mutual self-heating and the differing thermal expansion of the different crystals. Then, to get the same resolution as a single-color MicroLED, the device must yield three times the number of LEDs.
Conventional (non-micro) LEDs are manufactured on a wafer and then singulated, tested, and graded. However, with a MicroLED microdisplay, you receive all the LEDs, both the good and the bad ones, as a single set. The MicroLEDs, typically on the order of 2 to 6 microns, are too small to be singulated, tested, and placed to form a display array; therefore, a whole display’s worth of LEDs is typically flip-chipped onto a CMOS control backplane. Each of the LEDs has at least two contacts that must be made perfectly to the CMOS device.
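Because all the LEDs on a microdisplay come as a single set, even a tiny per-subpixel defect rate compounds across the whole panel. Below is a back-of-the-envelope sketch (all numbers are my own illustrative assumptions, not Raxium’s or anyone else’s) that models dead subpixels as random and independent and shows how sensitive the panel yield is to how many dead subpixels are allowed, which is the “what constitutes a good device” issue mentioned earlier.

```python
from math import exp

def panel_yield(width, height, subpixels_per_pixel, defect_rate, max_dead_allowed):
    """Approximate probability that a whole panel passes, modeling the number of
    dead subpixels as Poisson-distributed (reasonable when defect_rate is small)
    with a pass criterion of at most `max_dead_allowed` dead subpixels per panel."""
    expected_dead = width * height * subpixels_per_pixel * defect_rate
    term = exp(-expected_dead)      # P(exactly 0 dead subpixels)
    cumulative = term
    for k in range(1, max_dead_allowed + 1):
        term *= expected_dead / k   # P(exactly k dead) from P(exactly k-1 dead)
        cumulative += term
    return cumulative

# Illustrative only: a 1024x768 RGB panel with a 1-in-100,000 subpixel defect rate
for allowed in (0, 10, 30, 100):
    print(f"allow <= {allowed:3d} dead subpixels: panel yield ~ "
          f"{panel_yield(1024, 768, 3, 1e-5, allowed):.1%}")
```

With these made-up numbers, demanding zero dead subpixels gives essentially zero yield, while loosening the criterion to a few dozen dead subpixels pushes the yield toward 100%, which is why quoted MicroLED “yields” depend so heavily on the pass/fail definition.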
The LED wafers and CMOS wafers are made in different fabs, often by different companies. There can be process compatibility problems between the LED and CMOS substrates. Then there is the assembly process, which is unique to each MicroLED assembler. Typically, companies that manufacture LEDs are not skilled in CMOS design. Rarely does one company handle everything, and one or more parts of the process are outsourced to third parties. It is significantly more complex than what a typical chip startup does, namely designing a CMOS chip using widely available design tools and outsourcing fabrication to a well-established CMOS fab. Many steps in this process are still in the research and development (R&D) stage and involve coordination between different companies.
While native full-color MicroLEDs are not impossible, numerous companies have invested substantial amounts of money in R&D facilities, factories, and people that have yet to produce them in any quantity. It’s hard to believe that Raxium was that far along before Google acquired them. And while Google has lots of money (and so do many other companies that have tried, including Meta), as the saying goes, it still takes nine months to make a baby.
I thought it would be instructive at this point to show the Innovision MicroLED display, the best monolithic full-color native MicroLED display I have seen to date. There may be a better full-color native MicroLED in a lab somewhere (perhaps at Google), but these are out in the open, where I can take pictures.
While at SID Display Week 2025, I was allowed to see and take direct microscopic pictures of their monolithic, 1024×768 full-color display with native (non-QD) stacked MicroLEDs. Note that, even with the magnifying glasses they provided, you couldn’t see the details in the display.
I used a 5x Macro lens connected to a 45 megapixel camera, which could capture individual pixels on the display device. Below (left) is Innovision’s booth with their demo, and to the right is a full image captured by my camera (click on the image to view a larger version).


It was hard to tell much from the images Innovision was showing (demo images often hide any flaws), and they were nice enough to output one of my test images. Unfortunately, I didn’t have 1024×768 test patterns, so they were scaled down when displayed. Below is the direct micro-picture of the Innovision display (left) and the source image scaled to the same size (right).

Below, I have zoomed in a bit more (left) and then included a low-resolution whole image (right), which is about what you would see if you looked through a simple magnifying glass at the whole display. If you look in the white circle with the “23,” you will see many dead or very weak subpixels and many red spots (perhaps stuck on) in the face of the elf.

I do caution that even this device was likely a “cherry-picked” device for showcasing at the expo, and the display has thousands of what a consumer would consider bad pixels. As it is, it might be useful for an “enterprise” application, but a consumer would likely not accept it when compared to other display technologies today.
Something else I learned while working on this story is that Google was the “global technology leader known for its extensive work in software” that sold its waveguide R&D facility to Vuzix (see: Waveguide R&D Facility in Silicon Valley to Strengthen Partnerships with Big Tech OEMs/ODMs). I’m not sure why the news release was vague about who it was, given that it was bound to leak out (how many Waveguide R&D facilities can there be in Milpitas, run by a global software company?).
Interestingly, Google appears to be developing MicroLEDs in-house with Raxium. At the same time, Google is closing down its in-house waveguide R&D. Meta, in contrast, continues to conduct massive in-house R&D on waveguides, particularly its Silicon Carbide waveguides. However, there are also substantive rumors that much of Meta’s optical work has been outsourced to foreign companies.
When I first dug into Raxium, their patent applications with integrated MicroLEDs and optics reminded me of Meta’s (then Facebook’s Oculus) 2016 acquisition of InfiniLED. InfiniLED was developing MicroLEDs with integrated mirrors and optics. Below (left) is a figure from a patent application, and below (right) is an electron microscope image.

Nothing more has been heard from InfiniLED or Meta/Oculus about it since the acquisition. Meta went on to acquire the rights to all of Plessey’s MicroLED production back in 2020, but there are many reports that that relationship has also not worked out. Meta’s latest Orion glasses, I have heard from multiple sources, used MicroLEDs manufactured by Jade Bird Display (JBD).

I suspect that Meta has not given up on developing its MicroLED capability and has instead moved on to exploring other companies. However, for the last two years, Meta has been showing the slide below (see: Meta’s and Google’s Roadmaps for LCOS versus MicroLED or LBS), which suggests that MicroLEDs might be just a step on the way to lasers (could be scanning, laser-illuminated LCOS, or something else).
I have a saying: “In AR, if you can dream of it, Meta has already tried it.” With Meta losing about $20 billion/year, just about anything is possible. The R&D “churn” of concepts and personnel going in and out of favor is also very high.
As stated in the introduction, I was perplexed as to why a “MicroLED company” such as Raxium had Gordon Wetzstein as a co-founder and chief scientist, given that he has no history in MicroLEDs and is instead best known for his work in light field displays.
A quick search of Raxium’s patent applications reveals that many of their 27 applications are related to creating direct-view (monitor/TV size) light field displays. MicroLEDs would have an advantage in this application as the emission area is small, supporting the micro-optics that would be used to form the light field.
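To make concrete why small emitters help a direct-view light field, here is a rough sketch with purely illustrative numbers of my own (not Raxium’s): the number of distinct views a lenslet-based “super-raxel” can present along one axis is roughly its lenslet pitch divided by the emitter pitch underneath it, so smaller emitters mean more views or finer angular steps.

```python
# Rough light-field arithmetic (illustrative assumptions, not Raxium's numbers):
# a lenslet sits over a cluster of tiny emitters; each emitter under the lenslet
# maps to a different viewing direction, so smaller emitters => more views.
lenslet_pitch_um = 500.0    # assumed super-raxel / lenslet pitch
emitter_pitch_um = 10.0     # assumed MicroLED emitter pitch (small emission area)
lenslet_fov_deg = 30.0      # assumed angular range covered by one lenslet

views_per_axis = lenslet_pitch_um / emitter_pitch_um
angular_step_deg = lenslet_fov_deg / views_per_axis

print(f"views per axis per super-raxel: {views_per_axis:.0f}")
print(f"angular spacing between views : {angular_step_deg:.2f} degrees")
```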
Raxium’s earliest patent filings talk of a Raxel, their name for the pixel element of their light field device. The original patent applications suggested that the MicroLEDs could be native/inorganic or color-converted (such as Quantum Dot conversion of blue).
Quoting from Raxium patent applications 20190333443 and 20210383746 (mostly the same spec) with my bold emphasis:
In this disclosure, the term “picture element” and the term “super-raxel” can be used interchangeably to describe similar structural unit in a light field display. In some instances, a “picture element” can be referred to as a pixel, but it is different from a pixel used in traditional displays.
Below is a collection of figures common to several of Raxium’s earlier patent filings. Note that a large direct-view light field display appears to be their original target end product.

The figures below, also common between 20190333443 and 20210383746, indicate that Raxium may have originally considered either native or QD-converted color LEDs. It appears that the company was initially more concerned about light field MicroOptics than the structure of the MicroLEDs. This helps explain Wetzstein’s (with his light field background) involvement with the company.

The evidence suggests that Raxium started as a light field company that used MicroLEDs merely as its display elements and later pivoted to become a MicroLED company. This reminds me a lot of Mojo Vision, which started as a display-in-contact-lens company and, after burning through over $200 million, reorganized and pivoted into a MicroLED company.
Pivoting for startups is not necessarily a bad thing. Still, there was not a lot of time between Raxium being founded and then seemingly pivoting to a MicroLED company and being acquired by Google.
It’s also interesting to note that Google has been working on large direct-view Light Field displays for many years, with their Project Starline (first demoed in 2021), which was recently renamed to Google Beam. It is possible (although I have no information) that Raxium may have first come to Google’s attention due to Raxium’s work in light fields. However, before Google acquired Raxium, Raxium was already considered to be a “MicroLED Display” company.
Early on, Raxium seemed to treat MicroLEDs as a means to an end (light fields) as opposed to having unique MicroLED technology. It was only around 2020, approximately two years before Google acquired Raxium, that Raxium filed a few provisional applications (such as the one leading to application 20220352253 below) giving some detail on the structure of full-color monolithic MicroLEDs, primarily involving one person, Gang He, the CTO of Raxium. My point here is that there is no indication of major MicroLED development at Raxium prior to the Google acquisition.

The provisional application that led to the US202339575 filing came shortly before Google’s acquisition. By this point, Raxium seems to have fully pivoted to being a MicroLED company rather than a light field company.


There is still a Raxium website under Google, which appears to be dedicated solely to job openings in the area of MicroLED manufacturing.
This website also has a curious artist rendering of a MicroLED with various-sized color subpixels. The arrangement of the subpixels is ridiculous and looks to have been split in half and mirrored. It shows simple microlenses (and not the optics for light fields). The page also has a view of different-sized red, green, and blue MicroLEDs, but this could also be an artist rendering. This type of “artist rendering” is why I was skeptical that the device shown in the TED talk’s pre-recorded video was the actual device. I so often see ridiculous/fake marketing pictures and videos.

While reviewing recent Google XR glasses patent applications, I came across US 20250172756. The structure reminds me of Meta’s Orion diffraction gratings discussed in Meta Orion AR Glasses (Pt. 1 Waveguides). As discussed in that article, the waveguide structure shown appears to be very similar to that of WaveOptics, which was acquired by Snap in 2021.
During the review of Google’s XR-related patent filings, which include MicroLEDs and waveguides, I encountered US20250147317, “Pixel Shifting a Time-Division Multiplexed Projection Display.” Interestingly, this patent includes Michael Klug (former Magic Leap) and Zheng Qin (former founder and CEO of AntReality, which Google acquired) as inventors.
Before I take you down this rabbit hole of a patent application, I want to note that I believe time-multiplexed pixel shifting could be very useful with MicroLED microdisplays. It is just that this application seems impractically large and complex, and would likely have poor image quality. It has numerous additional optical surfaces that could cause ghosting and other issues, and the added prisms will result in longer path lengths that will likely cause optical problems. This application may be an indication of what researchers at Google do with their time. It’s also possible that someone decided to file a patent just in case. Nevertheless, this is unlikely to be what Google XR will ever use.
In January 2023, on this blog, approximately 10 months before Google’s first provisional related to this application was filed, I discussed using time-division multiplexing and pixel shifting as a means to alleviate dead pixels, improve uniformity, and enhance resolution with MicroLEDs. I also discussed this concept some more in March of 2023 in Cambridge Mechatronics and poLight Optics Micromovement (CES/PW Pt. 6), including the following list of uses (a toy simulation of the first item follows the list):
- Shifting several LEDs to the same location to average their brightness and correct for any dead or weak pixels should greatly improve yields.
- Shifting spatial color subpixels (red, green, and blue) to the same location for a full-color pixel. This would be a way to reduce the effective size of a pixel and “cheat” the etendue issue caused by a larger spatial color pixel.
- Improving resolution, as the MicroLED emission area is typically much smaller than the pitch between pixels. There might be no overlap between the shifted positions, thus giving the full resolution advantage. This technique could also allow using fewer physical pixels with fewer connections, but there will be a tradeoff in the maximum brightness that can be achieved.
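Below is a toy numpy sketch of the first item on the list (my own illustration, not based on any particular device): the same image is shown in two subframes with a one-pixel shift between them, so an image location served by a dead emitter in one subframe still gets light in the other and shows up at half brightness instead of black.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model of the dead-pixel-averaging idea: show the same image in two
# subframes, shifting the display (or optics) by one pixel between them so each
# image location is served by two different physical emitters.
H, W = 64, 64
image = rng.uniform(0.2, 1.0, size=(H, W))   # target grayscale image (never zero)
alive = rng.random((H, W)) > 0.01            # ~1% of physical emitters are dead

def emit(img, alive_mask, shift_cols):
    """What the physical panel emits for one subframe: the content is pre-shifted
    so that, after the optical shift back, it lands in the right place; dead
    emitters output nothing. (Edges wrap around -- a toy-model artifact.)"""
    return np.roll(img, shift_cols, axis=1) * alive_mask

# Subframe 0: no shift.  Subframe 1: emit shifted content, then shift back optically.
sub0 = emit(image, alive, 0)
sub1 = np.roll(emit(image, alive, 1), -1, axis=1)
combined = 0.5 * (sub0 + sub1)   # the eye integrates the two subframes over time

print("image locations with no light at all:")
print("  one subframe :", int(np.count_nonzero(sub0 == 0)))
print("  two subframes:", int(np.count_nonzero(combined == 0)))
# With two subframes, a location is fully dark only if *both* emitters serving
# it are dead, so isolated dead pixels become half-brightness instead of black.
```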
In my blog article, I assumed some form of electromechanical movement of the display device or optics. poLight moves/squeezes the optics, whereas Cambridge Mechatronics could move the display or optics.
This Google application employs an all-electronic approach utilizing liquid crystals, quarter-wave plates, and a birefringent element (“pixel shifter”) that shifts the pixels based on polarization. As shown in the figure below, a beam splitter splits the image into two paths based on polarization for the “shifting,” and the paths are then combined back together to keep all the light.

The example above assumes a diffractive waveguide. Fig. 15 from the application shows how it would work with a reflective/geometric (e.g., Lumus) waveguide.

The prior examples assume either a single-color or a monolithic full-color MicroLED display. The application gives several examples (two of which are shown below) of how the optics expand to support separate R, G, and B MicroLEDs combined with an X-Cube. I’m not sure about you, but this seems like a lot of optics to cram into a headset 😁.


The application also demonstrates how to apply this concept with AntReality-type optics. As discussed in XReal One Pro Optics and Its Connections to Ant-Reality and Google, Google acquired AntReality (also known as AntVR) in 2023.
Note that all the optics above are just to support a one-dimensional shift. The patent outlines the concept for a two-dimensional shift but does not show what the optics would look like. However, I envision it could more than double all the shift paths.
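Stepping back from the specific optics, the core idea in the application is a time-multiplexed subframe schedule. Below is a minimal conceptual model (my own simplification, not Google’s actual design) of how a liquid-crystal switch could select, subframe by subframe, whether the birefringent element displaces the pixels; the half-pixel shift amount is an assumption for illustration.

```python
from dataclasses import dataclass

SHIFT_PX = 0.5  # assumed displacement produced by the birefringent element

@dataclass
class Subframe:
    lc_on: bool          # LC cell driven -> polarization rotated this subframe
    offset_px: float     # where this subframe's pixels land relative to nominal

def schedule(n_subframes: int) -> list[Subframe]:
    """Alternate the LC state every subframe; odd subframes take the shifted path."""
    return [Subframe(lc_on=bool(i % 2),
                     offset_px=SHIFT_PX if i % 2 else 0.0)
            for i in range(n_subframes)]

for i, sf in enumerate(schedule(4)):
    print(f"subframe {i}: LC {'on ' if sf.lc_on else 'off'}, lands {sf.offset_px:+.1f} px")
```

The display controller would pre-shift the content for the odd subframes so that, after the optical displacement, the two interleaved pixel grids form a higher effective resolution (or average out bad pixels), at the cost of splitting the frame time across the subframes.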
In a subject related to time-division pixel multiplexing, around 2019, Kura began discussing its Gallium design. I was going to write about it at the time, but I didn’t see any way they could pull it off, and I got busy with other matters.

Kura was discussing scanning an LED array that was the full horizontal resolution but only “N” rows of LEDs. Kura’s plan (they never made it to a product as far as I am aware), as evidenced by their US 2021/0225268 patent application, was to use separate arrays for Red, Green, and Blue LEDs. The application provides an example of an 8,000 x 50-pixel array.
While not shown in the patent application diagrams from Kura below, Kura presented figures in 2019 (with my annotations in red) that suggested an alternative arrangement, where instead of linear sets of each color, they would use sets of rectangular patches (right). In both cases, the scanning would still be one-dimensional.

The advantage of this approach is that only a relatively small number of physical pixels are required to generate a much higher-resolution image, and the brightness variation of individual LEDs can be averaged out. However, the light output is then reduced by roughly the ratio of physical rows to image rows, and thus would likely not be bright enough for, say, waveguides. Additionally, assuming the scanning mirror moves linearly, there will tend to be blending/blurring between adjacent rows.
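To put rough numbers on the brightness penalty (the 8,000 x 50 array is from Kura’s application; the 2,000-row output image and the panel luminance are my own assumptions for illustration):

```python
# Rough arithmetic on the brightness penalty of scanning a partial-height LED
# array. Each physical row can only illuminate a given image row for a fraction
# of the frame time, so time-averaged brightness drops by roughly the duty cycle.
physical_rows = 50
image_rows = 2_000                 # assumed rows in the scanned output image
duty_cycle = physical_rows / image_rows
print(f"duty cycle per image row: {duty_cycle:.1%}")   # -> 2.5%

instantaneous_nits = 1_000_000     # assumed LED array luminance while lit
time_averaged_nits = instantaneous_nits * duty_cycle
print(f"time-averaged luminance : {time_averaged_nits:,.0f} nits")
# Waveguide losses would reduce this further, which is why the scanned approach
# would likely struggle to be bright enough to feed a waveguide.
```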
Based on the current state of monolithic full-color MicroLEDs, the Raxium-based Google XR glasses appear to be an R&D demo with little chance of production in any rational timeframe, much like Meta’s Orion. In short, it is merely a lab project attempting to resemble a consumer device. However, Applied Materials, which announced its collaboration with Google on AR in 2024, also announced at about the same time that it is working with Avegant. Applied Materials is not in the waveguide business for Google’s R&D volumes.
I have seen Avegant’s reference design, and the image quality is reasonably good compared to other diffractive waveguide designs I have reviewed. The size and weight also appear to be acceptable. Unfortunately, I have not had a chance to study the glasses carefully. The biggest obvious drawback is that while the Applied Materials image quality is good for a diffractive waveguide (there are still some color issues across the image), the front projection (“eye glow”) is much worse than most. I have a hard time believing consumers will find this acceptable, but diffractive waveguides can be designed so that eye glow is not such a problem. Avegant is reporting that they have multiple customers for both their 20-degree monocular and 30-degree binocular reference designs, which utilize Applied Materials waveguides.
I think we are likely to see Android XR glasses on the market that resemble the ones Google showcased at Google I/O within the next year, but they will likely use LCOS microdisplays if they are going to be full-color.
Karl, you have outdone yourself with this one! Super informative!!
“Note that all the optics above are just to support a one-dimensional shift.” Not to mention that an optically clear birefringent medium is either on the order of 1.5-3mm thick itself or is an immensely problematic polymer (haze or LC alignment issues). The shift is also gradual vs. AOI, so it’s no wonder contemporary VR lacks this.
You make a good point about the possible issues with the birefringent material. Of course, the whole thing is such an optical mess that it makes you wonder why they filed on it. The most interesting thing to me was that Michael Klug (former Magic Leap) and Zheng Qin (AntReality/AntVR) were on the application as inventors.
That was very informative…especially cracking the Vuzix Milpitas case
Thanks
Did Vuzix buy the Milpitas facility directly from Google or via AMAT?
The announcement from Vuzix stated “The facility, previously operated by a global technology leader known for its extensive work in software, AI, and augmented reality” which sounds like they bought it from Google.
I haven’t heard anything about AMAT being involved at all. Do you have information that AMAT was involved with that fab?
Not at all. Just the rumor that AMAT was supplying WG to Google.
Low waveguide efficiency dictates high power output from every pixel in the panel. Many pixels cannot meet this pass/fail criterion, resulting in low panel yield. However, if the waveguide has high efficiency, then the pass/fail criterion is set lower, more pixels ‘pass,’ and thereby the panel yield is higher.
Very curious as to why Google would sell the facility to Vuzix. Did they give up on making waveguides in-house? Also, why would Vuzix not publicly disclose who they bought it from? What’s the motive behind the secrecy?
It looks like Google decided that waveguides were not “strategic” in that they thought they could get them made by others. But at the same time, I don’t see why they thought MicroLEDs were strategic and needed them in-house.
I don’t know the relative magnitude of the costs of the two efforts. As I wrote in the article, you can bite off different amounts of effort with MicroLEDs depending on how much of the process you farm out to other companies. You could have just designers and farm out the LED making and LED-to-CMOS assembly to other companies, or go in between with some kind of joint investment.
In terms of the secrecy, I would assume that was something Google didn’t want to make public, more of a face-saving move for them. Google is already developing a bad reputation for getting excited about something and then dropping it a few years later. I can’t believe it was a competitor issue, such as keeping the information from Meta, as they could not keep a secret with so many people in The Valley, including those from the facility who would leave.
That makes sense. It also makes me wonder what benefit Vuzix saw from purchasing the facility, especially if Google is planning to use AMAT for their waveguides. If Google doesn’t believe it was strategic for them to keep the facility, then why is it a strategic plus for Vuzix to acquire it?
I think Vuzix saw it as an opportunity to acquire an R&D facility they could not afford to build for themselves. Google probably overspent building the lab with their free-flowing cash, whereas Vuzix is in the business of making waveguides for their glasses, so for them, the R&D facility contributes to the core products they are making.
I can make general sense of why it is seen as a win-win for both companies. What I don’t understand is why Google would still think that Raxium/MicroLEDs should be done in-house by them. I would not be surprised to see them sell it off at some point (I don’t have any information or sources on this; I’m just trying to rationalize why they would sell one but not the other).