The recently announced Even Realities G1 (ER G1) glasses (right) enter a somewhat crowded field of glasses-form-factor AR glasses using Jade Bird Display's (JBD) 640×480 green-only MicroLEDs.

The G1 stands out as the first I have seen to integrate prescription lenses while coming closest to a form factor that looks like ordinary glasses, with high transparency and "eye glow" control (discussed later). Overall, Even Realities has focused on a more minimalist approach. The glasses cost $599US plus prescriptions (if necessary), which start at $150US for simple distance-only correction. Even Realities' prescription lenses can correct for astigmatism, and progressive lenses will be available in some regions (at additional cost).
Many companies have shown products using JBD’s MicroLEDs in a glasses-like form factor. They include Vuzix Z100 (monocular), Meta Bounds Mojie, LAWK Metalens, MyVu, INMO Go, and Oppo Air 2. Below are pictures of AR glasses shown in JBD’s SID Display Week 2024 booth.

While I have seen through most of the above with my own eyes, I have only seen the ER G1 glasses in the videos, including a video conference with Even Realities CEO and founder Will Wang where he answered questions for about an hour. The G1 makes an interesting case study regarding the features they supported and, perhaps more importantly, the apparent design trade-offs.
The TCL RayNeo X2 (3-chip X-cube full color) and Vuzix Shield (biocular, green only), while they use JBD MicroLEDs, are in another category, with 8-core CPUs running Android (the others have minimal processing, just enough for Bluetooth communication), WiFi, cameras, and other features not found in the other JBD AR glasses. However, this added capability comes at the cost of thicker and heavier frames to support the added features, including the batteries that drive them.

TCL RayNeo X2 and RayNeo X2 Lite
For some of my discussion of MicroLED AR Glasses, see DigiLens, Lumus, Vuzix, Oppo, & Avegant Optical AR (CES & AR/VR/MR 2023 Pt. 8), TCL RayNeo X2 and Ray Neo X2 Lite, CES (Pt. 2), Sony XR, DigiLens, Vuzix, Solos, Xander, EverySight, Mojie, TCL color µLED, Mixed Reality at CES and the AR/VR/MR 2024 Video (Part 1 – Headset Companies), and AWE 2024 Panel: The Current State and Future Direction of AR Glasses.
TT Technology did a YouTube review (and disclosed up front that Even Realities had sponsored it) with multiple videos taken via a smartphone camera looking through the optics. One of these videos gives a rough idea of the image size and location within the FOV (below). The relative size of the virtual image to the glasses frame opening might be a bit off due to the camera not being located in the same place as the eye, but it gives a good idea of how much content can be displayed and the size within the FOV. One thing to notice is that the virtual image is wide and not very tall. As discussed later, the ER G1 optics/waveguide only displays about 1/3rd of the height of the JBD 640×480 microdisplay.

Many, if not most, designers underestimate the importance of not disturbing the view when the display is off for something intended to be "all-day wearable" (the issues for specific task-oriented headsets are different). Designers focus on providing the benefits of the virtual display without considering the impact. People are not going to wear something all day that hampers their vision. This hampering could be in the form of darkening, reduced sharpness (as can happen when looking through gratings or other combiner optics), or light-capture artifacts (e.g., "rainbows"). For the G1, avoiding these issues appears to come at the expense of the virtual image size, although it also has some size and weight benefits.

The Vuzix Z100 provides an interesting contrast in design decisions to the ER G1. The AWE 2024 panel, The Current State and Future Direction of AR Glasses, discussed glasses-form-factor (optical) AR glasses that are wirelessly connected to a smartphone. In the Q&A session (at 49:46 in the video), Thad Starner, a wearer of AR glasses since 1993, one of the architects of Google Glass, and a Georgia Tech professor who has long been a proponent of minimalist AR glasses (see also my 2017 article FOV Obsession), brought up the minimalist issue.
At AWE 2024, Thad was wearing the Vuzix Z100 (right), which seems to meet his requirements and is similar to the G1 in size and functionality, with a few notable differences. Both are glasses form factors, with the Vuzix having a more "traditional" style frame and the G1 having an arguably more modern look. Both weigh about the same.
At CES 2024, Vuzix demonstrated a new Ultralite S (Sport) design (below right), which was a bit sleeker than the Z100. Vuzix also said they were better at controlling the “eye glow” with this model. I confirmed that the Ultralite S uses a waveguide designed for pantoscopic tilt (to be discussed later), perhaps among other design improvements.
The Vuzix Z100 (and Ultralite S) and the ER G1 use Jade Bird Display's green-only 640×480 MicroLED displays, diffractive waveguides, and a Bluetooth wireless connection to a smartphone. The ER G1, as discussed, cuts off more than 2/3rds of the vertical pixels. The Z100 and Ultralite S are both monocular.


With his 30+ years of experience, Thad Starner has long stated his preference for a narrower-FOV (25 to 30 degrees) monocular display that appears offset to the outside/temple of the user's straight-ahead view so that notices or the like don't block normal vision. This is the case for the Vuzix Z100 he wore at AWE 2024. Starner was a technical leader of Google Glass, which positioned the display above and on the temple side of the center of view.
Starner argues that the most important thing for an AR display to do is not to block forward vision. In the worst case, it could be dangerous for a notification to block a person’s view at a critical time.
The ER G1 positions the biocular display image above the user’s straight-ahead view so that the virtual images don’t block the critical straight-ahead view. However, in doing so, the waveguide (and optics) cut off most of the vertical pixels from the display device. The G1 does not appear to use all the display’s 640 pixels horizontally for information, which would allow the display to be moved horizontally to adjust for things like IPD.
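As a rough, hypothetical sketch of the horizontal headroom mentioned above: using the ~550-pixel displayed width estimated later in this article (from a through-the-optics photo, not an Even Realities specification), the unused pixel columns suggest how far the image could be shifted to adjust for IPD.

```python
# Hypothetical sketch: unused horizontal pixels as IPD-adjustment headroom.
# 550 is this article's estimate of displayed width, not an official spec.

DISPLAY_W = 640   # JBD MicroLED horizontal resolution
IMAGE_W = 550     # estimated horizontal pixels the G1 actually displays

margin = DISPLAY_W - IMAGE_W   # 90 unused pixel columns
max_shift = margin // 2        # ~45 pixels of shift either way if centered
print(margin, max_shift)       # 90 45
```

This is only an illustration of the idea; how (or whether) the G1 firmware actually repositions the image is not publicly documented.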
Having worked with monocular and binocular headsets, I have found that most people will find binocular displays take less time to get used to and thus are more “consumer friendly.” But monocular still works if you can get it on the user’s “dominant eye” (one eye tends to dominate).
It would be ergonomically better if the virtual images were located below the center of vision rather than above it, as human eye muscles are better at looking down than up. Then, to avoid interference with forward vision, Even Realities has cut off the vertical image height by more than 3X.
However, moving the display/exit grating to the bottom means larger areas of diffraction gratings, which would cause light-capturing ("rainbow") issues. Another issue is that if progressive prescription correction is used, it puts the "far" vision at the top and the "near" vision at the bottom, but the output of waveguides will still appear to be focused far away and thus would not be in focus if placed in the lower area of progressive lenses. Additionally, locating the virtual image below the center of forward vision tends to block the view when someone is looking downward to read.
I’m not criticizing Even Realities’ decisions but pointing out the many design trade-offs in something as simple as the location of the image. Even Realities made consistent decisions, including accepting the loss of more than one-third of the display’s pixels, which reduced power consumption and heat dissipation, contributing to reducing size, weight, and cost.
The G1s stand out today because they support integrated (optically glued to the waveguide) prescription lenses; all others require snap-in or clip-on prescription inserts to correct vision. This approach is thinner, cleaner-looking, and lighter, and it eliminates reflections from air gaps between the vision correction lens and the waveguide. But then, if your prescription changes, you must buy a new set of glasses. Simple nearsighted-only prescriptions cost an additional $150US, with astigmatism and progressive corrections costing an unspecified amount more.
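As a quick sanity check on the pricing, using only the figures given in this article, the base price plus the simplest prescription option accounts for the ~$750 with-prescription figure cited later on.

```python
# Pricing from the article: $599US base plus $150US for the simplest
# prescription option (astigmatism/progressive options cost more).
BASE_USD = 599
SIMPLE_RX_USD = 150

total = BASE_USD + SIMPLE_RX_USD
print(total)  # 749 -- i.e., the "~$750" figure used later in the article
```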

The inserts for the Vuzix Z100 snap into the frames, which gives a more integrated look than those with clip-on corrections. If the prescription changes, only the inserts need to be replaced, or if the smart glasses break, the lenses can be swapped to a working unit. The insert approach would work better in some "enterprise" applications where smart glasses can be shared. However, the snap-in approach will not be as clean-looking as the integrated approach of the G1 and will be more prone to reflections between the waveguide and inserts. The company Lensology makes similar snap-in inserts for TCL's RayNeo X2.

The G1 also uses pantoscopic tilt (right) to direct the diffractive order that causes the "eye glow" downward. In my video conference with Even Realities CEO Will Wang, I could only tell his glasses were turned on when he tilted his head.
The G1 AR glasses look very clean, and the diffraction gratings seem nearly invisible when looking straight on. The TT Technology YouTube video discussed earlier also had many video segments of the G1 looking at the wearer while the G1 was active. The stills captured from the video below show how the G1 looks from various angles. Looking straight on (left), there is no eye glow, and the waveguide exit grating seems invisible. Looking from the side, and dependent on the lighting, you can see the grating (center). But the eye glow can be seen when the head tilts back far enough (right).

This "trick" is not new. DigiLens, Vuzix in the Z100, and Dispelix, among others, also use pantoscopic tilt to eliminate eye glow, but many others, including the TCL RayNeo X2, don't.
Below are some figures from DigiLens and Dispelix on the issue and how designing the optics for a tilted waveguide causes the "leakage light/eye glow" to be redirected downward. With diffractive waveguides, for each degree the lenses tilt, the eye glow light is directed downward by two degrees. With this 2X multiplier, it only takes a few degrees of tilt to direct the light to where others won't normally be able to see it. An observer typically will only see the eye glow if the wearer tilts their head significantly. As Dispelix pointed out (below right), other diffractive design factors can also reduce the leakage.
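The 2X relationship described above can be sketched numerically. This is a simplified model of the geometry; the real leakage direction also depends on the specific grating design.

```python
# Simplified model of the pantoscopic-tilt "trick" described above:
# with diffractive waveguides, each degree of lens tilt redirects the
# leaked forward diffraction order ("eye glow") down by about 2 degrees.

def eye_glow_angle(pantoscopic_tilt_deg: float) -> float:
    """Approximate downward angle of the eye glow below horizontal."""
    return 2.0 * pantoscopic_tilt_deg

# A typical eyeglass pantoscopic tilt of ~8 degrees steers the glow
# ~16 degrees downward, below where an observer would normally see it.
print(eye_glow_angle(8.0))  # 16.0
```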


Even Realities claims that they have designed their waveguides to reduce the diffractive waveguide light capture (commonly referred to as “rainbows” – see right), where the diffraction gratings direct light sources (like overhead lights) into the eye. The same diffractive “physics” that makes waveguides work makes them prone to capturing light at certain angles from the real world and directing it into the eye. Early diffractive waveguides like the Magic Leap One and the first Hololens had major rainbow problems.

Everyone is working on diffraction grating designs and optical coatings to reduce the rainbow issue. Still, without having everyone’s glasses side by side for testing, it is hard to know which is doing the best. The G1 does not appear to have an exposed “expansion grating,” unlike most other waveguides, including the Oppo Air 2 (right), which can contribute to visible rainbow effects.

The Even Realities G1 has very high transparency (better than 90%) and appears more transparent than other monochrome (green) waveguides I have seen. The G1's waveguides seem nearly invisible in a front-on view (top-right), but as noted earlier, they can be seen by an observer when the glasses are at an angle.
Shown (mid-right) is me at CES 2024 wearing the Vuzix Ultralite S. I've added a dotted red line around the waveguide, indicating the boundary of the waveguide's exit grating (where the image light exits). On the Vuzix Ultralite S, a slight darkening in the area of the exit grating is visible, unlike the G1 in this straight-on view. Also, in the Ultralite S, you can see that at least some of the exit grating is in the forward vision, which is good for seeing the virtual display but might cause some diffraction issues when the display is off.
The G1's exit grating can be seen in the right lighting, such as this view of the G1 in its charger case (bottom-right). Notice that the G1's exit grating is more than 2x wider horizontally than vertically. The JBD MicroLEDs that Even Realities, Vuzix, and many other companies use have 640×480 pixels and thus a 4:3 aspect ratio. The 4:3 aspect ratio can be seen in the Vuzix Ultralite S (above-mid-right) and the Oppo glasses (below, from AR/VR/MR 2023), which use the same JBD green MicroLED.

Also, notice the 4:3 aspect ratio and amount of content in the through-the-optics picture on the right. It was taken handheld with an Olympus D5 Mk. III camera at AR/VR/MR 2023. The ER G1 uses less than 1/3 of the JBD display's 640×480 pixels.

In looking at both Even Realities' "simulated" and actual through-the-optics images, I was struck by the lack of content and the wide, short aspect ratio. The ER G1 displays about a 3.6:1 aspect ratio, versus the display's 4:3 (1.33:1). So I decided to investigate. The best through-the-optics image I could find online was in a Jon Rettinger YouTube review (right).
Jon Rettinger's picture was taken handheld with a smartphone and was not aligned and focused well. Still, I could make out some pixels and other details. I used a very high-resolution picture through a different waveguide of the JBD 640×480 green MicroLED to "calibrate" the pixel sizes in Rettinger's ER G1 through-the-optics picture.
In the comparison below, I have removed (most of) the geometric distortion in the picture, enhanced the image slightly to see the pixels better, and scaled the image to match the scale of the higher-resolution image of the whole JBD display. The net result is that it looks like the ER G1 is only displaying about 550 by 150 pixels, or ~86% of the horizontal pixels and ~31% of the possible vertical pixels.
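The percentages above follow directly from the 550×150 estimate, which is this article's measurement from a through-the-optics photo, not an Even Realities specification.

```python
# Back-of-the-envelope check of the pixel-utilization estimate above.
# The 550x150 active-area figures are this article's estimate from a
# through-the-optics photo, not an Even Realities specification.

DISPLAY_W, DISPLAY_H = 640, 480   # JBD MicroLED resolution
USED_W, USED_H = 550, 150         # estimated pixels the G1 displays

h_util = USED_W / DISPLAY_W       # ~0.86 of the horizontal pixels
v_util = USED_H / DISPLAY_H       # ~0.31 of the vertical pixels
aspect = USED_W / USED_H          # ~3.7:1 (the ~3.6:1 cited above, rounded)

print(f"horizontal: {h_util:.0%}, vertical: {v_util:.0%}, aspect ~{aspect:.1f}:1")
```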

Even Realities' demos use larger fonts than many others (e.g., see the Oppo image shown earlier). They seem to be trying to keep the information presented to the user in small, easily digested amounts.
Even Realities’ website and videos show application examples that appear to be “simulated” (few mark their images as simulated anymore). It looks like the virtual display image (which is a little “too good”) is being composited on a computer on top of a photograph or video, not through optics. Still, the level of content and resolution is consistent with the capabilities demonstrated in the through-the-optics videos and pictures I have found.

Some of the applications are shown on the right. The teleprompter mode uses "AI" to scroll the text and fade words that have been spoken.
They have classic "notifications" displays. Like every other pair of "AI" glasses, both audio-only and with displays, the ER G1 plans to support language translation. Language translation should work much better with a display than with audio translation alone (which was discussed in our panel at AWE 2024).
The other well-known potential application for AR glasses has been navigation. In this area, the limitations due to the lack of resolution and color seem obvious compared to smartphones or smartwatches. It might work well with a cell phone map to limit your need to refer to the smartphone. Caution: Driving with AR glasses, at least turned on, today would probably be dangerous (and perhaps illegal in many states and countries). So this application is for people who are on foot or as passengers.
Below are some frame captures from TT Technology's YouTube video taken through the optics. As stated above, the content level is consistent with the simulated images by Even Realities.

A key issue with all these applications is whether they will provide enough functionality for regular/everyday use in a world with smartwatches and smartphones. For example, I typically fly out of the country once or twice a year, and the translation might be useful. With a smartphone, I can already do translation, including taking pictures of signage and translating them (something the G1 can't do since it does not have cameras). It's not worth it to me to pay ~$750 for another product to do AI translation; I can look down at my smartphone for the few occasions I need it. It might be a different case for someone immersed for a long time in a foreign country or an executive who wants to know what someone is saying in a foreign language in a business meeting.
The ER G1 is a minimalist approach to AI/AR glasses. Most obviously missing are cameras found in audio-only AI Glasses, such as the Meta Ray Ban Wayfarer. The only input to the “AI” is via audio. Lacking video eliminates a whole set of proposed “AI” based applications. However, adding cameras to the G1 would mean more space, weight, and power due to the cameras and the need for higher data rates back to the smartphone. Adding cameras would likely push the weight above 40 grams (Even Realities has not given a precise number but has said the G1 is less than 40 grams).
The “Glassholes” argument about not having a camera on AR glasses is overblown. Cameras are everywhere these days and I think it was just a lazy excuse. There were many other problems with Google Glass, including the looks, display performance, and functionality, not to mention the vastly overblown marketing that grossly overstated their capability.
With the "normal glasses" form factor and weight, supporting gesture recognition and SLAM is out of the question. The G1 relies on a capacitive touch sensor and voice recognition.
With a very low-resolution green-only image and a low data rate, the G1 is unsuitable for watching movies or looking at pictures. I’m not sure the G1 would even support pictures and movies. It’s a very bare-bones data-snacking AR device.
I must credit Even Realities for adhering to their minimalist principles and developing consistent features. It is an example of the trade-offs that must be made to keep AR glasses under 40 grams.
I also appreciate why (I think) they chopped the waveguide and, therefore, the display height to prevent the diffraction grating from interfering with the direct forward view. Each alternative display location to the G1 has a different set of drawbacks. I find the thought experiment of where to put the display in the FOV and the resultant issues illustrative of how tight a balance must be with AR glasses.
Integrating (gluing) prescription lenses into glasses has advantages for an all-day wearable product. But it is not without its problems, including the issue of what happens when a user’s prescription changes and the ability for multiple people to try the same AR glasses.
When you buy the product plus prescriptions, the price is about $750 (or maybe more for more complex prescriptions). What you are getting is a low-resolution green-only display that, other than being always available in front of you, can’t compete with a smartwatch in terms of display capability.
For a consumer product, full color is more important than it would be for an “enterprise” product. As a leader of IBM’s computer monitor division said in the early 1980s during a debate over higher-resolution monochrome (up to ~2K by ~1K) versus lower-resolution color monitors (~640×480 at that time), “Color is the least necessary and most desired feature in a monitor.” Monochrome displays are often still used in military and industrial applications. And we saw this again as smartwatches moved from monochrome to color. Consumers still much prefer color even if it is not “necessary.” Even with low-resolution, more information displays, color has functional advantages including simple things such as red=bad/warning and green=good.
Even Realities says that their presales are greatly exceeding expectations. We will have to wait and see if this keeps up as the product goes into production and whether users find it useful. Everyone in this field expects Meta and many others to introduce "AI" AR glasses (with displays) that support audio and camera input and audio and display output. So we are about to see the "experiment run" with various design features and trade-offs. As stated previously, I understand that cameras would drive up the weight and power, but of all the trade-offs Even Realities made with the G1, leaving out the camera(s) is the issue I would most second-guess.
Thank you for this wonderful review. I have shared it on Reddit so more people can see it as well! Many Reddit Users on EvenRealities’ Thread including myself are early adopters ready to test out these glasses as they ship.
Thanks,
The article was not meant to be a review but rather a comparative analysis of design decisions and trade-offs with AR glasses. The Even Realities G1 was more of a vehicle to explore the trade-offs they made. Think of AR glasses as a tiny "ecosystem" with a very delicate balance. If you try to do everything, the glasses become big, bulky, heavy, and expensive. So the design challenge becomes more about what you cut out.
Do you know if the teleprompter AI runs with different languages (like French)? Thank you
No idea.
Karl, do you know how the battery is?
I’m not sure what you are asking about the battery.
Even Realities claims the battery is good for more than a day of "typical" use. It needs to be, because you have to take the glasses off to put them in the charging case.
In terms of the type of battery, it looks like there are two lithium-polymer batteries that fill up most of the bulging ends of the frames. You can see one of the batteries in an Even Realities video (link to where the battery can be seen: https://youtu.be/tBH7mczkIJY?t=271).
Nice to see you covering this Karl. Look forward to a follow up when or if you get your pair 😉
Is this something that deaf people can wear and have subtitles?
One company I know that is aimed at this application is XanderGlasses (https://www.xanderglasses.com/xanderglasses) which is based on Vuzix’s green MicroLEDs (https://www.vuzix.com/blogs/press-releases/vuzix-nasdaq-vuzi-continues-to-receive-smart-glasses-reorders-from-xander-to-meet-rising-demand-for-their-award-winning-captioning-glasses-for-the-hard-of-hearing).
I saw them at CES 2024, but didn’t get to spend any time with them.
I'd like to test these. How can I get a pair?
They can be bought on the Even Realities Website (https://www.evenrealities.com/) for $599US, but what is available, particularly in terms of prescription lenses, may depend on your country.
I'd like to know how it would fit for expats. People living in a non-native country usually struggle with the local language. I've been living in Germany for 4 years now and by now have a good knowledge of German for understanding those around me, but speaking up is a lot harder. So I'd love to see more development not only in real-time translation of what others say, but also in what you can say in a grammatically correct way… that would be a game changer…
Hi Karl,
You mentioned that you can't see the "expansion grating". Does this mean that Even Realities' waveguides are designed with 2D pupil expansion in a single grating? That is not very common, right? WaveOptics/Snap are using it… anyone else that you know of?
You are correct about the 2-D (WaveOptics-like) waveguide. I interviewed them at CES a few months after the article. They said it was to reduce the light capture. As I remember it, they said that because the light enters from the temple at the top, this type of waveguide limited the vertical expansion and thus the vertical FOV, which is why they don't use the JBD MicroLED's full vertical resolution. You may note that Snap Spectacles have very wide temples so the light can enter much lower.
You are also correct that it is not very common. Soon after writing about Even Realities, I wrote about Orion using a similar 2-D waveguide structure and Magic Leap was looking at it. I wrote about this in https://kguttag.com/2024/10/17/meta-orion-ar-pt-2-orion-vs-wave-optics-snap-and-magic-leap-waveguides/. Orion’s light path expands diagonally and has more oval shaped expansion/exit gratings. Whereas Snap/WaveOptics waveguides are single sided, Orion likely has gratings on both sides of the waveguide. As my article shows, Snap has also considered having expansion/exit gratings on both sides of the waveguide.
I would expect that with Even Realities, Snap, and Meta Orion using 2-D expanding waveguides, we will likely see more soon. There may be cons to this type of waveguide, including front light projection (eye glow) and image color uniformity.
Karl