It seems like every time I get ready to finish Part 1 of my MicroLED series, another AR news story breaks. And not just any AR story, but one saying that Apple may be announcing their AR glasses soon, along with a very few details.
I'm not an avid Apple follower, nor had I heard of the leaker, Jon Prosser, before today. But as just about every online tech and business magazine picked up the story and said Prosser's past leaks usually came true, the leak has an air of credibility. For the sake of this article, I am going to assume Prosser is correct in his statements and then try to figure out what Apple could be doing based on the information.
This blog is not about breaking news, but more about analysis of existing information. Google Glass was old news when this blog figured out that it was using a Himax LCOS Panel. Magic Leap had already raised over $1B and had hundreds of patents when this blog correctly figured out what they were doing about a year and a half before they shipped a product.
I don't have enough to go on today to know precisely what Apple is doing. As I wrote back in December, Apple has not left much of a patent paper trail. BTW, I find that Patently Apple, at least in the case of AR, tends to over-sensationalize what appear to me to be either trivial patents or things that I doubt Apple will bring to market. Either Apple is not filing significant patents, is filing them under the name of a shell company, or is buying the technology from someone else.
This is another "I want to get this out in a hurry" article. Expect more than the usual number of typos. There will probably be some corrections in the coming days.
Today's big news is that Front Page Tech's (FPT) YouTuber Jon Prosser released a video giving information on the Apple Glass name, planned announcement date, price, and some minimal technical information on the product itself. I'm going to try to tease out some additional technical details from the scant information given.
In the figure on the left, I have combined the video’s essential information from three slides Prosser presented. I should add that Prosser stated:
I’m trying to get the green light legally to share a video of a prototype of these glasses that I’ve seen.
So maybe we will see a video soon, but it was not clear to me if it will be a working prototype or just a mockup. While Prosser may be a tech enthusiast and good at finding leaks, he may not know what to look for in an AR headset.
An AR expert, with just one look at a functioning headset, could tell a lot about the optics it is using, and within a minute or two of using it could figure out what display technology it is using. After listening to Prosser's video several times, I'm not sure he has looked through a working unit. Maybe he has seen one, but I would expect to learn a lot more from someone who has.
For the rest of this article, I am going to go through the bullet points from the video, where I feel I have something to add. I’m going to try and do some “thought experiment reverse-engineering” to tease out information from between the lines of what Prosser wrote.
No surprise and nothing original, but it looks like we have a name. iGlass and iGlasses, like iWatch, were already taken.
Interestingly, this is the exact same price as Nreal's AR glasses, which plug into a smartphone via a USB-C cable for their processing. While Prosser never says so explicitly, the Apple Glasses almost certainly connect wirelessly (more on wireless data later), which is probably why Prosser didn't bother to mention it.
The $499 price gives a lot of clues. Figure that if the Nreal headset were in the Apple store, it would likely sell for more than $1,000 with typical Apple margins. To meet typical Apple margins, Apple Glasses would need to have a bill of materials (BOM) about half that of Nreal’s consumer glasses. And Apple has wireless battery charging, a battery, and wireless communication that Nreal does not.
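To make the margin argument concrete, here is a back-of-envelope calculation of the implied bill of materials. The margin and overhead percentages below are my illustrative assumptions, not leaked or confirmed figures:

```python
# Back-of-envelope BOM estimate for a $499 retail price.
# All inputs are assumptions for illustration, not known Apple Glass figures.

retail_price = 499.00        # rumored Apple Glass price
hw_gross_margin = 0.38       # assumed typical Apple hardware gross margin
channel_and_overhead = 0.20  # assumed share for distribution, warranty, support, etc.

# What is left over is the ceiling for the bill of materials:
implied_bom = retail_price * (1 - hw_gross_margin - channel_and_overhead)
print(f"Implied BOM budget: ~${implied_bom:.0f}")
```

With these (assumed) numbers, only about $210 is left to cover optics, displays, battery, wireless, sensors, and the frame, which is why something has to give on the display side.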
Apple is going to want much sleeker, and inherently more expensive, optics than Nreal. The Nreal birdbath optics (essentially the same as the ODG optics on the right – see here for more about how they work), while inexpensive and with excellent image quality, are too bulky and probably block too much light for Apple's liking. Thinner optics are likely not going to have nearly as good an image as a simple birdbath.
So Apple Glass has to support higher margins and inherently more expensive optics, yet sell at the same price. Something has to give, and it is likely resolution, field of view (FOV), and image quality compared to Nreal. It suggests that Apple Glass is going to be somewhere between Google Glass and Nreal in terms of the image the user sees. I'm expecting the display resolution to be between 400×300 and 800×600 pixels. Everything suggests to me that Apple is going for "style" over image quality.
I tend to doubt that Apple will use waveguides like Hololens (1&2), Magic Leap, WaveOptics, Dispelix, and Vuzix. Given the available information, I would favor some form of thin freeform optics like what has been done by Tooz or by the lesser-known startup, Oorym. Tooz spun out of Carl Zeiss, which back in 2017 was rumored to be working with Apple. I'm not saying that it is either Tooz or Oorym specifically, but that the optic might be something similar to what these companies are doing. Freeform optics are going to be much more optically efficient than, say, diffractive waveguides, which translates into less power, which in turn saves on heat and battery. Freeform optics are bigger for a given resolution, but if Apple's resolution and FOV goals are modest (i.e., small), freeform optics would seem to be the best option.
Then we have the comment on cost being "plus prescription." This immediately says that Apple's Glasses are not going to have the large eye relief of, say, Hololens, which would let someone wear their own glasses. The cheap but ugly way, not the "Apple way," would be to add prescription inserts as Nreal and Magic Leap do. By process of elimination, this leaves building the prescription into the optics. Once again, this tends to point to thin freeform optics. Tooz, for instance, supports building the prescription into their optics, and Oorym claims they will be able to support prescription lenses.
I don't have much to go on in terms of the display type. I'm leaning toward some form of flat panel, either a small OLED or a tiny-pixel (~20-micron pixel pitch) LCD. It could also be something like a front-lit Himax LCOS device. When you are talking about a modest resolution and FOV, there are a lot of options.
Quoting the video (from auto-transcript with minor corrections and punctuation):
All data processing is going to happen on your iPhone, so similar to what we saw with the first gen Apple watch. That’s sort of what we can expect here with Apple Glass. Some of you might be surprised and bummed by that. But it makes sense. I know a lot of people are like expecting some sort of crazy holograph like Hololens type experience. Some life-changing device, so I want to sort of temper your expectations here, especially for the first generation product. But listen, it makes sense. Data processing happening, and depending on the iPhone in your pocket, that price point of $499 makes a lot more sense now. Anything more than that would probably increase the price of these things quite a bit.
Everything said above makes sense. Apple Glass is likely to be closer to an Apple Watch (448×368 pixels) in terms of display resolution than, say, an iPhone (1792×828 pixels). Apple Glass is likely to be more of a "data snacking" device and for playing lower-resolution Pokémon-Go-like games. So they can more than get away with using the phone's processor.
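To put the resolution gap in numbers, here is a quick pixel-count comparison. The Apple Glass figures are just my speculated range from earlier, not leaked specs:

```python
# Raw pixel counts: speculated Apple Glass display range vs. known Apple displays.
displays = {
    "Apple Watch": (448, 368),
    "Apple Glass, low guess": (400, 300),    # my speculation, not a leaked spec
    "Apple Glass, high guess": (800, 600),   # my speculation, not a leaked spec
    "iPhone": (1792, 828),
}
for name, (width, height) in displays.items():
    print(f"{name}: {width * height:,} pixels")
```

Even the high-end guess comes to under a third of the iPhone's pixel count, which is why rendering on the phone and sending the result over a wireless link is plausible.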
The "look like glasses" point tends to support the idea above that the prescription is built into the glasses. Supporting prescriptions probably works against the many "uni-lens" concepts I have seen. It would be a developmental and logistical nightmare to have a "uni-lens" with prescriptions.
Despite all the "shaming" and privacy concerns, I can't see this type of product being successful without the ability to take pictures. I can believe Apple would leave the camera out, just as they wouldn't make a large iPhone for so many years (sometimes Apple outthinks itself). If they make it past Gen 1, then expect a camera on Gen 2.
Putting the LiDAR on the right temple would seem to be worse for left-handed people (about 10% of the population). Also, gesture input can be very problematic in real-world situations.
As mentioned above, not only does this say there is a battery; it also implies that there is wireless communication to the iPhone. Like the Apple Watch, it will probably use a combination of Bluetooth and WiFi. But wireless communication takes a lot of power per bit, so sending video tends to be power-hungry. I would not be expecting to watch movies on Apple Glasses. Maybe a low-resolution video clip occasionally but not a lot more.
It sounds so simple to put "displays in both lenses" until you realize the ramifications. It was this one statement that made me want to write this article. It may sound like supporting two eyes is just twice as expensive, with a display for each eye. But it is both more expensive and much more complicated, because you have to deal with getting the two images in the right place for stereo vision.
Consider the variability in human interpupillary distances (IPDs). See, for example, Table 2 linked here. Covering the 1st percentile of adult women to the 99th percentile of men requires a range of 50 to 75mm. Dividing the spread by two for the two eyes, each eye needs about 12.5mm of adjustment. To cover 95% of women and men (57mm to 71mm), there needs to be ~7mm of adjustment per eye.
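The per-eye numbers above follow from simple arithmetic: each eye sits at half the IPD from the nose centerline, so each display must be able to shift by half the total population IPD spread. A minimal sketch:

```python
def per_eye_adjustment_mm(ipd_min_mm: float, ipd_max_mm: float) -> float:
    """Each eye sits at IPD/2 from the centerline, so each display
    needs to shift by half the total population IPD spread."""
    return (ipd_max_mm - ipd_min_mm) / 2

# 1st percentile adult women to 99th percentile men (50-75mm):
print(per_eye_adjustment_mm(50, 75))  # 12.5
# Covering ~95% of adults (57-71mm):
print(per_eye_adjustment_mm(57, 71))  # 7.0
```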
What a person perceives at depth/distance with stereo vision is a function of the difference in the image between the two eyes. If two people have different IPDs, then the location of the images in the optics has to be based on their eye locations, or they will perceive the image to be at different distances or not see the image at all. Dealing with differences in IPD has a few options:
1. Mechanically adjust the position of the optics for each user
2. Make the eye box large enough to support a wide range of electronic IPD adjustment
3. Make different versions of the optics for different IPD ranges
The fundamental problem is that if you are cutting optics cost to meet a price point, then supporting a large eye box for a wide range of electronic IPD adjustment is not possible. Pick your poison: #1 is ugly, unstable, and very "non-Apple"; #2 is too big and expensive; and #3 is a logistics nightmare. I would guess they would have to pick #1.
If I had one question I would like to see answered, it would be: "How is Apple Glass going to support dual displays for people with different IPDs?"
I give up; this makes no sense unless they are just talking about a mechanical issue, or about polarized sunglasses that can cause problems with some optics. Simple neutral-gray sunglasses should work with any display. I think there is some information missing in Prosser's statement.
I know I'm going to get the usual "it's Apple" and "Moore's Law" comments (see Saying "Moore's Law or Apple" Does Not Make AR a Consumer Product). What I tried to do above is take the information and make it fit within the known bounds of physics and economics. It looks to me that to get to a $499 price point with dual displays, Apple would have to do some extremely non-Apple-like things in terms of quality and margins.
You can figure out a lot by just looking at an AR headset. How dark are the lenses? If it looks like Nreal, then it has a birdbath combiner. You can often see the size of the eye box. You can see how the vision correction is supported (say with an insert) and how it will support IPD adjustment. All AR optics will have telltale signs of how the display image is combined with the real world. Diffraction gratings, for example, will show up as a rainbow of colors. Embedded mirrors can be seen if you look from the right angle.
It usually takes turning the device on to figure out the type of display. Field sequential color, indicative of LCOS or DLP, will be seen if the eyes move or you move the headset. You can generally tell the rough resolution and FOV from the content. If you see a screen-door effect, then you are probably looking at a display with a large pixel pitch. The contrast of the display can be another clue; OLED displays usually have higher contrast.
Then, of course, if you can reach the right people, it comes down to asking the right questions. Hopefully, this article gives the reader some clues as to what to ask.