Apple Glass Leak (Part 3) – Revenue Streams, There Has to be a Camera, and All-Day Wear

Introduction – Building on Prosser’s Leak

This article builds on Part 1 and Part 2 in this series on Jon Prosser’s Apple Glass Leak.

Waveguides showing “glint” light reflections

I want to caution everyone that I am trying to “fit” available technical information to what noted Apple “leaker” Jon Prosser reported in a May 29, 2020, YouTube video. Unfortunately, Prosser does not give a detailed description, such as whether the lenses were curved or flat, whether there was a front shield, or whether there were “glints” of color (see right). Prosser claims to have as-yet-unreleased videos, which will likely give further clues, if not definitively indicate the technology being used. I’m trying to decode a puzzle from limited information to make the “fit.”

While I might come to different conclusions with more information, doing these thought experiments helps close in on the answer faster. Writing down my thoughts in a public forum forces me to think through the issues and find missing elements. As this blog has an excellent reach in the AR community, it can also help turn up more clues.

Hololens 2 Glowing Eyes from “Front Projection”

For AR glasses to become widespread, most would agree that they should not make the wearer look like a cyborg with glowing eyes (see left) that show what the user is watching. I have not yet discussed Prosser’s verbal comment, “you couldn’t tell that the lenses were displaying anything,” which is an interesting “tell” about the optics.

As I mentioned last time, I’m curious as to how Apple is going to make Apple-like profit margins with Apple Glass. Creating a dual-eye headset sleek enough to look like glasses and having Apple sell it for $499 (as Prosser reported) does not, on the surface, add up. To make this “fit,” Apple must have a plan that taps into other revenue streams.

For the revenue-stream concept to work, the product has to be wearable all day, a focus of North (maker of “Focals”). North makes some good arguments about what it takes to be “all-day wearable” glasses that I would like to discuss in the context of Apple Glass.

Software and Other Revenue

AR gaming revenues are the most obvious form of “secondary revenue” (revenue from selling add-on software/games, peripherals, advertising, and data). Last I heard, Apple was still getting 30% of all the income from Pokémon Go-related purchases through the Apple App Store. Software for enterprise and consumer applications could be sold directly by Apple or generate revenue via their App Store.

Based on Prosser’s comments, Apple is not trying to come close to making a highly immersive AR gaming platform. The display is simpler, and the processing depends on the phone. Quoting Prosser’s video @07:01:

I know a lot of people are like expecting some sort of crazy holograph like Hololens type experience some life-changing device. So I want to sort of temper your expectations here especially for the first generation product

Many AR applications don’t want the cost and the bulk of, say, a Hololens 2. But the most significant profit potential could come from taking a smaller percentage of a much larger amount of what the user buys. Still, the headset needs to be reasonably affordable and worn regularly to enable significant secondary revenue.

Apple Glass Instead of Amazon Echo and Go

My May 22nd article ended with: “Setting the Price at $499 suggests that Apple wants to move considerable numbers of units versus making money on each unit. Apple appears to be thinking in terms of a secondary revenue stream.” Robert Scoble referenced my blog article and went on to discuss some of the applications in a May 25th LinkedIn article, “Can Apple sell Glasses for cost? Yes, here’s how it will turn LIDAR, AR, QR, and ApplePay into major profits.” As Scoble suggests, there could be QR codes. Then again, by using LiDAR as Prosser’s video suggests (I think there must also be a camera or cameras – more on that later), Apple Glass could identify products, as Scoble suggests, like Amazon Go stores, but with inside-out rather than outside-in product scanning.

Vuzix at CES 2019 had a section of their booth showing “A Day in the Life of Vuzix Blade,” as shown in a video by Charbax (right), which gives just a rough glimpse of how powerful it could be to wear glasses that can “scan” the world. This concept could be even more capable by combining LiDAR with a visual camera(s).

If you can see it, Apple could let you buy it, with Apple pocketing a commission. It is easy to imagine putting items in your shopping cart, having them tallied up, and just walking out of the store at the end.

Apple’s HomePod appears to have failed compared to Amazon’s Echo and its concept of ordering from your smart speaker. But what if everyone with Apple Glass could buy anything anyplace by just looking at it? And not only in a structured environment like a store. See someone wearing a pair of shoes you would like, and with a hand gesture, a pair in your size is on the way to your house via Apple Pay.

Really, No Camera?

There is one gigantic problem with the shopping scenario above, and that is Prosser’s comment that there is no camera due to privacy concerns. Quoting Prosser’s video @10:38:

No cameras actually in it due to privacy. Who knows we might see cameras in the final version but I again I didn’t see it unless you count a LiDAR sensor as a camera.

It is hard to believe that Apple would build AR glasses at such a low (perhaps negative) margin and then cripple the product in a way that would make it less desirable while hampering, if not killing, a major source of revenue. The need for a camera on a headset is much greater than on a phone. Prosser, in the comment above, leaves open the possibility of a camera in the final version.

The “Glasshole” backlash against Google Glass had more to do with showing off expensive new technology than with privacy, in a world with cameras everywhere, including on every phone. Leaving out the camera would ironically echo Sony’s Walkman division, which, hamstrung by Sony’s 1988 purchase of CBS/Columbia Records, focused on copy protection over ease of use and left the portable music player market open to Apple’s iPod. Quoting from The New York Times 2005 article, “How the iPod Ran Circles Around the Walkman,” with my bold emphasis:

. . . Sony had gone Hollywood. Flush with profits generated in no small measure by the Walkman, and taking advantage of the strong Japanese yen, Sony acquired CBS Records for $2 billion in 1988 and Columbia Pictures for $3.4 billion the next year. . . .

. . . At Sony, having both digital players and music in the same corporate family has actually been detrimental to its hardware interests. The music label directed the hardware group to make copying impossible, to the extent that until recently [written in 2005], customers could not enjoy on their Walkmans the music from their own legally bought CD’s that they had encoded in MP3 format.

I have a hard time believing that Apple would so outthink themselves as to not include cameras on a headset when they put at least four on their pro-model phones. I choose to think, maybe incorrectly, that Prosser saw a prototype without all the capabilities of the final product.

Typical LiDARs Don’t Make Good 2-D Scanners and Don’t Recognize Color

Prosser makes a big point that Apple is putting a “heavy, heavy” focus on scanning Apple QR codes. Quoting from his video @11:23:

With what I’ve seen a heavy heavy focus on scanning what looks like proprietary Apple QR codes.

The “proprietary Apple QR codes” would likely be the “Cosmic” style ones in “Leaked pics from Apple’s AR app Gobi,” by Josh Constine on May 18th, 2020.

First, there is the basic technical issue, namely that LiDAR scanners typically aren’t designed to scan bar codes, but rather to capture 3-D point clouds. It may be possible that Apple could have a special LiDAR scanner or a separate scanner.

But it would likely cost a lot more than a cheap camera that could do a better job. LiDAR would not seem like the best technology for reading the proprietary Apple codes from the Gobi leak. And if the codes are in color and the color is not just cosmetic, then LiDAR, with its infrared scanning, would be blind to the color variations.
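
To show how low the bar is for a camera here, below is a minimal sketch (my illustration, nothing from Apple) that decodes an ordinary QR code from a single webcam frame using OpenCV’s built-in detector. A depth-only LiDAR has no comparable path to reading a flat printed code.

```python
# Minimal QR decode from one camera frame with OpenCV (illustrative only).
import cv2
from typing import Optional

def read_qr_from_camera(device_index: int = 0) -> Optional[str]:
    """Grab one frame from a webcam and try to decode a QR code in it."""
    cap = cv2.VideoCapture(device_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return None
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    # detectAndDecode returns an empty string when no code is found.
    return data or None

if __name__ == "__main__":
    print("Decoded:", read_qr_from_camera())
```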

The combination of LiDAR depth information with camera images would improve identifying things in the real world. Scandy is a company that makes such a product for iPads and iPhones, primarily for 3-D capture. But something like Scandy combined with AI software could also be used to identify products in the real world.
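
As a toy sketch of the general depth-plus-color idea (my own example, not Scandy’s or Apple’s code), the function below projects LiDAR 3-D points through an ideal pinhole camera model so that each point picks up an RGB sample, the kind of fused data that product-recognition software would want.

```python
# Project LiDAR points into an RGB frame with a pinhole camera model
# (toy illustration; real systems also handle lens distortion and the
# extrinsic transform between the LiDAR and the camera).
import numpy as np

def colorize_points(points_xyz: np.ndarray,   # (N, 3) in camera coords, meters
                    image: np.ndarray,        # (H, W, 3) RGB frame
                    fx: float, fy: float,     # focal lengths in pixels
                    cx: float, cy: float):    # principal point in pixels
    """Return an (M, 6) array of [x, y, z, r, g, b] for points landing in-frame."""
    valid = points_xyz[:, 2] > 0               # keep points in front of the camera
    p = points_xyz[valid]
    u = (fx * p[:, 0] / p[:, 2] + cx).astype(int)
    v = (fy * p[:, 1] / p[:, 2] + cy).astype(int)
    h, w = image.shape[:2]
    in_frame = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    p, u, v = p[in_frame], u[in_frame], v[in_frame]
    colors = image[v, u]                       # sample RGB at projected pixels
    return np.hstack([p, colors.astype(float)])
```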

Important Clue – Prosser: “You couldn’t tell that the lenses were displaying anything”

While he did not list it as a “feature,” one of the most interesting verbal comments came @11:34 in Prosser’s video:

Referring to specifically what I’ve seen here you couldn’t tell like if you’re looking at somebody wearing the frames you couldn’t tell that the lenses were displaying anything. Only if you are the wearer could you actually see what’s being displayed

Nokia/Hololens Diffraction Grating

His comments are important both technically and concerning the market viability of a consumer product. As discussed in my last blog entry, holographic elements like Akonia’s (acquired by Apple) generally function similarly to diffraction gratings. A classic problem seen with both diffractive and holography-based waveguides is that as the grating bends the light, it also splits it into “orders,” as shown in the figure from the Nokia patent (used in Hololens) on the left.
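
To make the “orders” concrete, here is a back-of-envelope application of the grating equation, sin(θm) = mλ/p, using an illustrative pitch (not Hololens’ actual numbers). Every propagating order is a separate copy of the light, and the unwanted orders are where forward-projected light comes from.

```python
# Grating-equation check: how many diffraction orders propagate for a given
# wavelength and grating pitch (illustrative numbers, standard physics).
import math

wavelength_nm = 532          # green light
pitch_nm = 1200              # assumed grating pitch for illustration

for m in range(-2, 3):       # check orders -2 .. +2
    s = m * wavelength_nm / pitch_nm
    if abs(s) <= 1:          # only non-evanescent orders propagate
        print(f"order {m:+d}: {math.degrees(math.asin(s)):+6.1f} degrees")
    else:
        print(f"order {m:+d}: evanescent (does not propagate)")
```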

Hololens 2 “Glowing Eyes”

In the case of Hololens and Magic Leap (also using diffraction gratings), about as much light projects forward as is projected toward the eyes. The result is “glowing eyes” as seen in pictures and videos (right).

I’m told that the forward-projection effect can be significantly reduced with both diffraction gratings and holographic elements, but the companies involved have not said whether there are undesirable side effects as a result. Digilens, which also uses a form of holographic elements, talks about “suppressed eye glow” in their recent Visual V1 brochure.

Dual Displays Cause So Many Issues

As I discussed at length in Part 1, Prosser’s “Displays in BOTH lenses” seems to be the hardest part to fathom. Dual displays require a huge eye box to support interpupillary distance (IPD) adjustment, which in turn means larger and much more expensive optics, as discussed in Part 2. The combination of dual displays and large eye boxes results in much more than 2X the power consumption, which in turn requires a bigger battery and heat management. Even with relatively simple demonstrations, the issues caused by dual displays are hard to overstate, as North’s presentation (discussed later) shows. A rough sense of the scaling is sketched below.
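
Here is that scaling as a crude toy model (my own assumptions, not measured data): if display power grows roughly with the eye-box area that must be filled with light, two IPD-covering eye boxes cost far more than twice a single small one.

```python
# Toy power model: assume power scales with (number of displays) x (eye-box
# area that must be filled with light). Numbers are illustrative only.
def relative_power(num_displays: int, eyebox_w_mm: float, eyebox_h_mm: float) -> float:
    return num_displays * eyebox_w_mm * eyebox_h_mm

single_small = relative_power(1, 6, 6)    # one display, fixed-fit 6x6 mm eye box
dual_large = relative_power(2, 12, 8)     # two displays, 12x8 mm boxes to cover IPD range

print(f"power ratio: {dual_large / single_small:.1f}x")   # -> ~5.3x, not just 2x
```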

Making AR “Glasses” Is Hard

I think the following figure, Starts with Ray-Ban®, Ends Up Like Hololens, summarizes what tends to happen with AR headsets. The expectations are for something like Ray-Ban (or Oakley) glasses with a very wide FOV, and they end up with something like a helmet (Hololens). It will be interesting to see if and how Apple has avoided spiraling away from a glasses-like design.

The AR “Tree Swing” cartoon

Daniel Wagner, formerly CTO of Daqri and currently Director of Software Engineering at Snap, along with Louahab Noui and Adrian Stannard, wrote an excellent LinkedIn article, Why is making good AR displays so hard? I have copied a table from that article and added my comments in blue in square brackets below.

  • Field of view
  • Eye box size [Makes seeing image easier but affects battery size/weight and heat management]
  • Brightness, transparency and duty time
  • Contrast [In AR, the transparency of the “black” and “picture frame”]
  • Uniformity & color quality and Chromatic aberrations
  • Resolution [Angular and number of pixels]
  • Real-world distortions and Virtual image distortions
  • Eye safety
  • Eye relief
  • Peripheral vision [Blocking vision is a safety issue]
  • Depth perception [and Vergence-Accommodation Conflict (VAC)]
  • Size, weight & form-factor
  • Optical efficiency [which affects battery size/weight and heat management]
  • Latency
  • Stray light
  • [Ruggedness (drop and scratch for front and back surfaces)]
  • [Safety issues due to a lack of “situational awareness”]

Some Lessons from Digilens v1

Digilens very recently released a Visual V1 brochure showing their concept for lightweight dual-display glasses that might give some insight into what Apple Glass might require. I have copied a few key figures from that brochure below.

Figures from the Digilens Visual V1 brochure

Digilens’s concept shows a removable display, maybe a workaround to the weight issue, as the two displays can be removed when not needed. While this may cut the weight, it also reduces the utility of the glasses. You can also see in the central figure a front shield to protect the fragile waveguides. Waveguides cannot tolerate scratches, so they need to be protected. So if Apple is using Akonia waveguides, they will need some kind of shield.

The Digilens V1 has clipped-in corrective lenses, which many others, including Magic Leap and Nreal, are using. Apple might embed the waveguides inside corrective lenses to look like ordinary glasses. The basic idea of integrating the waveguide in a corrective lens is not new. For illustrative purposes, North recently filed a WIPO patent application showing a waveguide embedded in a lens (see right).

Embedding the waveguide in the corrective (or non-corrective) lens addresses the issue of protecting the waveguide, eliminates the external corrective lens, and may save on weight. But it also complicates the process. With an embedded waveguide, the AR glasses can’t be shared, and if the person’s prescription changes, they must buy a new set of AR glasses. It also creates a logistical problem, as each set of AR glasses becomes customized.

North on “All-Day Smart Glasses”

North has been making some “marketing noise” about their Gen 2 glasses. In particular, Stefan Alexander, VP of Advanced R&D, spoke at the Photonics West AR/VR/MR 2020 conference (free video of his talk) and in a two-part interview on The AR Show Podcast (part 1 and part 2). I don’t want to get into the merits of or speculation about North’s Gen 2 glasses here; instead, I want to focus on what North thinks it takes to be “All-Day Smart Glasses.” The Apple Glass that Prosser describes would violate many, if not all, of North’s criteria.

Pertinent to the discussion of Apple Glass is North’s explanation of what it takes to have smart glasses that can be worn all day. If the glasses can’t be worn instead of regular glasses, then not only will one look funny wearing them, but where will you keep them when not in use? Thad Starner, a professor at Georgia Tech, a fan of North’s Gen 1, and a wearer of AR headsets for decades, made similar points in his 2019 AR/VR/MR presentation. Having AR glasses that can be worn continuously dramatically changes how they may be used. Just as having a camera on a cell phone means pictures and videos can be taken at any time, AR glasses that are always available would change what people do with them.

North’s “All-Day Glasses” requirements table

The table on the left is taken from Alexander’s AR/VR/MR conference presentation. This is an extremely tough set of requirements to meet. The table may be a little self-serving for North, but he makes some very valid points for glasses that can be comfortably worn all day.

Alexander also says that to meet North’s Gen 2 goals, they had to limit the FOV to less than 25 degrees, use a single display (monocular), and have AR content that is only “semi-registered” to the real world without using full SLAM. Relative to North’s Gen 2, Prosser’s Apple Glass leak suggests both dual displays and, with the LiDAR, somewhat better registration to the real world.

I would now like to go through each of the key points made in Alexander’s table, plus a couple he left out.

Weight

Typical glasses are 20 to 35 grams, and 50 grams is pushing the limit of what can be supported by just the nose and ears for extended periods (a well-known ergonomic fact). If the glasses are more than 50 grams, they start to become a “headset” that needs support from the rest of the person’s head.

For reference, Hololens weighs 566 grams. Magic Leap’s headset weighs 325 grams, and its Lightpack is another 415 grams. Nreal claims its headset weighs in at 88 grams, though it is not clear whether this includes the cable, and it is likely without prescription inserts. Importantly, it is not just the weight but how it balances on the head. Most AR glasses tend to be front-heavy, which concentrates too much force on the nose, as the simple model below illustrates.
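
A simple two-support statics sketch (an idealization of my own; real frames distribute load more subtly) shows why front-heavy designs punish the nose: treat the nose and ears as the two supports of a beam, with the weight acting at the center of mass.

```python
# Two-support beam model of glasses: the closer the center of mass is to the
# front (where the optics usually sit), the larger the share of the weight
# carried by the nose. Distances are illustrative assumptions.
def nose_load_grams(total_g: float, com_from_nose_mm: float,
                    ear_from_nose_mm: float = 100) -> float:
    fraction_on_nose = 1 - com_from_nose_mm / ear_from_nose_mm
    return total_g * fraction_on_nose

print(nose_load_grams(50, 20))   # optics up front: ~40 g of 50 g rests on the nose
print(nose_load_grams(50, 50))   # balanced mid-temple: ~25 g on the nose
```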

Transparency

ANSI Z87.1 for eye protection defines that “clear lenses shall have a luminous transmittance of not less than 85%,” so North may be a little strict with 95%. A typical piece of clear glass with no anti-reflective (AR) coating transmits less than 90%. With decent AR coatings, the transmission is typically in the 92% range, and North does not say whether their figure is for the “combining element” alone. Many, if not most, AR/MR headsets today are less than 60% transmissive. It would seem to me that better than 85% is a more realistic target.
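
For reference, here is where such transmission numbers come from: Fresnel reflection at each air/glass surface plus some bulk absorption (the reflectance formula is standard optics; the absorption value below is my assumption for illustration).

```python
# Each air/glass surface reflects ((n-1)/(n+1))^2 of the light at normal
# incidence; two surfaces plus bulk absorption set the overall transmission.
def lens_transmission(n: float = 1.5, bulk_absorption: float = 0.03) -> float:
    r_surface = ((n - 1) / (n + 1)) ** 2        # ~4% per uncoated surface
    return (1 - r_surface) ** 2 * (1 - bulk_absorption)

print(f"per-surface reflection: {((1.5 - 1) / (1.5 + 1)) ** 2:.1%}")  # ~4.0%
print(f"uncoated lens overall:  {lens_transmission():.1%}")           # ~89%
```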

Seeing the User’s Eyes (My Addition)

While on the subject of transparency, a related issue (not mentioned by Alexander) is the ability to see the user’s eyes and what they look like. Humans innately read other people’s eyes and will perceive people who hide their eyes as suspicious.

Much of the light that illuminates the eyes comes through the lenses when glasses are worn. Thus, the light must pass through the lenses twice for another person to see the wearer’s eyes. With, say, 30% transmissive lenses (e.g., Nreal), the eyes appear about 9% (30% squared) as bright as they would without the glasses. This effect can be seen in the comparison on the left: with >85% transmissive Lumus waveguides, you can clearly see the eyes, versus ~30% transmissive Nreal, where the eyes are barely visible, and ~15% transmissive Magic Leap One, where the eyes are almost totally blacked out.
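
The double-pass arithmetic from the paragraph above, in a few lines (transmission figures as quoted):

```python
# Apparent eye brightness goes roughly as transmission squared, because the
# illuminating light crosses the lens once in and once back out.
for name, t in [("Lumus", 0.85), ("Nreal", 0.30), ("Magic Leap One", 0.15)]:
    print(f"{name}: {t:.0%} transmissive -> eyes appear {t * t:.1%} as bright")
```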

Eye Box and Eye Illumination (My Addition)

Another human-factors issue is that the display tends to light up the eyes as if spotlights were on them. Simply put, the “eye box” is the range over which the eye can move and still see the whole image. To make a large eye box, which is a good thing from the wearer’s perspective, light ends up being projected not just into the pupil but all around the eye, just in case (see right).
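
A rough upper bound (an idealized calculation assuming the eye box is uniformly illuminated) shows how wasteful a large eye box is: only the light that happens to enter the pupil is ever seen.

```python
# Best-case light efficiency of a uniformly filled eye box is roughly
# pupil area / eye-box area; everything else lights up the face instead.
import math

pupil_d_mm = 4.0
pupil_area = math.pi * (pupil_d_mm / 2) ** 2          # ~12.6 mm^2

for w, h in [(6, 6), (10, 8), (12, 10)]:
    eff = pupil_area / (w * h)
    print(f"{w}x{h} mm eye box: at most {eff:.0%} of the light reaches the pupil")
```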

The North Gen 1 had almost no eye-box illumination, which is good. Still, a tiny eye box means that even when the glasses are precisely aligned, as with the North Gen 1 (even with the image replicated four times), the image readily disappears for the wearer and can be hard to find and see. Direct laser-scanning displays (without pupil replication) have an even tinier eye box, virtually the size of a person’s pupil.

Doug Lanman, a director at Facebook Reality Labs, gave an interesting presentation at the Photonics West AR/VR/MR 2020 conference (unfortunately, a video of his presentation is not available). He mentioned Facebook R&D experimenting with an eye-tracking steering mirror combined with a laser scanning projector to keep the image projecting through the pupil. I should add that Facebook just bought exclusivity with Plessey for MicroLEDs after buying the MicroLED startup InfiniLED in 2016. So it seems like Facebook is putting its big money on MicroLEDs.

Aside – “In AR, if you can think of it, Facebook R&D has tried it”

Looking at the papers and patents that Facebook R&D has published in the last few years, I now have a saying: “In AR, if you can think of it, Facebook R&D has tried it.” While Apple garners most of the attention of the casual AR prognosticators, Facebook has been leaving a much bigger footprint.

I should also add that Lanman’s talk, while the most recent, is far from the first time I have heard of using a steering mirror to track a pupil to support a larger eyebox. I have also heard of companies trying to move the eyebox electronically. While electronically moving the eye box adds complexity, it has multiple advantages in addition to keeping the eye from being “lit up.” These advantages include reducing display power and supporting high resolution.

At the AR/VR/MR 2020 conference, Avegant’s CEO, Ed Tang, showed (free SPIE video here) a tracking steering mirror for a foveated display (see here for an explanation of foveated displays). In his concept, they used a “1080p chip” (most likely a DLP), but it could be a MicroLED, OLED, or laser beam scanning display. The same steering-mirror technology is one way to support a more extensive eye box without illuminating the whole area around the eye.
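
A quick pixel-budget calculation (with FOV numbers chosen for illustration, not Avegant’s specs) shows why steering a small high-resolution region is attractive: the same imager delivers far more pixels per degree when it only covers a small, eye-tracked area.

```python
# Same 1080p imager, two ways of spending its horizontal pixels.
H_PIXELS = 1920

def pixels_per_degree(fov_deg: float) -> float:
    return H_PIXELS / fov_deg

print(f"spread over 50 deg:    {pixels_per_degree(50):.0f} ppd")   # ~38 ppd
print(f"steered 15-deg fovea:  {pixels_per_degree(15):.0f} ppd")   # ~128 ppd
# ~60 ppd (1 arcminute per pixel) corresponds to 20/20 vision, so only the
# steered case exceeds "retinal" resolution.
```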

Curved Lenses Fit and Look Better

Alexander makes the point that the lenses need to be curved to be something people would be willing to wear. Curved lenses require that waveguides or other optical combining structures be somehow embedded in the glasses. Maybe Apple Glass could get away with clever industrial design using a multi-part structure of a cover, waveguide, and outer protective cover like the Digilens V1 (above).

Artifacts

Comparison of the view through Magic Leap One and Hololens diffraction gratings

The most important thing for all-day wear is that when the glasses are “off,” they don’t cause vision problems. There can’t be artifacts or other issues that block the view of the real world, or they might be dangerous in some situations. As I discussed in AR/MR Combiners Part 2 – Hololens Magic Leap Review Part 1 – The Terrible View Through Diffraction Gratings, the view out through diffractive waveguides leaves something to be desired, as the pictures taken through the Magic Leap One and Hololens 1 demonstrate. The same “physics” that makes the light exit the waveguide toward the eye causes light from the real world to also be captured and sent toward the eye.

Eye Relief Less than 18mm

Alexander’s table shows <18 millimeters for “eye relief,” the distance from the eye to the lens. The eye relief, or “vertex distance,” is typically 12 to 14mm with normal glasses. North seems to be saying that they think they can get away with up to 18mm without the glasses looking strange, but this is a bit farther out than one would like. Alexander, in the presentation, suggests this was a hard spec for North’s Gen 2 to meet.

FOV and Content

Possible North Gen 2 content (from Alexander’s presentation)

Stefan Alexander goes on in his presentation to point out that to meet the requirements in the table, they had to accept that the FOV would be less than 25 degrees.

The images on the left, taken from his presentation, give an idea of typical content with the North Gen 2. It would be classified as “data snacking,” with about the same resolution as, or perhaps a bit less than, a smartwatch. The contrast would be worse since the background is whatever is in the real world. The image quality shown in Alexander’s presentation is better than is actually possible, but it is realistic in terms of the amount of potential content.
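
For a feel of what a sub-25-degree FOV means, some simple trigonometry (viewing distances are my assumptions): the virtual image is roughly smartphone-sized at arm’s length.

```python
# Apparent width of a 25-degree FOV at a given viewing distance.
import math

def apparent_width_mm(fov_deg: float, distance_mm: float) -> float:
    return 2 * distance_mm * math.tan(math.radians(fov_deg / 2))

print(f"{apparent_width_mm(25, 400):.0f} mm wide at 400 mm (arm's length)")  # ~177 mm
print(f"{apparent_width_mm(25, 2000):.0f} mm wide at 2 m")                   # ~887 mm
```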

Side Note – North Showed Impossible Examples – Please Stop It!

This image has an empty alt attribute; its file name is North-projecting-black-pixels.jpg

One of my pet peeves is when companies show concepts that violate the laws of physics. Alexander’s slides included impossible content where they would have to project black. I think it undermines one’s credibility to show images that are not physically possible with the technology.

Conclusion

North’s presentation shows what a daunting task it is today to be “All Day Wearable” with even a simple monocular display. The laws of physics don’t change for Apple, and there are many major physics, cost, and human factors challenges.

With a $499 price and the form factor described, there is zero chance that Apple is making much money; more likely, they would be losing money on each Apple Glass unit. The only way Apple would sell something at cost is if they have a plan to make it up somewhere else.

Prosser’s saying he didn’t see a camera would seem to put a big hole in the whole secondary revenue strategy. Either Prosser just saw a semi-capable prototype, or Apple has lost its collective mind. It would be like George Lucas giving away the toy rights to Star Wars. Even ignoring the secondary revenue, leaving out the camera would seem to cripple the product, as people will expect to have cameras like they have on their phones.

Appendix – Connections with James Burke in 1978

I am an electrical engineer who, for my first 20 years in industry, designed CPUs, graphics and image processors, and new memory architectures. Thanks to working in projection-type displays for the last 20 years and writing this blog for the past 8.5 years, I have gotten exposure to a wide variety of microdisplays and AR optics (more on my background here). Rather than repeat marketing claims, I’m continually trying to take the available information and figure out how things could be built, with an eye to whether they will be successful in the market.

Back in 1978, when I was working long hours on the TMS9918 (“Sprite Chip”), one of my first luxury items was a Sony Betamax SL-8600 VCR. That recorder retailed for $1,395 in 1978, or about $9,500 in 2020 dollars when adjusted for inflation, so it was a luxury back then for an electrical engineer just out of school. The Connections TV series came out that year, and I recorded it. Each episode of Connections showed a history of some of the fundamental discoveries that led to a modern-day invention. While some of the connections are a bit tenuous, the series makes the point that credit for “the invention” is given to the person who has the last vital idea that makes the concept practical. Perhaps the best example in the series is James Watt (as in power), “who didn’t invent the steam engine, he only invented a vital bit of it,” in Connections Episode 6, Thunder in the Skies. In the same episode, at ~39 minutes, Burke talks about using quinine in the 1800s to treat malaria. It seems that quinine has been in the recent news 😊.

Karl Guttag

33 Comments

  1. Excellent summary as always, Karl. I meant to add something to your previous blog but will mention it here. With surface gratings, the out-coupled light (in the wrong direction) cannot be reduced below about 30%; no magic billions in funding will change this. Kogelnik’s coefficients tell us volume holograms (as used by Digilens and Truelife Optics Ltd) can enable much less diffracted light in the opposite direction, but holographic materials by their nature do suffer from scatter – so it is a balancing act of how much pupil expansion (if using a waveguide) you want vs. the amount of scatter (which degrades the image). A holographic free-path combiner, on the other hand, does not suffer from such successive degradation, but of course relies on a reasonable exit pupil from the relay optics.

    • I watched many of the presentations at AR/VR/MR but missed this one. You are correct, it is very pertinent, and I wish I had seen it before; I try to keep up on everything. It looks similar to Tooz but uses a single curved mirror rather than a Fresnel-mirror-type combiner. I’m thinking about doing a follow-up article.

      • Hi Karl and frankenberry, thanks for mentioning our work! We intentionally gave up the peripheral resolution to get sharper central images. We called this ‘optically foveated’ in the paper. For example, we could achieve either 16 cpd uniform resolution over the FOV (40×20 degrees) or 30 cpd at the center and 5 cpd at the periphery. Please check Fig. 6(a) in our paper. We thought this was the best optimization for the given pixel count (1080p), and it wasn’t as disturbing as Fig. 11(a) when you actually looked at the demo with your eyes, due to the foveation. But if you think the uniform optimization is better, we can make a solid, uniform 16 cpd resolution display by just changing the freeform image combiner shape.

        Please also check another experimental video of our wearable prototype: https://www.youtube.com/watch?v=UEuJzio1w-I

        If you have any questions, please feel free to send me an email. Thanks!

  2. Love the AR content, but I just want to comment that Connections was absolutely the best show ever, and I’m always sad that more people don’t know about it.

  3. Poser or Prosser?

    I get the hint that this guy has no source other than the public domain. His product outline seems to follow a checklist of the weaknesses of all other products, combined with Apple patent filings and known Apple patterns. Ever since the iPod, we have learned how Apple designs a more consumer-friendly version of a device that already existed, floods the market, and becomes seen as the inventor – time and time again.
    Excluding a camera: typical Apple privacy talk. Material and shape choices: pretty predictable. iPhone pairing: logical, predictable, sensible – anybody that worked on AR glasses in the past 10 years came to the same solution when it comes down to the compute unit…

    There is zero novelty in his claims that somebody with a little bit of industry knowledge could not put into a prediction. Cooked at Apple’s preferred temperature, served with slick design.

    • Prosser seems to have a pretty good reputation for his leaks and is not just making things up. According to Apple Insider:

      Jon Prosser is an avid Apple leaker with sources throughout the company and supply chain. His predictions based on leaked information became prominent in early 2020 as he nailed down details of the new iPhone SE weeks before launch. The often moving targets of release dates and prices are hit and miss, but when it comes to product details, Prosser has been very accurate so far.

      • And what is so revealing so far? That glasses come with two lenses for both eyes?
        Or that they will make a display for each eye? Or that it might come late this year, or next year?
        The shape and the like were depicted in abstract ways in the patent filings last year.

        The only thing that is tangible as far as I can tell is the target price.

      • My big takeaway, assuming it is true is the combination of:
        1. Dual displays
        2. $499 price
        3. Can’t tell the device is on (it is possible that he was just not very observant). It is not clear whether this means that the display is transparent and you can see the user’s eyes.
        4. No sunglasses – sort of suggests that the display is transparent.
        5. Sooner rather than later (this was the biggest claim/leak)

        He also indicated that the resolution and/or FOV was relatively modest.

        There is enough information when combined with intelligent technology assumptions to get some ideas as to what Apple might be doing.

        It does not seem that Prosser understands the underlying AR technology, or we would have much more information to work with. If an “expert” had a set of glasses in their hand (or even some good pictures and/or videos), they could tell a lot about the technology being used.

      • Sure, if you look at some other tech Apple has developed in the past years, the pieces begin to fall into place. Like the structured-light sensor in the iPhone’s Face ID or the ultra-compact LiDAR of the iPad.

        Look at the XDR display’s optional nano coating that disperses glare; it really works extremely well without distorting the image.

        Compute power in new iPhones and iPads rivals many desktops.

        Dual displays are a no-brainer for me, just from a marketing point of view over Google Glass.

        Apple never makes the best device, but the most marketable and most user-friendly one, for the best user experience. So don’t expect a light-field display, but a daylight-capable HUD.

        Then the price point starts to fall into place too…

  4. Hi Karl, just found your blog via a YouTube video. I became a VR developer a couple of years ago after trying a Vive and seeing Keiichi’s Hyperreality video. Consumer AR, if it ever arrives, is the platform I’d love to develop for. Although I can’t understand all the optics analysis, I am still thankful for your thoughts on AR. So much better than watching 5 videos from people who regurgitate information. Cheers.

  5. As far as the price/Apple margins problem… anything is possible.
    Like it’s $500 if you get a bundle of iPhone, Apple Watch, and Apple Glass, but without that it’s $750.
    Or it’s $500 if you get the Apple Card too, or if you pay for it with the Apple Card.
    And there are tons of other possibilities, like returning an iPhone X or newer and getting the Glass for $500 instead of $750.

  6. Hello Mr. Guttag,
    It seems plausible that the Apple AR glasses will have an RGB camera but not allow the user access to the camera output. Thus the camera would work with QR codes, etc., while not stirring privacy concerns. Not only are Apple good at locking down their hardware, but they are also widely *perceived* by the public to be good at locking down their hardware (see FBI requests for iCloud accounts, and the T1 chip in MacBooks that runs a verified secure microkernel and acts as a gatekeeper to the webcam and microphone).
    This isn’t just an Apple thing – some Sony camera sensors place memory cells directly behind the sensor pixels as a buffer to facilitate burst shooting. Sony are now prototyping sensors incorporating facial-recognition silicon in the same way (initially aimed at retail security cameras). Both of these things are done to save power, according to Sony.
    If power is a concern (which in AR glasses it surely is), then why have a camera module output an image that must be processed elsewhere? If all you want is a text string from a QR code, then a camera module that can decode QR codes itself and only output the resulting text string is the most power-efficient way of doing it. Apple could then say, hand on heart, that their glasses can’t capture an image.
    Just a thought! Keep up the good work.

  7. While the basic models might start at $499, I expect they’ll make tons of money on “fashion” frames. Look at what people are prepared to pay for watch straps…

    As others have said, I expect a camera that only Apple can use – developers and even users won’t have access. Look forward to seeing it jailbroken!

    I expect their VR (Apple Quest) HMD will never see the light of day. If it does, it’ll be like Apple TV, HomePod, etc.: an expensive and second-rate experience.

    • I can’t see where fashion frames alone will satisfy Apple if they are selling a product without their expected margins.

      Apple may have overthought the issues with respect to privacy, but they might as well take the cameras and GPS out of the iPhone if they are really serious. There are too many revenue sources and application uses for a camera. They can’t have a high volume consumer product based on jailbreaking. It wouldn’t be the first time a product got screwed up by over-thinking the issues.

      • It is now the social norm to have a phone with a camera. Over the last fifteen years, some workplaces that once banned cameras have become relaxed (or resigned) about them.
        Has our society changed much since the Google Glass backlash? I don’t know, but certainly topics like mass facial recognition from CCTV, and state surveillance in general, have been discussed more.
        Apple have recently updated Siri so that it responds to the voice command “Hey Siri, I’m being pulled over” by starting to record video and audio. This at a time when members of the public are urging each other to video all police interactions. Could it be that the general public will be more tolerant of glasses-with-cameras today if they (a) accept that their privacy has already been compromised by CCTV, etc., and (b) see that video footage from fellow citizens can be a defensive and not just an oppressive tool? I don’t know.
        Anyway, regardless of the above, the idea of a camera module that incorporates an Apple-designed image processor that outputs different classes of data, such as [alphanumeric string], [low-res monochrome image], [full res], or [video], depending upon a permissions system, just feels like a very Apple thing to do. Even if public perception of a camera were not an issue, it seems a very Apple thing to do purely for minimizing power consumption.
        It is very Apple to deny users access to hardware (the cellular hardware in a 4G iPad can be used to make emergency telephone calls, but only emergency calls). It’s very Apple to not implement hardware if the power consumption is too great, such as 3G cellular in the first iPhone. And it’s very Apple to be conscious of public perception – they’ve never been keen on Macs being used for video games.

      • The Apple Glasses might extend the iPhone and work with it in combination. So scanning a QR code could be done with the iPhone, as well as typing/reading text, taking pictures, and so on. The iPhone might also be used as a controller (a bit like a Star Trek Tricorder). The Apple Glasses would be connected to the iPhone, which has a lot onboard: cameras, a LiDAR, a touchscreen, sensors, etc. The moment you scan a QR code with the iPhone, it will get a position in 3-D space with its own sensors. With a user wearing the glasses and holding the iPhone in hand, the front camera could also be used to track the eyes, do face recognition, and so on. The possibilities that result from iPhone + Glasses should be taken into consideration. I guess in this scenario, you would do what you normally do with a smartphone and only a few little things with the Glasses as extra functionality. The LiDAR in the Glasses could be used to track hands (or one hand, if you hold the phone with the other), even as a second scanner without showing a picture in the Glasses. Just a thought. It would also explain the price, and if it is true that they are working with Valve, they might get some remarkable applications to show.

  8. Hi Karl,

    I was wondering if you saw the new paper from Facebook about their “holographic lenses” (linked below). They are impressively thin and use a laser engine for their light source, and I don’t quite understand how they work.

    I was wondering if you had any idea how they pulled this off. The mention of “holographic” lenses and a tiny eyebox made me think of the Intel Vaunt/North Focals (who were bought by Google today) approach.

    https://research.fb.com/blog/2020/06/holographic-optics-for-thin-and-lightweight-virtual-reality/

    Hope you are doing well.

    • Yes, I saw that Google bought North. I was busy doing the Hololens 2 display evaluation.

      I was interested in what North was going to do for their V2 and was wondering if they were going the route of pupil replication and a waveguide à la Hololens, only smaller, or if they were going to stay with direct laser scanning and some minor pupil replication. From what North said about the V2, it was still going to have a small FOV and then improve on the resolution/image quality.

      North is what I would classify as “data snacking AR.” The content is about the same as a smartwatch. The question becomes how many people will want this use-case when the cost of entry includes getting expensive custom glasses made.

      It’s easy to read too much into Google buying them, as it was just petty cash to Google.

  9. Hi, Karl

    Best analysis ever for AR devices. I’m currently working on China stock market analysis for AR markets, and as your analysis suggests, I don’t see any high tech actually developed by a Chinese company. All the cutting-edge optics technologies are in the US and Israel. The only companies worth investing in in China are the factories that can provide massive quantities of quality parts for Microsoft and Apple. Can’t wait for more articles.

    • Thanks.

      As for 9to5Mac, you have to be careful about what they and Patently Apple write about Apple patents. Only a very small fraction of the patents you see filed ever get used. In the case of the specific patent on using eyes as an input device, it can be done, as Hololens does it. It has a problem with precision without some feedback (such as a dot in the FOV). There is a huge amount of leverage/error in trying to figure out exactly where the eyes are pointing. Using blinking to control “select” is going to lead to a lot of false positives. Input is one of the big issues with AR, particularly when trying to keep the hands free. Many companies are working on wrist nerve/muscle detectors to detect finger movement/actions, but I don’t know how well these work beyond demo situations.

    • The speaker was Tom Furness, who, it sounds like, consulted with Magic Leap. Furness is most associated with the University of Washington HIT Lab. The video is a short quote, and he may have misspoken.

      With lasers and phased arrays, you can, in effect, cancel out laser light that has a locked phase relationship (this is how holograms work), but it would not work to “project black” to cancel randomized light from the real world.

  10. Dear Karl,

    If I have a chance to ask some questions with regard to Apple Glasses (to someone who’s actually seen it in person), what are some intelligent questions that you think I should ask?

