It seems like every time I get ready to finish Part 1 of my MicroLED series, another AR news story breaks. And not just any AR story, but one claiming that Apple may be announcing their AR glasses soon, along with a very few details.
I’m not an avid Apple follower, nor had I heard of the leaker, Jon Prosser, before today. But as just about every online tech and business magazine picked up the story and said Prosser’s past leaks usually came true, it gives the leak an air of credibility. For the sake of this article, I am going to assume Prosser is correct in his statements and then try to figure out what Apple could be doing based on the information.
This blog is not about breaking news, but more about analysis of existing information. Google Glass was old news when this blog figured out that it was using a Himax LCOS Panel. Magic Leap had already raised over $1B and had hundreds of patents when this blog correctly figured out what they were doing about a year and a half before they shipped a product.
I don’t have enough to go on today to know precisely what Apple is doing. As I wrote back in December, Apple has not left much of a patent paper trail. BTW, I find that Patently Apple, at least in the case of AR, tends to over-sensationalize what appear to me to be either trivial patents or things that I doubt Apple will bring to market. Either Apple is not filing significant patents, it is filing them under the name of a shell company, or it is buying the technology from someone else.
This is another “I want to get this out in a hurry” article. Expect more than the usual number of typos. There will probably be some corrections in the upcoming days.
Today’s big news is that Front Page Tech’s (FPT) YouTuber Jon Prosser released a video giving information on the Apple Glass name, planned announcement date, price, and some minimal technical information on the product itself. I’m going to try to tease out some additional technical details from the scant information given.
In the figure on the left, I have combined the video’s essential information from three slides Prosser presented. I should add that Prosser stated:
I’m trying to get the green light legally to share a video of a prototype of these glasses that I’ve seen.
So maybe we will see a video soon, but it was not clear to me if it will be a working prototype or just a mockup. While Prosser may be a tech enthusiast and good at finding leaks, he may not know what to look for in an AR headset.
An AR expert with just one look at a functioning headset could tell a lot about the optics being used, and within a minute or two of using it could figure out what display technology it uses. Listening to Prosser’s video several times, I’m not sure he has looked through a working unit. Maybe he has seen one, but I would expect to learn a lot more from someone who has.
For the rest of this article, I am going to go through the bullet points from the video where I feel I have something to add. I’m going to try to do some “thought-experiment reverse-engineering” to tease out information from between the lines of what Prosser said.
No surprise and nothing original, but it looks like we have a name. iGlass and iGlasses like iWatch were already taken.
Interestingly, this is the exact same price as Nreal’s AR glasses, which plug into a smartphone via a USB-C cable for their processing. While Prosser never says so explicitly, the Apple Glasses almost certainly have wireless data (more on wireless data later), which is probably why Prosser didn’t bother to mention it.
The $499 price gives a lot of clues. Figure that if the Nreal headset were in the Apple store, it would likely sell for more than $1,000 with typical Apple margins. To meet typical Apple margins, Apple Glasses would need to have a bill of materials (BOM) about half that of Nreal’s consumer glasses. And Apple has wireless battery charging, a battery, and wireless communication that Nreal does not.
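The margin reasoning above can be made concrete with a back-of-envelope sketch. Note that the 60% gross-margin figure below is purely my illustrative assumption, not a known Apple number; the point is only the relative scale of the implied bill of materials.

```python
# Back-of-envelope check on what a $499 retail price leaves for parts.
# The gross-margin figure is an illustrative assumption, not a known
# Apple number.

def implied_bom(retail_price: float, gross_margin: float) -> float:
    """Cost of goods (BOM and assembly) implied by a price and margin."""
    return retail_price * (1.0 - gross_margin)

ASSUMED_MARGIN = 0.60  # hypothetical Apple hardware margin at retail

# At $499 retail, the assumed margin leaves roughly $200 for the BOM.
print(round(implied_bom(499.0, ASSUMED_MARGIN), 2))   # -> 199.6

# If Nreal-class hardware would carry a >$1,000 price at the same margin,
# its implied BOM is about double what $499 allows.
print(round(implied_bom(1000.0, ASSUMED_MARGIN), 2))  # -> 400.0
```

Whatever margin you assume, the same arithmetic forces Apple Glass to a BOM roughly half of Nreal’s while adding a battery, wireless charging, and wireless communication.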
Apple is going to want much sleeker and inherently more expensive optics than Nreal. The Nreal birdbath optics (essentially the same as the ODG optics on the right – see here for more about how they work), while inexpensive and with excellent image quality, are too bulky and probably block too much light for Apple’s liking. Thinner optics are likely not going to have nearly as good an image as a simple birdbath.
So Apple Glass has to support higher margins and inherently more expensive optics but sell at the same price. Something has to give, and it is likely to be resolution, field of view (FOV), and image quality compared to Nreal. It suggests that Apple Glass is going to be somewhere between Google Glass and Nreal when it comes to the image the user sees. I’m expecting the display resolution to be between 400×300 and 800×600 pixels. Everything suggests to me that Apple is going for “style” over image quality.
I tend to doubt that Apple will use waveguides like Hololens (1&2), Magic Leap, WaveOptics, Dispelix, and Vuzix. Given the available information, I would favor some form of thin freeform optics like what has been done by Tooz or by the lesser-known startup Oorym. Tooz spun out of Carl Zeiss, which back in 2017 was rumored to be working with Apple. I’m not saying that it is either Tooz or Oorym specifically, but that the optic might be something similar to what these companies are doing. Freeform optics are going to be much more optically efficient than, say, diffractive waveguides, which translates into less power, which in turn saves on heat and battery. Freeform optics are bigger for a given resolution, but if Apple’s resolution and FOV goals are modest (i.e., small), freeform optics would seem to be the best option.
Then we have the comment on cost about being “plus prescription.” This immediately says that Apple’s Glasses are not going to have the large eye relief of, say, Hololens, which would let someone wear their own glasses. The cheap but ugly way, not the “Apple way,” would be to add prescription inserts like Nreal and Magic Leap, as examples. By process of elimination, this leaves building the prescription into the optics. Once again, this tends to point to thin freeform optics. Tooz, for instance, supports building the prescription into their optics, and Oorym claims they will be able to support prescription lenses.
I don’t have much to go on in terms of the display type. I’m leaning toward some form of flat panel, either a small OLED or a tiny-pixel (~20-micron pixel pitch) LCD. It could also be something like a front-lit Himax LCOS device. When you are talking about a modest resolution and FOV, there are a lot of options.
Quoting the video (from auto-transcript with minor corrections and punctuation):
All data processing is going to happen on your iPhone, so similar to what we saw with the first gen Apple watch. That’s sort of what we can expect here with Apple Glass. Some of you might be surprised and bummed by that. But it makes sense. I know a lot of people are like expecting some sort of crazy holograph like Hololens type experience. Some life-changing device, so I want to sort of temper your expectations here, especially for the first generation product. But listen, it makes sense. Data processing happening, and depending on the iPhone in your pocket, that price point of $499 makes a lot more sense now. Anything more than that would probably increase the price of these things quite a bit.
Everything said above makes sense. Apple Glass is likely to be closer to an Apple Watch (448 by 368 pixels) in display resolution than to, say, an iPhone (1792 x 828 pixels). Apple Glass is likely to be more of a “data snacking” device and for playing lower-resolution Pokémon-Go-like games. So they can more than get away with using the phone’s processor.
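The gap between those two resolutions is worth spelling out; a minimal sketch using the pixel counts quoted above:

```python
# Pixel counts for the displays mentioned above: Apple Glass is likely
# far closer to the Watch than to the iPhone in total pixels to drive.

watch_pixels = 448 * 368      # Apple Watch
iphone_pixels = 1792 * 828    # iPhone

print(watch_pixels)                         # -> 164864
print(iphone_pixels)                        # -> 1483776
print(round(iphone_pixels / watch_pixels))  # -> 9  (the iPhone drives ~9x the pixels)
```

Driving roughly a ninth of an iPhone’s pixels is well within what a paired phone’s processor and a wireless link can handle.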
The “look like glasses” tends to support the idea above that the prescription is built into the glasses. Supporting prescriptions probably works against the many “uni-lens” concepts I have seen. It would be a developmental and logistical nightmare to have a “uni-lens” with prescriptions.
Despite all the “shaming” and privacy concerns, I can’t see this type of product being successful without the ability to take pictures. I can believe Apple would leave the camera out, just as they wouldn’t make a large iPhone for so many years (sometimes Apple outthinks itself). If they make it past Gen 1, then expect a camera on Gen 2.
Putting the LiDAR on the right temple would seem to be worse for left-handed people (about 10% of the population). Also, gesture input can be very problematic in real-world situations.
As mentioned above, not only does this say there is a battery; it also implies that there is wireless communication to the iPhone. Like the Apple Watch, it will probably use a combination of Bluetooth and WiFi. But wireless communication takes a lot of power per bit, so sending video tends to be power-hungry. I would not expect to watch movies on Apple Glasses; maybe a low-resolution video clip occasionally, but not a lot more.
It sounds so simple to put “displays in both lenses” until you realize the ramifications. It was this one statement that made me want to write this article. It may sound like supporting two eyes is just twice as expensive. But it is both more expensive and much more complicated, because you have to deal with getting the two images in the right place for stereo vision.
Consider the variability in human interpupillary distances (IPDs). See, for example, Table 2 linked to here. To cover from the 1st-percentile adult woman to the 99th-percentile man requires a range of 50 to 75mm. Dividing by 2 for the two eyes, each eye needs about 12.5mm of adjustment. Covering the 5th to 95th percentile of women and men (57mm to 71mm), there needs to be ~7mm of adjustment per eye.
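The IPD arithmetic above is simple enough to sketch; the percentile ranges are the ones quoted in the text:

```python
# Per-eye adjustment range needed to cover a population IPD spread.
# Each optic sits half an IPD from the nose centerline, so the per-eye
# travel is half the total spread.

def per_eye_adjustment_mm(ipd_min_mm: float, ipd_max_mm: float) -> float:
    return (ipd_max_mm - ipd_min_mm) / 2.0

# 1st-percentile adult woman to 99th-percentile man: 50-75mm total IPD.
print(per_eye_adjustment_mm(50.0, 75.0))  # -> 12.5 (mm per eye)

# 5th-to-95th percentile range (57-71mm).
print(per_eye_adjustment_mm(57.0, 71.0))  # -> 7.0 (mm per eye)
```

Even the narrower range demands millimeters of per-eye accommodation, which is a lot to ask of a fixed, low-cost optic.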
What a person perceives as depth/distance with stereo vision is a function of the difference between the images seen by the two eyes. If two people have different IPDs, then the location of the images in the optics has to be based on their eye locations, or they will perceive the image to be at the wrong distance or not see the image at all. Dealing with differences in IPD has a few options:
1. A mechanical IPD adjustment on the headset
2. An eyebox large enough to cover the whole IPD range, with electronic adjustment of the image location
3. Custom-building each unit for the individual user’s IPD
The fundamental problem is that if you are cutting optics cost to meet a price point, then a large eyebox supporting a wide range of electronic IPD adjustment is not possible. Pick your poison: #1 is ugly, unstable, and very “non-Apple”; #2 is too big and expensive; and #3 is a logistics nightmare. I would guess they would have to pick #1.
If I had one question I would like to see answered, it would be: “How is Apple Glass going to support dual displays for people with different IPDs?”
I give up; this makes no sense unless they are just talking about a mechanical issue, or about polarized sunglasses, which can cause problems with some optics. Simple neutral-gray sunglasses should work with any display. I think there is some information missing in Prosser’s statement.
I know I’m going to get the usual “it’s Apple” and “Moore’s Law” responses (see Saying “Moore’s Law or Apple” Does Not Make AR a Consumer Product). What I tried to do above is take the information and make it fit within the known bounds of physics and economics. It looks to me that to get to a $499 price point with dual displays, Apple would have to do some extremely non-Apple-like things in terms of quality and margins.
You can figure out a lot by just looking at an AR headset. How dark are the lenses? If it looks like Nreal, then it has a birdbath combiner. You can often see the size of the eye box. You can see how the vision correction is supported (say with an insert) and how it will support IPD adjustment. All AR optics will have telltale signs of how the display image is combined with the real world. Diffraction gratings, for example, will show up as a rainbow of colors. Embedded mirrors can be seen if you look from the right angle.
It usually takes turning the device on to figure out the type of display. Field-sequential color, indicative of LCOS or DLP, will be seen if the eyes move or you move the headset. You can generally tell the rough resolution and FOV from the content. If you see a screen-door effect, then you are probably looking at a display with a large pixel pitch. The contrast of the display can be another clue; OLED displays usually have higher contrast.
Then, of course, if you can ask the right people, it comes down to asking the right questions. Hopefully, this article gives the reader some clues as to what to ask.
IPD is a massive problem. I’ve researched VR “motion sickness” and found that half or more of the problem is Eye Strain. I, for example, have a non-standard IPD, which makes the thought of most headsets terrifying.
So, when I saw North’s Focals solution, I knew it was correct. They have a mobile fitting station inside an RV (or small bus) that they take around. Obviously, that is a nightmare, but it isn’t a problem for Apple. They already have the logistical solution: Apple Stores. The kind of people who will buy Apple Glasses are also the types to visit these stores. And it makes sense (like regular glasses) to do it this way. In the future, you could have 3rd-party or online methods to fix the IPD (there are many regular glasses/optics companies working on solutions like this today). If you give people a headache, they’ll sue you. So you go with the safest solution: fixed and custom IPD.
All that said – If you aren’t looking for Stereo solutions to be perfect (like a HUD), then some of those requirements can be more lenient. (But I could be wrong about that)
I don’t think even Apple with its stores is up to customized IPD combined with prescriptions. It seems that mostly what they do at the Apple Stores these days is replace the battery or the cracked screen, or tell the customers that it is cheaper to buy the new model. Reviews of the Apple Store are not what they used to be. Trying to custom-fit for IPD is a lot of work for a $499 product. But a $499 price does not seem to leave room for the FOV and eyebox overkill needed to support much (there has to be some) electronic IPD adjustment.
I can’t get everything to add up: Apple with its margins; Apple’s desire for stylishness that seems to rule out cheap, ugly, and problematic; dual displays with their cost and IPD issues; and a $499 price. It makes me wonder if Apple is trying to punk Prosser.
Apple does have the ability to do extremely precise mapping of their users’ faces and eye positions using FaceID. It wouldn’t be inconceivable that you could order this new product through an iOS app, especially as it relies on a paired iOS device.
Custom manufacturing would then be the concern, but if anyone could do it, Apple can. Think of when they bought thousands of CNC machines to make the unibody Macs, something unheard of in manufacturing before they did it.
Also, isn’t it interesting that almost everyone who presents at Apple keynotes wears glasses instead of contacts or having had corrective surgery? It seems natural that they’d do a “one more thing” and have us realize they were wearing them all along. Especially with a COVID release with no real audience, they can more easily keep it hidden. Let’s wait and see.
I could maybe see customization for a very high-end model (rumored “Steve Jobs edition”). But I think it would be a mess for a volume product.
One basic problem is that people’s vision changes. It would be much better to replace the insert than have to replace the whole unit.
I think it is more likely that Apple will use industrial design to make the inserts look built-in.
Great summary Karl – I agree freeform optics are the most likely option – although if they’ve cracked microLEDs and have high enough luminance (there are some around 1M nits now), they could use them to project in a similar way to the laser projector on North Focals, and partially get around some of the eyebox/IPD issues, though no doubt still have different frame sizes to help deal with it.
Regarding cameras: in some of my recent research, one of the main uses of cameras on smartphones is to take selfies – for this reason (and a good few others) I’ve doubted the claims by some that AR glasses will replace smartphones, so perhaps the requirement for a camera is not as high on the price-cost benefit (cost also being power consumption). That said, there might be some utility in having a camera up there on the glasses for some tasks.
Their nod to privacy on cameras, if genuine (and it was a big issue for Google Glass), would certainly rule out 2D diffractive waveguides due to the ~50% of the diffracted order coming out of the front, so people can view what you have on your display, and it also makes you look like a cyborg – about as appealing to the average consumer as the idea of wearing glasses at all.
Regarding pricing – I’m sceptical they will be making this hardware in the hope of making money from it – especially if you factor in all the NRE and acquisitions to get to this point. I imagine they will be relying on their exceedingly well-managed app store to make money, but that’s the killer question after all is said and done – what is the application? Repeating the functionality of a smartwatch, but without the convenience? (Try wearing glasses whilst running, and how will it monitor heart rate?!) Still, they can well afford to have a go and not worry if it remains a niche product. Not too many people wear watches, after all.
Thank you, and can confirm that Prosser has been extremely accurate in prior leaks.
What is your initial guess on the display, LCOS or OLED? No chance it could be MicroLED yet, I guess? Other rumors were that an Apple VR visor gaming-type product would precede these glasses by at least a year – that does not seem to jibe with the timelines anymore based on the Prosser details.
As I responded to Ze’ev, I can’t get everything to add up: Apple with its margins; Apple’s desire for stylishness that seems to rule out cheap, ugly, and problematic; dual displays with their cost and IPD issues; and a $499 price.
It makes me wonder if Apple is finally trying to get to Prosser and is punking him. Apple’s market cap is ~$1.38 trillion, which can buy a lot of tricks to swat at someone annoying them. Eventually, someone will slip up, and the source will get burned if he keeps going back to the well.
I don’t have enough to go on to figure out the display type. There is a chance it is some kind of high-resolution backlit LCD. If they use freeform optics, then it could be an OLED. I could make an argument for a lot of things. If they are using a waveguide, then it almost has to be an LCOS display.
Ha yes, one can conceive of a number of ways, and after all is said and done, what’s to say this is more than just hot air? After all, it’s made no difference to the share price, unlike the rumours for MVIS, so this is not the big story that a small group of AR/VR players think it is.
More likely is that Apple uses the Bosch MEMS projection solution for smartglasses. That is sufficient for the use case and allows “real glasses.” Bosch has been an Apple partner for many years.
This is a cheap solution (likely between $10-$50 for the Bosch components) with low battery usage that allows both high margins for Apple and long use time for users. Apple always focuses on use time and not on maximizing the resolution. The glasses must work for a full day, otherwise they will fail. Also, they must be robust. The Nreal approach is too complicated, and the glasses need to be set up for every person individually in a complex process. Too complicated for the mass market and not possible in Apple Stores.
You are totally ignoring the massive problems with laser beam scanning. The “cheap” Bosch/North-Focals solution with direct retina scanning off a hologram mirror ends up with a tiny eye-box and extremely low (about 200×100 pixel) resolution. See also my prior response.
I seriously doubt Apple is using laser scanning. Bosch was using direct retinal projection which has near-zero eyebox. This is why Bosch and North Focals required the custom fitting of each headset even for monocular (single eye). Trying to fit for two eyes would be next to impossible.
To get around the “near-zero eye box,” you have to resort to pupil expansion (see https://www.kguttag.com/wp-content/uploads/2019/02/Microsoft-LBS-pupil-expander.png). This requires a lot of expensive optics, something Microsoft could afford with the $3500+ HL2.
I think now I have a pretty good idea of what Apple Glass is using for the display and it is not laser beam scanning. Stay tuned.
Karl has suggested that Tooz, a joint venture by Carl Zeiss and Deutsche Telekom, might be a candidate for the Apple Glass optics choice.
In his SPIE 2020 talk Kai Stroeder https://www.linkedin.com/in/kaistroeder/ had some interesting comments –
“Technology, that’s a hard one because customers get nervous at some certain point. We at Carl Zeiss, we have tons of applications for lasers. But most of them are to repair or remove tissue to help you if you have got really a medical issue. We want to avoid that people get medical issues wearing glasses. ”
Also on the Tooz website https://www.tooztech.com you can find a No Laser logo – https://imgur.com/L1zL79o
I find it interesting that one of the world’s oldest and most well-respected optical companies would be so averse to laser use in NEDs.
In looking into Apple Glass some more, I’m pretty sure that there is a different technology used in Apple Glass. I will be getting an update later today.
One of the clues from the Prosser video that you didn’t mention was that you can’t see the image from the outside. The image in the Zeiss Smart Optic (Tooz) also can’t be seen from the outside.
Prosser Video –
“you couldn’t tell that the lenses were displaying anything only if you are the wearer could you actually see what’s being displayed”
Zeiss Smart Optics Hands-On | Pocketnow
“but if the wearer focuses away from the display it disappears”
Also “unseen by others” –
With the ZEISS Smart Data Glasses, similar to a head-up display, a ‘flying screen’ appears in front of your eyes that can display relevant information and also videos, but unseen by others.
I’m not aware of any other optic that does this. Perhaps you do.
I’m sorry to disappoint you, but I look at the information and call it as I see it.
I am putting together some very strong evidence that they are using the Akonia Holographic Waveguide with an LCOS Microdisplay. There was a lot of material to go through and it is taking some time. I hope to have it up tonight but it may take until tomorrow.
Re: no forward projection (seeing the image from the front) – This can be done with the right hologram I am told by several experts in the field.
Re: “but if the wearer focuses away from the display it disappears” – this is a function of what is known as the “eyebox.” Normally a larger eyebox is better as it makes it easier to see an image. Too small an eye box means you may not see the image if the glasses or your eye shifts slightly. Making a small eye box is easy, making a larger one is harder.
Yes, that’s why I followed up with clarification of the video from another source –
See “unseen by others” –
With the ZEISS Smart Data Glasses, similar to a head-up display, a ‘flying screen’ appears in front of your eyes that can display relevant information and also videos, but unseen by others.
Any combiner can avoid the display image leaking out the front, as long as it’s NOT a pupil-replicating waveguide! Take North Focals, for instance – because the beam is projected from an almost-point source to the glass combiner surface, it is thus diverging. All anybody else sees is a bright spot from the laser projector if they happen to be looking in the right direction (not at the user’s eyes; it’s off-axis, so it uses a hologram to steer).
Backlit LCOS is lossy, but it is much easier to get around head-motion artefacts than with micro-OLED displays (due to persistence) – the power required for tracking and correction might end up overshadowing the >50% light loss of LCOS – especially if using a tailored microOLED panel for illumination with a minimised cone angle (besides polarisation, étendue rears its ugly head on LED collimation, losing another >~30%).
Karl, yes, that’s correct in principle – volume holograms, unlike conventional diffraction gratings and surface holograms (both essentially the same thing), can allow most light to be diffracted into one order (i.e., to the user). However, volume media have scatter characteristics, and there is still no such thing as 100% in one order, so you get some image leakage out the front. Obviously DigiLens have been pushing similar volume holograms for years, but in liquid crystal. By reducing FOV, perhaps it might avoid too much of the rainbow and creating too many scatter-related artefacts in a 2D waveguide, but image quality and efficiency will still never be as good as a combiner. That said, you have an impeccable track record in your observations, and there are no end of people making waveguides telling us they are the future.
^Regarding the future, of course in the past some people already went from diffractive and holographic waveguides to reflective waveguides and combiners. ODG, Sony (appear to have ditched their holographic waveguides for reflective based solutions) and let’s not forget the master of AR optics – Yaakov Amitai 😉
Apple has been investing in microLED tech for years. They even have a small test fab near their HQ.
Most analysts thought that it would replace LCD and OLED in their best-selling products.
But with Apple Glass, microLED makes much more sense. Small pitch, high lumen output.
I think I know what Apple is doing and I will be publishing a major update today. I think it is too soon for MicroLEDs for AR.
I don’t think the full-color MicroLED microdisplays (more on that below) needed for consumer AR are ready. I would expect that most of Apple’s efforts on MicroLEDs are going into direct-view displays for watches and then iPhones. Watches are thought to be the first market, as they could share technology with phone-size displays (with laser-annealed polysilicon transistors). The “MicroLED” company that Apple bought was working on technology aimed at direct-view smartwatches and smartphones, with pixels too big for AR. It is commonly thought that smartwatches will be the “test market” for smartphone MicroLEDs. A smartwatch has about 100,000 pixels, whereas a high-end smartphone has over 2 million, or more than 20X more.
When people say “MicroLED” they are often talking about 4 very different technologies. There are “wall size,” television size, watch and cell phone size, and Microdisplay. From a draft on an article I am writing:
The term MicroLED (and uLED) is being used to describe any type of display that uses “tiny” inorganic LEDs, as distinguished from OLED displays. Large wall displays, home televisions, cell phones, and MicroDisplays are all being called “MicroLED displays.” A 4K 55-inch TV has a pixel size of ~325 microns, an iPhone 11 pixel is ~55 microns, and the pixels for AR are in the 3- to 10-micron range. There are massive differences in the technologies and processes between each of these display sizes. With wall, TV, and cell phone displays (also used in VR), they will be slicing up LEDs made on semiconductor wafers and transferring individual LEDs to a glass or plastic substrate covered with thin-film transistors. The very large displays will singulate R, G, and B LEDs. TVs are likely to singulate blue LEDs and then use either quantum dots or micro-phosphors to convert to red and green. Smartwatches and smartphones might use quantum-dot conversion of blue or singulated R, G, and B on laser-annealed silicon transistors. But for the MicroDisplays needed to make small see-through AR headsets, most are planning to flip-chip the LEDs for a full display onto a CMOS backplane. Color is a massive challenge for MicroLEDs, as quantum dots and phosphors won’t work (too thick a layer relative to the pixel size). The best process for making even a cell-phone- or watch-sized “MicroLED” display will be completely different from a MicroDisplay MicroLED. By far, most of the effort and most of the companies working on “MicroLEDs” are going into the larger displays for TVs, cell phones, and watches.
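To put the pitch numbers from the draft excerpt side by side, here is a small sketch; the pitches are the ones quoted above, and the area comparison shows why the manufacturing processes cannot be shared across display classes:

```python
# Pixel pitches quoted above, with per-pixel areas, to show how quickly
# the per-pixel real estate shrinks from TVs down to AR microdisplays.

pitches_um = {
    "4K 55-inch TV": 325.0,
    "iPhone 11": 55.0,
    "AR microdisplay (large)": 10.0,
    "AR microdisplay (small)": 3.0,
}

for name, pitch in pitches_um.items():
    print(f"{name}: {pitch} um pitch, {pitch ** 2:.0f} um^2 per pixel")

# A TV pixel has (325/3)^2 times the area of the smallest AR pixel:
print(round((325.0 / 3.0) ** 2))  # -> 11736 (times larger in area)
```

A four-orders-of-magnitude spread in pixel area is why quantum-dot or phosphor color conversion that works at TV scale fails at microdisplay scale.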
Karl, there are a growing number of companies now able to do native green and red as well as blue µLEDs, although given Apple is only just starting to make its own CPUs, you could be right in saying they might have a way to go with full driving architecture for their LuxVue µLEDs (yes it seems this term gets used in many places, just like the “hologram” these days).
Sony unveiled a small µOLED last year at Display Week – 640×360 at around 5k nits output; that would be sufficient for relaying to a combiner. Add a photochromic coating and you’re not competing against the bright Cupertino sunlight 😉
I can’t imagine it would be a proper AR headset like Nreal, with a cable and draining your phone battery… unless it comes out as a dev kit, which is also unlikely.
I expect an equivalent to a Google Glass but with better dual lenses, just a Bluetooth phone connection, limited tracking, and QR codes. That way it will be small and shiny, not have many battery-life problems, and you can sell it to lots of average consumers.
A cool move would be if it had an optional wired connection to an iPhone, making it equivalent to Nreal in that mode.
I think I have it better figured out and will be coming out with an update today.
I don’t think it is wired as it is supposed to support wireless charging. A cord would be considered ugly and a snag hazard by Apple. It also creates a storage problem as it would be a fairly long and somewhat thick cord.
May all be rumor –
In response to this week’s deluge of “Apple Glass” rumors, Bloomberg’s Mark Gurman in a series of tweets on Thursday called Prosser’s claims “fiction.”
According to Gurman, Apple plans to announce a “mixed AR and VR” headset, dubbed “N301,” as early as 2021 ahead of a release in 2022. A subsequent and presumably more advanced “pure AR” device referred to internally as “N421” will launch in 2022 or 2023, Gurman said. Word of Apple’s head-mounted hardware strategy — the launch of a basic model followed by a more advanced specification — first surfaced in a report from The Information last year.
Thanks for the info. It could be a rumor, or even Apple trying to punk Prosser (you have to believe that sooner or later a >$1 trillion market cap company is going to get him).
Even based on what it seems like Apple is likely working on, it takes some best-case assumptions on manufacturing cost to get to a $499 Apple price with Apple-like margins.
Yes, I’ve not seen any indication from Bloomberg that this is anything other than noise – the main preoccupation from the markets and on Bloomberg West yesterday is Apple’s software and streaming services, competition from Spotify, Netflix, etc. Nothing at all about hardware, least of all AR.