Hololens 2 (HL2): “Scan Lines” Making Text Hard to Read and Quality Issues with Waveguides

Update (Dec 4th, 2019) Post Referred To In Original Article Deleted – News Control:

Within 12 hours of the original article being posted, the posts/threads on Facebook’s Hololens Developer’s Group that I linked to in the original article were deleted. I don’t know why; it could have been voluntary (or pressure applied). Fortunately, I captured the posts before they were deleted, and my captures of the posts are still linked to in the article.

The deletion of the threads, whether voluntary or by “influence,” shows why there is only “happy talk” when it comes to “reviews” of the Hololens 2 (HL2). The people that are able to get units are, in essence, pre-selected to have a vested interest in seeing the HL2 succeed. I discussed this factor briefly in the section “Demos are a Magic Show and Why No Other Reports of Problems?”

Introduction and Background

First, thanks to some readers for pointing to comments on Facebook’s Hololens Developer’s Group. There are at least two threads in this group that give some public confirmation of this blog’s reports on image quality and yield issues with the Hololens 2 (HL2).

As I reported on February 27, 2019, in Hololens 2 First Impressions: Good Ergonomics, But The LBS Resolution Math Fails!, based on my calculations from the numbers given by Microsoft and my experience with laser beam scanning (LBS) displays, the resolution is significantly lower than Microsoft is claiming. As I pointed out in Hololens 2 Video with Microvision “Easter Egg” Plus Some Hololens and Magic Leap Rumors, Microsoft is also having some serious problems just making the HL2.

Fundamental Resolution Limitation Shows Up in Small Text

I have taken a few key quotes from a thread started by Jie Li, the CEO of DataMesh, discussing the problems with the resolution as evidenced by the inability to read small text. For the full context of these comments, see my image capture, with my annotations, of the whole thread or this link to the thread [Dec 4th, 2019 – The post/thread at this link was deleted after this article was published].

In the thread, several people confirm that small text is hard to read including the comments that “the default font in the small size is nearly unreadable,” “like a low res CRT display,” and “the scan lines make me feel like playing arcade games 😂 in reality, this makes rendering small font very hard.”

The resolution/aliasing problems will be made worse in AR applications with SLAM, as high-resolution text will alias (beat against) the scan lines as a person moves. They are made worse still by the interlaced laser beam scanning process, as I discussed in Good Ergonomics, But The LBS Resolution Math Fails!. I’m told that there are visible gaps between scan lines, and that pixels, which have to be remapped onto the distorted scan-line pattern, can appear and disappear depending on how they move relative to the physical LBS scan lines.

While small text is the “canary in the coal mine” in terms of showing up the resolution problems, I would expect to see aliasing and other artifacts in any image with sharp edges. The HL2 is rendering objects at a higher resolution than the display can properly display. And as discussed in the Math Fails article, the interlaced LBS scanning process is distorted/non-rectilinear with varying resolution across the display, which only adds to the problems. The inherent scanning resolution problems are then compounded by the non-uniformity issues of diffractive waveguides (see my article: Hololens 2 is Likely Using Laser Beam Scanning Display: Bad Combined with Worse).
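To illustrate the appear/disappear effect in the simplest possible terms, here is a 1-D sketch of my own (the line counts are purely illustrative numbers, not Microsoft specifications): a one-pixel-wide text stroke rendered at a “claimed” resolution is resampled onto fewer physical scan lines, and as the content shifts by sub-line amounts (as SLAM re-registers it during head motion), the stroke beats against the scan-line grid and can vanish entirely.

```python
# Illustrative sketch only: resampling a 1-pixel stroke onto a coarser
# scan-line grid.  CLAIMED_LINES and SCAN_LINES are assumed numbers for
# illustration, not actual HL2 specifications.

CLAIMED_LINES = 1440   # assumed "claimed" vertical resolution
SCAN_LINES = 720       # assumed effective physical scan lines

def stroke_visible(stroke_row: int, phase: float) -> bool:
    """Nearest-neighbor resample of a 1-row-wide stroke onto the
    physical scan grid; True if any scan line lands on the stroke."""
    ratio = SCAN_LINES / CLAIMED_LINES
    for line in range(SCAN_LINES):
        # source row of the claimed-resolution image that this
        # physical scan line samples, offset by a sub-line phase
        src = round(line / ratio + phase)
        if src == stroke_row:
            return True
    return False

# Sweep sub-line phases (simulating small content shifts): the same
# stroke flickers in and out of visibility.
visibility = [stroke_visible(101, p / 4) for p in range(8)]
print(visibility)  # a mix of True and False: the stroke comes and goes
```

The same mechanism, compounded by the distorted/non-rectilinear scan pattern, is what makes thin font strokes shimmer or disappear with motion.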

While there may be additional problems with the early LBS display engine, as I have pointed out, the scanning process that the Hololens 2 is using is fundamentally too slow to support the resolutions they have claimed. This is “baked into the cake” of the design.

QC Problems with the Diffraction Waveguide Optics

In another thread on the same Facebook Hololens Developer’s Group, also started by Jie Li, there is a discussion of severe image uniformity problems. For context, I have copied and annotated the whole thread here and provide a link to the thread here [Dec. 4th, 2019 – The post/thread at this link was deleted after this article was published]. The drawing he posted (right) shows various regions across the image where there is no image (dead regions) and regions that are lacking one or more colors.

This appears to be a defective unit received by a developer. It either got past quality control at the manufacturing facility or was broken in transit (or, worse yet, was considered “good” by Microsoft). Mr. Li’s comment, “new batch arrived and this is worse,” implies that the image quality from prior units was also not particularly good and that these new units were significantly worse.

As I have demonstrated many times including in Magic Leap, HoloLens, and Lumus Resolution “Shootout” (ML1 review part 3), image uniformity has been a problem for diffractive waveguides (as shown below). The inherent lack of uniformity makes it hard to know how bad the uniformity has to be before a unit is considered “bad.”

It should also be expected that image quality will be worse with the “butterfly” waveguide (see Combining Two Bad Concepts for my discussion of the butterfly waveguide) and a wider field of view. So it is hard to know whether Microsoft would consider the earlier units Mr. Li received, with their uniformity problems, to be acceptable. The image below is from the Hololens 1 and shows how the colors vary across the FOV of the diffractive waveguide on a production unit.

Hololens 1 showing uniformity issues – Picture was taken by Karl Guttag using KGOnTech Test Pattern

Demos are a Magic Show and Why No Other Reports of Problems?

I constantly try to remind people that “demos are a magic show.” Most people get wowed by the show or by being one of the special people to try on a new device. Many in the media may be great at writing, but they are not experts at evaluating displays. The imperfections and problems go unnoticed in a well-crafted demo by someone who is not trained to “look behind the curtain.”

The demo content is often picked to best show off a device and to avoid content that might show flaws. For example, content that is busy with lots of visual “noise” will hide problems like image non-uniformity and dead pixels. Usually, the toughest test patterns are the simplest, as one will immediately be able to tell if something is wrong. I typically like patterns with a mostly white screen to check for uniformity and a mostly black screen to check for contrast, with some details in the patterns to show resolution and some large spots to check for unwanted reflections. For example, see my test patterns, which are free to download. When trying on a headset that supports a web browser, I will navigate to my test pattern page and select one of the test patterns.
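To make the idea concrete, here is a minimal sketch of my own of the kind of “simple is toughest” pattern described above (this is an illustration, not the actual downloadable KGOnTech patterns): a mostly white field to reveal non-uniformity, a black frame to catch edge problems, and a fine one-pixel grid in the center to expose resolution/aliasing limits. It writes a plain binary PGM so no imaging library is needed.

```python
# Sketch of a simple uniformity/resolution test pattern (my own
# minimal version for illustration): white field, black border frame,
# and fine 1-pixel grid lines in the center third of the image.

def make_uniformity_pattern(width=640, height=360):
    """Return rows of 8-bit gray values (0 = black, 255 = white)."""
    rows = []
    for y in range(height):
        row = []
        for x in range(width):
            if x < 2 or y < 2 or x >= width - 2 or y >= height - 2:
                row.append(0)    # black border frame: catches edge cutoff
            elif (height // 3 < y < 2 * height // 3
                  and width // 3 < x < 2 * width // 3
                  and (x % 8 == 0 or y % 8 == 0)):
                row.append(0)    # fine 1-pixel grid: exposes aliasing
            else:
                row.append(255)  # white field: exposes non-uniformity
        rows.append(row)
    return rows

def save_pgm(rows, path="uniformity_test.pgm"):
    """Write the pattern as a binary PGM, viewable in most image tools."""
    h, w = len(rows), len(rows[0])
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (w, h))
        for row in rows:
            f.write(bytes(row))

save_pgm(make_uniformity_pattern())
```

On a good display, the white field should look evenly white and the grid lines should stay crisp; on a display with the problems discussed here, color/brightness banding and shimmering or dropped grid lines show up immediately.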

Most of the companies that are getting early devices will have a special relationship with the manufacturer. They have a vested interest in seeing that the product succeeds either for their internal program or because they hope to develop software for the device. They certainly won’t want to be seen as causing Microsoft problems. They will tend to direct their negative opinions to the manufacturer and not public forums.

Only with independent testing by people with display experience using their own test content will we understand the image quality of the Hololens 2.

Still No Through The Lens Pictures/Video of HL2 – And A Request for Help

While there are now reports of many units “in the wild,” I have still not seen any through-the-lens/optics pictures or videos of the HL2. With LBS, it can be very tricky to take pictures, and it would likely not work with a typical cell phone. As this blog has demonstrated many times, with good camera equipment (I find a mirrorless camera, such as a 4/3rds model, works best), it is possible to take pictures that are representative of what the eye sees.

Videos will always show evidence of the scanning process as a result of the “beat frequency” between the way LBS scans and the way a camera captures frames. We should be able to get videos as well, but they will show scanning artifacts that are not perceived by the human eye.
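As a rough back-of-the-envelope sketch of why video captures show artifacts the eye never sees (the rates below are my own assumptions for illustration, not measured HL2 or camera specifications): the camera’s frame rate beats against the display’s field rate, producing a slow rolling band at the difference frequency.

```python
# Illustrative beat-frequency arithmetic; both rates are assumptions
# for the sake of example, not actual HL2 or camera specifications.

display_field_hz = 120.0   # assumed LBS field rate
camera_fps = 59.94         # a common camera frame rate

# The camera beats against the nearest harmonic of its frame rate.
beat_hz = abs(display_field_hz - 2 * camera_fps)
print(round(beat_hz, 2))   # 0.12 Hz: a band drifting through the frame
```

Even a tiny rate mismatch like this produces a visible band crawling through the captured video, which is why video evidence of the scanning process is essentially unavoidable.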

Like others, I have been trying to order an HL2 since they were announced in February 2019, but for some reason, I am not on the top of their priority list 🙄. If you have an HL2 I could borrow for photographs or if you are willing to take some photographs, please let me know. I’m in the Dallas Texas area, but visit The Valley regularly and will be at CES 2020. Please contact me at: meet-kg-2020@kgontech.com.

LBS Flicker Reported Privately

One of the first reports I got from someone viewing the HL2 is that they noticed image flicker. Some people are more sensitive to flicker than others, so I don’t expect that everyone would notice it. But the numbers suggest that there should be a significant amount of 60Hz flicker with the HL2. I’m still looking for public confirmation of the flicker and will be testing for it when I get access to a unit.
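The arithmetic behind the 60Hz figure is simple, sketched below with assumed numbers (these are my illustrative assumptions, not published Microsoft specifications): with 2:1 interlacing, any given scan line is only refreshed at half the field rate, so that is the flicker fundamental a viewer sees at a fixed point on the image.

```python
# Back-of-the-envelope flicker arithmetic; the field rate and
# interlace factor are assumptions for illustration only.

FIELD_RATE_HZ = 120      # assumed field (half-frame) rate
INTERLACE_FACTOR = 2     # assumed 2:1 interlace: odd lines, then even

line_refresh_hz = FIELD_RATE_HZ / INTERLACE_FACTOR
print(line_refresh_hz)   # 60.0, a rate many people can perceive as flicker
```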

Conclusions and Analysis

The message streams above, combined with the lack of a general release more than 10 months after HL2’s announcement, suggest that Microsoft is indeed having serious manufacturing and yield problems. It appears they have the compounded problem of combining two low-yielding subassemblies, the waveguide optics and the laser beam scanning engine, into a single device per eye.

It is reasonable to assume that DataMesh (Jie Li) is not one of the HL2’s biggest customers. Likely, Microsoft is cherry-picking the best units for its key customers, including the U.S. Army, Toyota, and public demonstrations. DataMesh is still “special” in that they are getting any units at all, but it makes you wonder how many units failed QC before being shipped.

I would expect that Microsoft can eventually improve the yield and quality of the diffractive waveguide, but they will always be limited by the inherent uniformity and contrast problems associated with the waveguides. These problems may be acceptable for their intended industrial applications but will never give great image quality.

The low resolution, which is showing up as problems in displaying small text, is due to the laser beam scanning (LBS) display’s fundamental limitations. The speed of the scanning mirror and the distorted interlaced scanning process (and flicker) are baked into the design. They can try to tweak the scanning beam diameter and hide some of the problems with color choices and content limitations, but it is always going to be a problem/limitation. The display is simply not capable of displaying high-resolution content.


I want to disclose that I am working as Chief Science Officer for RAVN, a company working on AR headsets for military and first-responder applications. Since Microsoft’s Hololens secured a large contract with the U.S. Army, Microsoft could be considered a competitor to RAVN. I started covering Hololens several years before my involvement with RAVN. The views expressed in this article represent my own opinion and analysis and not that of RAVN.

Karl Guttag


  1. This is the price you pay for somewhat improved energy efficiency and lower optical losses in the waveguide. I don’t see any other meaningful reason they went with the LBS.
    But was it worth it?

    • Yes, as I wrote in the article https://www.kguttag.com/2019/02/22/hololens-2-combining-two-bad-concepts/:

      While the advantages of LBS are readily apparent, like an iceberg, the serious problems are hidden below the surface. Even Microsoft and hundreds of millions of dollars cannot change the laws of physics.

      I have always said that Hololens looks to me like an R&D project that “escaped from the lab” before it was ready.

      In terms of technology evolution, LBS uses electro-mechanical scanning, a backward step relative to a purely electrical solution. Also, the scan process is inexact, like in the days of CRT monitors versus exact pixels, so the effective resolution is significantly less than what is claimed.

  2. Is the HL2 worse, the same, or better than the HL1 in terms of the resolution of images? I’m just trying to make sense of all of this.
    Thank you for your time.

    • I don’t know for sure as I have not had access to the HL2. Based on my experience with LBS scanning projectors, I would think the effective/measurable resolution of the HL2 is going to be about the same as the HL1. But it could be worse for some things like text.

  3. Hi Karl thanks for sharing the information and analysis. I found your blog recently and I learned a lot by reading through the blogs, I really appreciate detailed analysis.

    I would like to know if you have any comment on the AR startup Kura?
    Looks too good to be true.

  4. Karl, thank you for all your hard work!!
    Unfortunately, it seems that it won’t be easy to get a recording of the HL2 screens via camera. One developer wrote that the image won’t come on until it recognizes your eyes. If true, perhaps there’s a way around the eye detection, but it could be a while before someone tries to get around that. Here’s the comment

    • Thanks for the information. Yes, if they are detecting the pupil as is said, it is going to make it much more difficult. It will keep most people from getting pictures. Hopefully, there is an easy hack around it.

  5. […] It is now an open secret that Microsoft is having serious problems making the Hololens 2. There are threads on various groups discussing the problems. In addition to the thread with the pictures, the Hololens Sub-Reddit discussions “The elephant in the room: HL2 display” and “Hololens 2 waveguides in the wild” [edit: this second link was included in the photograph but was accidentally left out] (“Color banding is very close to in headset experience”) confirm the information presented in this blog’s articles Hololens 2 is Likely Using Laser Beam Scanning Display: Bad Combined with Worse and Hololens 2 (HL2): “Scan Lines” Making Text Hard to Read and Quality Issues with Waveguides. […]
