Google Glass Is Using Field Sequential Color (FSC) LCOS (Likely Himax)

Sequential Red, Green, and Blue Fields Captured From Google YouTube Video DVF [through Glass]

I’m going to have to eat some crow because up until Saturday night, I honestly thought Google was using a transmissive panel based on the shape of the newer Google Glass headset.  I hadn’t seen anything that showed it used Field Sequential Color (FSC), and I had looked for it in several videos before that didn’t appear to show it.  With FSC, the various color fields (red, green, blue, and perhaps other colors) are presented to the eye in sequence rather than all at the same time, and this can show up in videos (usually) and sometimes in still pictures.
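To get a feel for why a video camera can catch individual color fields, here is a rough back-of-the-envelope sketch in Python. The frame rate, field count, and shutter speed below are illustrative assumptions on my part, not known Google Glass or Himax specifications:

```python
# Rough illustration of why a video camera can catch a single color field.
# All numbers are assumptions for illustration, not measured specs.
frames_per_second = 60
fields_per_frame = 3                                # red, green, blue
field_rate = frames_per_second * fields_per_frame   # 180 fields per second
field_duration_ms = 1000 / field_rate               # ~5.6 ms per color field

# A camera exposure shorter than one field will see mostly one color.
camera_exposure_ms = 1000 / 500                     # e.g., a 1/500 s shutter = 2 ms

catches_single_field = camera_exposure_ms < field_duration_ms
print(f"Field duration: {field_duration_ms:.1f} ms, "
      f"exposure: {camera_exposure_ms:.1f} ms, "
      f"single-field capture likely: {catches_single_field}")
```

When the exposure is much longer than a field, the camera integrates several fields together and the sequential colors blend back into the intended image, which is why most footage shows no breakup.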

But on Saturday (March 9th) I watched the Google-produced video DVF [through Glass] from way back in September 2012.  A careful frame-by-frame analysis (see above for the images from 3 frames) of the video proves that the newer Google Glass design uses a Field Sequential Color (FSC) display.  Note in the picture above, captured at 3 separate times, there are red, green, and blue images in the Google Glass, which is indicative of FSC.   Based on the size, the shape, and some other technical factors (too much to go into here), it has to be a reflective Liquid Crystal on Silicon (LCOS) device, most likely made by Himax.

BTW, further visual evidence (there are a couple more examples in the video, but this one is to me the clearest) of it being an FSC device is given later in the video at 3:30 when Google co-founder (and part-time actor?) Sergey Brin, wearing Google Glass, stands up to applaud and there is a classic FSC color breakup, as captured in the picture below, one recognizable to anyone that has looked into an FSC projector.  Seeing separate color fields when the projector moves is a classic FSC effect.

Sergey Brin Stands Up Rapidly and Reveals Color Sequential Breakup

This (new) evidence largely confirms Seeking Alpha blogger Mark Gomes’ conclusion that Himax is in both the old and the newer Google Glass design (see also his instablog response to my comments).   Last week I was not convinced and commented that I still thought it was a transmissive panel, and Mr. Gomes and I had some cordial back-and-forth public discussion in each other’s blogs on Seeking Alpha and this blog.   But with the proof that it is using field sequential color, there is only one conclusion: it is a reflective field sequential color LCOS device.   This also explains why the earlier prototype used a Himax color filter LCOS device when it would have been simpler and smaller to have used a transmissive panel at that time.  Apparently the color filter LCOS was a “stand-in” waiting for the smaller field sequential color device and/or optics.

Additionally, I had dismissed the Digitimes Himax and Google Glass article as confirming it was Himax because it appeared a couple of days after Mark Gomes’ article, and so I thought it was just an “echo” of what he and I had written.   But in public comments Mr. Gomes pointed out that it added some more details.

So why do I now agree with Mr. Gomes that Google Glass most likely uses a Himax panel?  The evidence is overwhelming that it is field sequential color, and Himax is the obvious candidate: my first blog on the subject, which appeared Feb 28, 2012, clearly identified Himax as supplying the earlier Google Glass prototype, and they have had FSC LCOS devices for about 6 years.    This is further reinforced by what Mark Gomes has posted as well as the Digitimes article.   Both the technical and the financial/business analyses agree.

There are a few other, but IMO much less likely, candidates.  My old company Syndiant has digital FSC LCOS technology that, the last I knew, was technically superior to Himax’s analog LCOS technology, but I don’t think Syndiant would be ready for a Google-sized order yet (and the announced JVC-Kenwood deal happened too recently).  Citizen Finetech Miyota (CFM) recently bought FSC LCOS technology from Micron, but I can’t see why Micron would have sold the technology to CFM if a deal with Google was in the works.   Omnivision bought the FSC LCOS technology of Aurora Systems, but it was not very good technology IMO, and so far I only know of them continuing to make the old Aurora devices, which are aimed at front projectors.   Then there is Compound Photonics, who bought the FSC assets from the now-defunct Brillian, but they have stated that they are working on laser pico projectors.

Also, please don’t give me the conspiracy and collusion theories.   The video I watched on March 9th was the first one I had seen that proved Google Glass was field sequential color.  Additionally, I never corresponded with or even knew of Mark Gomes before the Seeking Alpha article came out mentioning my blog, and I was legitimately concerned that he may have ignored some of my original article and only considered the parts that supported his position, so I wanted to correct the record.  Mark Gomes, for his part, was very respectful, yet emphatic in his position based on his research, which now appears to me to have been largely correct (although I still say the Himax web site looks abandoned, and Himax did give the appearance of having given up on FSC LCOS back around 2010).   Frankly, I was as surprised as anyone at the wild swings in Himax stock and didn’t buy any before my first article.

Full Disclosure:  I never traded in Himax stock before today (or any other stock discussed on this blog, other than being a well-known holder of stock in the private company Syndiant as a former Founder, CTO, and Investor).  But seeing how the Google Glass news last week affected the stock, and based on Mr. Gomes’ articles combined with this new evidence, I decided to put some money where my mouth is and just bought some Himax (HIMX) to see what happens.

Appendix (For Those That Want to Duplicate My Findings)

That Google Glass uses FSC would have been instantly recognizable to anyone that got to use the newer Google Glass device, but I didn’t have one to play with and was using the available on-line videos and pictures.   The crafted Google videos that give the appearance of looking through the Google Glass didn’t show this because they simulate the display.  And in most of the videos the image in the Google Glass was not visible and/or the camera exposure and other settings didn’t pick up the FSC effects.  Perhaps ironically, it appears that the camera in Google Glass tends to pick up the FSC effect more than other cameras used to shoot pictures of people wearing it.

Some video cameras more than others will tend to pick up the signature color breakup of FSC.   Also, the camera angle has to be right so you can see the image when videoing someone wearing Google Glass.   And perhaps most importantly, the exposure of the camera, which is usually based on the overall scene, has to be such that the sequential colors from the small spot of light in the viewfinder (I haven’t ever seen a close-up of the viewfinder) do not over-expose and wash out the colors (in that case you may notice more of a white flicker).

All I did was play the video DVF [through Glass] on my PC and keep pausing and un-pausing it.  It is tricky to catch the frames that show FSC.  One reason is that the video has many frames per second and the YouTube player does not support “shuttle/jog” frame-by-frame stepping.   One could download the video and play it frame by frame, but it is not necessary.   I just kept going over the time around 0:38 to 0:44 a few times to capture the images.   Similarly, I went through the video at about 3:30 to get the FSC breakup with Sergey Brin.
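For those who do download the video, classifying the captured frames could even be scripted.  Below is a minimal Python sketch of the idea; the `dominant_field` function and its 1.5× dominance threshold are my own hypothetical constructs, and it assumes you have already computed each frame’s average (R, G, B) values:

```python
def dominant_field(avg_rgb, ratio=1.5):
    """Guess which FSC color field a frame likely caught, given the frame's
    average (R, G, B).  Returns 'white' when no channel clearly dominates
    (e.g., an over-exposed capture).  The ratio threshold is an assumption."""
    r, g, b = avg_rgb
    channels = {'red': r, 'green': g, 'blue': b}
    name, top = max(channels.items(), key=lambda kv: kv[1])
    others = [v for k, v in channels.items() if k != name]
    # Require the top channel to clearly dominate both of the others
    # (+1 keeps the test meaningful when a channel is zero).
    if all(top > ratio * (v + 1) for v in others):
        return name
    return 'white'

print(dominant_field((200, 40, 30)))    # red field caught
print(dominant_field((240, 235, 230)))  # saturated: reads as white
```

This is only a convenience; eyeballing paused frames, as I did, works fine for a few seconds of video.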

Note that you will not always see a red, green, or blue color when you capture a frame.   When colors get too bright in the image, they will saturate the camera sensor and result in white.   I don’t believe there is a “white field” in the Google Glass; rather, the camera is just not picking up the colors due to over-saturation.
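Here is a toy numeric model of that over-saturation effect, assuming the camera’s exposure spans all three color fields and each 8-bit channel clips at 255 (the pixel values and gains are illustrative only):

```python
def expose(fields, gain):
    """Toy camera model: sum the sequential color fields seen during one
    exposure, then clip each channel to the sensor's 8-bit maximum."""
    r = min(255, sum(f[0] for f in fields) * gain)
    g = min(255, sum(f[1] for f in fields) * gain)
    b = min(255, sum(f[2] for f in fields) * gain)
    return (round(r), round(g), round(b))

# Three sequential fields of a mid-gray pixel: red, then green, then blue.
fields = [(180, 0, 0), (0, 180, 0), (0, 0, 180)]

print(expose(fields, gain=0.5))  # properly exposed: gray (90, 90, 90)
print(expose(fields, gain=2.0))  # over-exposed: clips to white (255, 255, 255)
```

With too much gain every channel hits the rail, so a colored source records as white, which is consistent with what I saw in the washed-out frames.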

I should also add that FSC effects show up differently on different cameras and in different lighting and camera exposures.   I have looked previously at other Google Glass stills and videos trying to find the FSC effect and did not find it.    Unless the camera angle and the exposure are right, you just aren’t going to see the colors.    Even in this whole video, I only found a few seconds that demonstrated FSC.

Karl Guttag


  1. Karl,

    Can you direct readers of this blog to another field sequential LCOS video that exhibits what can be seen in the DVF video?

  2. Congrats to you and Mark for making the mosaic clearer. Gomes, I will have to start reading your work. And Karl, I hope to read you on Seeking Alpha as well, since your insights deserve a larger audience.

    Ps..I’ll take a punt on some call options in case this product sells better than expected.

    • Thanks Malone,

      Make sure you read both sides of the story, as I gave both in the same day if you read my blog about Color Filter LCOS and the Seeking Alpha interview that was submitted on Friday, before I discovered that Google Glass was now using Field Sequential Color (Murphy won on that one). I kind of feel like an economist, which is to say, “on the other hand . . .”

      I believe what I wrote in the Seeking Alpha interview about Google Glass still stands (but I had the panel type wrong). I just can’t see Google Glass being a high-volume product the way it is. There are too many human interface issues yet to be solved (see also my blog from about a year ago, part 1 and part 2).

      Augmented reality is “seductive” and a lot of companies have tried it, but none has succeeded beyond some specialty markets for military and industrial use; the consumer market has eluded them. I don’t see where Google Glass has come close enough to solving the well-known problems, and it has repeated a number of past mistakes.


  3. At least it is starting to appear in form factors that don’t look like they are from a sci-fi movie. I hope for modest success and then in 5 years it will be a mass-market product. I almost grabbed some today as it dipped into the low 4s, but I’m going to wait as the Sept 2.50 calls are starting to look cheap near $1.25. It seems they might do near 1B in revenues if they can make 200M+ annually off these things, and earnings could get near 50 cents (in a perfect world). I’m still interested in what the selling price of the panels is, though. I’d imagine Google would have had the power in the negotiation.

    • Please note, I still have serious reservations about the whole AR concept; see my posts under HMD and AR, in particular the March 03, 2012 (first) post that shows some of the history of HMD/AR/wearable computing. That has not changed with my revelation that Himax FSC LCOS is likely the panel in Google Glass.

      To me it is not just a question of whether Google Glass will succeed, but rather whether anybody, including Google, Apple, Samsung, or anyone else, will solve enough of the issues with near-eye AR to have a really usable product for a mass audience. Currently I think that only the tip of the iceberg of the problems associated with head-mounted AR is understood, let alone solved, and that the problem is still bigger than all the smart people at all the big-name companies. The problems range from comfort and eyestrain, to safety (distracted walking/driving), to how you are going to input into the device (talking is definitely not a good primary input for a consumer device), just to name a few. I joke that I am working on naming “the 101 problems with Head Mounted AR”.

      I have seen multiple major companies chasing each other’s tails on other “big new concepts” before, only to see them all give up when none of them could solve the problem. Some companies are likely looking at it just because the other guy is. I have a saying that “sometimes the problem is bigger than all the smart people can solve.”

      Mr. Gomes and I have had some fun back and forth on these issues. He is approaching the question from a “market analyst perspective” and I am looking at it as a technologist that has worked on many devices for the creation of images including near eye displays.

  4. great work Karl (and i do mean great. thanks)
    one thing still bothers me about a panel display. the light (LED) sources location.
    where does the light have to be to make a FSC LCOS work?

    • I’m not sure exactly what you mean by “where does the light have to be to make a FSC LCOS work?” The light configuration for near-eye FSC LCOS has to be similar to that for the color filter LCOS, only the “white” LED is replaced by 3 LEDs and a series of plastic diffusers and polarizing films. Near-eye does not have to be very efficient, and a plastic film can be used for the beam splitter rather than a glass “cube” beam splitter. The plastic film beam splitter can be curved to make the optical path a bit more compact, and it “moves” where the LED has to be relative to the panel. The FSC panel is a much smaller display than the earlier color filter device, and they found a way to pack it all in.

      • Karl,
        i can’t get my little mind to see where google is placing their light.
        is it possible to use just the display panel light for daylight viewing?

        any guesses as to run time?

        thanks for your time replying

      • Osram certainly makes some of the brightest, most efficient LEDs in the world. But there are many suppliers of LEDs that could be used, since LED performance is not that critical in near-eye.

      • The most common configuration would be like the one for the color filter LCOS in the earlier Google prototype. With a cube beam splitter, the panel goes on one side, the LED on another, and the light goes out the side opposite the panel. If you use a film-type beam splitter, you curve it so that the LED can be on the same PC board as the panel and the curved film aims the light at the panel.

        For the small image in the Google Glass, they only need about 0.25 lumens (1/4th of a lumen) since the light is being directed into the eye. With modern LEDs, that does not require a lot of power. The run time would depend on the battery and other components as well as the resolution, since everything from processing to display has a “power per pixel” component to it; with so little LED power needed, the power of the panel and electronics becomes a significant factor.
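        As a rough worked example of that 0.25-lumen figure, here is a back-of-the-envelope calculation; the LED efficacy and optical efficiency numbers below are purely my assumptions for illustration, not from any datasheet:

```python
# Back-of-the-envelope LED electrical power for ~0.25 lumens to the eye.
# All numbers below are illustrative assumptions, not measured values.
lumens_to_eye = 0.25          # light needed at the eye (from the text above)
led_efficacy_lm_per_w = 100   # ballpark luminous efficacy for a modern LED
optical_efficiency = 0.05     # polarizers, beam splitter, LCOS losses (assumed)

led_electrical_watts = lumens_to_eye / (led_efficacy_lm_per_w * optical_efficiency)
print(f"LED electrical power: {led_electrical_watts * 1000:.0f} mW")  # ~50 mW
```

        Even with a pessimistic 5% optical throughput, the LED draw comes out in the tens of milliwatts, which is why the panel and electronics, not the LEDs, end up dominating the power budget.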


  5. Hello Karl,

    You have made a very interesting analysis, but could you help me understand this technology in a bit more depth?

    I cannot find the technical specifications on the Himax products, but my first question speaks to some of the other requirements which a Google Glass product would need to meet, like surviving on a car dash in AZ in the summer or AK in January.

    Since the Kopin CyberDisplay and IC have faced the rigors of the US Army, I have listed a few of their published technical specifications below.

    Operating Temperature 0°C to 60°C

    Vibration 20 to 2,000 Hz, 6G’s RMS maximum 3 axes

    Brightness up to 2000 nits for sunlight readability and low power consumption for full-day operation

    The integrated imaging solution consumes less than 150 mW at 1000 nits in full video mode and about 100 mW in binary color mode.

    Can the Himax product be expected to meet thermal demands, shock and vibration, high brightness in sunlight and low power consumption?

    My second question deals with whether a product designed for a pico projector can really be considered the optimal design for a glass product. Again, here is a note from a Kopin press release noting a purpose-built design in their CyberDisplay.

    The display, backlight and ASIC have been specially designed to meet the needs for this application.

    Your comments or thoughts would be greatly appreciated.

    Thank you,
