HD sucks

I love technology. But in this series on technology that sucks, I use hyperbole for entertainment and to point out areas that have stalled out. This rant is from 2012. Since then, thanks to the arrival of new companies from Korea and China, affordable 4K is now here and looks to get better in 2014. The point still stands that we lost about a decade of innovation at the hands of HD and its marketers.

High Definition. Using those words to describe a display with a little over 1,000 rows of resolution is even more misleading than calling 24-bit color "True Color."

In the 1990s, I chafed at tuning web sites for 256 colors. Anyone suffering with 256 colors, so I claimed, was accustomed to the web and everything else on their computer looking terrible. Same for the poor folks using 800 x 600. Oh, and at 60Hz! Just thinking about that 60Hz flicker gives me pocket-monster style seizures.

In my mind, those users were hopeless. They were blissfully ignorant. So why fight? In defeat, I wanted to give them what they had come to love: horizontal scrollbars and that psychedelic palette juggling browsers would do.

Wishful thinking, of course. I had to support them just the same.

But to keep my smug on, I cranked my 17" CRT to 1400 x 1050 x 24bpp, and by 1999, a 19" CRT to 1600 x 1200. Both beyond their recommended resolutions.

In my imaginary ideal universe, monitor sizes and resolutions would have steadily increased since then. In fact, they did progress, for a brief time. Laptops from that era offered 1600 x 1200 at 15". The 2004 Toshiba Portege M200 tablet had 1400 x 1050 at 12". And it was a touch-screen. Even more impressive was the IBM T220 that leapt to 3840 x 2400 in 2001. Sadly, six kilodollars was beyond my budget. I am still filled with regret.

Progress stopped and technology regressed when the displays of the living room converged with those of the workspace. The living room is better for it, having broken the shackles of NTSC. But the desktop display was wounded badly and is still down for the count.

Having converged these two display contexts, manufacturers convinced us that "HD" was good enough in all contexts. 16:9 aspect ratio? Good in all contexts. Obviously! All contexts are watching films, right? Right.

If you don't agree, you're apparently crazy.

Just yesterday, I sampled the specs of dozens of monitors at Fry's Electronics. Every single one topped out at a vertical resolution of 1080 or lower. You can buy a 27" desktop monitor with only 1,080 lines. I can't really express how dumbfounded I am by that.

The "HD" moniker has proven massively successful with consumers. Rather than explain why a 27" monitor has the exact same number of pixels as a 21", the manufacturers just tell us it's HD to shut us up. "Yeah, but HD. H ... D!"
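To put numbers on that: pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch, assuming both panels are 1920 x 1080 (the 27" and 21" sizes come from the paragraph above):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The same 1920 x 1080 panel at two sizes:
print(round(ppi(1920, 1080, 21), 1))  # ~104.9 PPI
print(round(ppi(1920, 1080, 27), 1))  # ~81.6 PPI -- bigger, but coarser
```

Same pixel count, so the bigger screen is strictly blurrier per inch. That is the trade "HD" quietly made for us.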

Adding insult to injury, 1080 is still sometimes called "1080p" to indicate progressive scanning, as if computer monitors ever used interlacing (sorry, Amiga fans!).

HD sucks! It has been a bane of desktop displays for years, and it shocks me that it has such a stranglehold on every manufacturer. Those who do venture beyond HD into the charted-then-uncharted realms of 1200+ lines do so for astonishing price premiums.

Instead of giving us resolution, display manufacturers twiddled away years fussing with the living room to bring us 3D glasses. That would have been fine, of course, had it not served as an excuse to pause innovation elsewhere. I still remember thinking OLED was "5 years away" in 2004. Ah, OLED, how I long to see you in my life.

HD has neutered the desktop display industry. But that just means the display industry is ripe for the taking. I am optimistic that some firm out there wants to earn my money by innovating on the desktop. Please.