Displays are the key

I'm not much of a hipster, but I will gladly stake my claim to having been into high-DPI displays way before they were cool. Now even Linus Torvalds is in on the game.

If you are a technophile—if you are genuinely pleased by technology's progress—you should feel cheated by the tragic history of desktop display technology. In 2001, IBM released a 22" desktop LCD monitor with a breathtaking resolution of 3840x2400. That was over 200 DPI, putting it in the realm of print material. In 2001! You and I were using Pentium IIIs and Athlons.
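
For the skeptical, the pixel-density arithmetic is easy to check: diagonal pixel count divided by diagonal size. A quick Python sketch, using the 22-inch figure quoted above:

    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Pixel density: diagonal pixel count divided by diagonal inches."""
        return math.hypot(width_px, height_px) / diagonal_in

    print(f'IBM T220 (22", 3840x2400): {ppi(3840, 2400, 22):.0f} PPI')  # ~206 PPI
    print(f'30" panel (2560x1600):     {ppi(2560, 1600, 30):.0f} PPI')  # ~101 PPI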

Eleven years have passed. Not only have we seen no progress in desktop displays; the technology has actually regressed. Today, the industry's best effort is seven-year-old 30" LCD panel technology at 2560x1600. Worse, the price of these unimpressive-but-what-other-option-do-you-have 30" monitors has been frozen at $1,100 for all of those seven years.

The IBM T220 monitor was $6,500 in 2001. A bank-breaker, to be sure. But recall that the first plasma televisions were over ten thousand dollars. Now you can probably find one in your neighbor's recycling bin, buried under soda cans.

The IBM technology should have been embraced and mainstreamed rather than simply discarded in favor of phony "high definition" marketing speak (only when compared to NTSC broadcast television could 1080 rows ever be considered "high definition"). Display manufacturers realized it was easier to keep selling low-density displays at great profit by merely calling them high definition.

To this day, I get twitchy just thinking about how technology's inevitable progress was made so very, uh, ... evitable. Things are finally looking up, but not nearly as much as they should be.

Still, I think a lot of people didn't care that this was happening throughout the 2000s, and they still don't really care now. They know they want the new "Retina" displays on their Apple iProducts because Apple tells them they do. But few stop to ask why we don't have that same technology on the desktop. I appreciate Apple kicking display manufacturers in the rear, but Apple isn't helping matters much by trying as hard as possible to write an end to the desktop computing chapter of history.

I don't agree with Apple's whole "post-PC" mantra. In fact, I feel one of the chief reasons the desktop market has been stagnant for years can be traced specifically to the stagnation in display technology!

HP, Dell, Lenovo, Acer, Asus, and all other desktop "PC" manufacturers, please pay attention. You sell new PCs by convincing people that their current PC is outdated. Simple enough, right? Well, you have a bit of a problem:

  • For browsing Facebook, writing documents in Microsoft Word or Google Docs, and watching YouTube, just about any CPU will do.
  • Unless you play the latest games, any GPU will do. (After all, even the very largest of monitors are only 2560x1600.)
  • Unless you plan to store a massive library of pirated films on your own disks, just about any modern disk will do.
  • Broadband has been fairly stagnant, so the gigabit Ethernet port on most computers is way beyond our meager Internet bandwidth.

Just about all of this would change with ultra-large high-DPI desktop displays.

If, instead of snuffing out high-DPI LCDs in 2001, display technology had been allowed to continue marching forward, where would we be today? I like to imagine that we'd have 50-inch desktop displays with a resolution around 20,000x15,000. But I'll concede that might be wildly optimistic.

Let's go with something more plausible, like 40-inch monitors at 8000x5000 or thereabouts. Surely in eleven years we could have progressed from 3840 to 8000 horizontal pixels, had we not given up, yes?
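
For what it's worth, even that more plausible panel would land squarely back in print territory (a one-line check; the figures are only my speculation above):

    import math

    # Hypothetical 40" panel at 8000x5000 -- speculative figures, not a real product
    print(math.hypot(8000, 5000) / 40)  # ~235.8 PPI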

Such a display would push all the right consumer buttons. You'd need a faster CPU to deal with a windowing environment composed of so many pixels. Obviously, the GPU would need to be top-notch. Disks would need to be larger to store video with such a high resolution. Even ten-gigabit Ethernet would need to be mainstream for moving files of that size.
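
To get a feel for the scale, here's a rough back-of-envelope sketch; the frame rate and the 100:1 compression ratio are assumptions picked purely for illustration:

    # Rough numbers for a hypothetical 8000x5000 desktop display.
    width_px, height_px, fps = 8000, 5000, 30
    bits_per_pixel = 24  # uncompressed 8-bit-per-channel RGB

    raw_bps = width_px * height_px * fps * bits_per_pixel
    print(f'Uncompressed video: {raw_bps / 1e9:.1f} Gbit/s')  # ~28.8 Gbit/s

    hour_gb = raw_bps / 100 / 8 * 3600 / 1e9  # assume a generous 100:1 codec
    print(f'One hour at 100:1 compression: {hour_gb:.0f} GB')  # ~130 GB

Even generously compressed, a single hour of such video runs to well over a hundred gigabytes, and the uncompressed stream alone would saturate multiple 10GbE links.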

I would say the desktop is most certainly not dead. It has been neglected by hardware manufacturers all too willing to abdicate any responsibility to innovate and simply follow Apple's lead.

Television should not be the driver of large form-factor display technology. Desktop PCs should be pushing display innovation. We should be dealing with Minority Report- or Avatar-style desktop displays. Okay, not the transparent glass, but large form factor with print-grade density. The transparent glass just makes for exciting visuals in a science-fiction movie.

When monitors were monitors and televisions were televisions, monitors saw real innovation. When the two converged, monitors were squeezed into the specifications that made sense for a living room, not a desktop computer.

This should be reversed. Monitors are for desktop viewing at a distance of one to two feet. They are for displaying data. Lots of data. They are for interaction. They should be large and high resolution. 2560x1600 is a joke. 1366x768 is not even a joke; it's just plain wrong.

Thank you, Apple, for righting the display industry at small form-factors. Now, would someone please pick up the mantle and get the job done on the desktop already?

We've been waiting eleven years.