24-bit color sucks

I love technology. But in this series on technology that sucks, I use hyperbole for entertainment and to point out areas that have stalled out. Stalled technological progress sucks.

24-bit color depth is called True Color, a laughably brash marketing term, and probably a remote ancestor of today's High Definition.

I say a remote ancestor because True Color is quite old. It was an undeniably great achievement when it arrived in the 1990s.

Still, superior color depth using 30 bits and more (what is now referred to as "Deep Color") arrived in the late 1990s for high-end applications.

Deep Color has yet to arrive on desktop PCs. Rationalized by easily disprovable research from the 1970s, 24-bit color is accepted as "good enough." Academics have told us that human eyes can't distinguish between the 16,777,216 colors provided by 24-bit depth, so we believe it, even though it can't be true.

Witness:

[Demo: horizontal gradient from #484848 to #494949. Hovering in the box reveals a guide line at the step.]

Above is a linear gradient rendered by your browser between two immediately adjacent shades of gray. If you don't see the step, hover over the box to see a hint. Warning: once seen, it cannot be unseen!
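If you want to reproduce the demo yourself, it amounts to something like this (a sketch; the ".demo" selector and the markup around it are made up, not this page's actual code):

```ts
// A one-step gradient between adjacent 8-bit grays, as in the demo above.
// ".demo" is a hypothetical selector; the real page's markup may differ.
const box = document.querySelector<HTMLDivElement>(".demo")!;
box.style.background = "linear-gradient(to right, #484848, #494949)";
```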

Although a gradient over such a narrow color range is useful for this illustration, the problem is evident in many gradients. You can even see the stepping in the background of this blog entry. The steps can become especially visible when the content moves, such as when you scroll the page.

Frustrated by the 15-some-odd years of color depth stagnation, I even submitted a Firefox feature request at Mozilla asking for a dithering algorithm to be added to the gradient rendering. Dithering should be familiar to anyone who used computers in the 80s and early 90s when you only had a few shades to work with. It helps compensate for a shallow color depth.
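For the curious, here's roughly the kind of thing I asked for. This is a sketch of my own, not anything from Firefox's actual renderer: compute each pixel's ideal value in floating point, add a little random noise, then round to 8 bits, so the quantization error scatters into fine grain instead of piling up into visible bands.

```ts
// A dithered-gradient sketch (my own illustration, not Firefox code).
function drawDitheredGradient(canvas: HTMLCanvasElement, from: number, to: number): void {
  const ctx = canvas.getContext("2d")!;
  const { width, height } = canvas;
  const img = ctx.createImageData(width, height);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const ideal = from + (to - from) * (x / (width - 1)); // exact shade, pre-quantization
      const gray = Math.round(ideal + (Math.random() - 0.5)); // noise hides the rounding step
      const i = (y * width + x) * 4;
      img.data[i] = img.data[i + 1] = img.data[i + 2] = gray; // R = G = B
      img.data[i + 3] = 255; // opaque
    }
  }
  ctx.putImageData(img, 0, 0);
}

// Try it on the same two grays as the demo:
// drawDitheredGradient(document.querySelector<HTMLCanvasElement>("canvas")!, 0x48, 0x49);
```

At normal viewing distance the grain is invisible, but the hard step between adjacent 8-bit levels is gone.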

And 24-bit is undeniably shallow in 2012. We deserve Deep Color on desktops. 24-bit color sucks.

Before the iPhone 4 and iPad 3 demonstrated that fanatics pushing for high-density displays were righteous in their fight, many rationalized the prevailing indolence of the display industry as good enough. Now that "Retina Display" has entered popular vernacular, those same people realize low-density was not, in fact, good enough. They won't go back to the iPad 2. What a piece of junk.

With high-DPI displays, the aliasing problem caused by insufficient pixels has been nearly extinguished. Finally. (At least on some devices; desktop displays still desperately need attention.)

Similarly, deep color would help address yet another aliasing problem: insufficient color clarity and accuracy.

We can spare the bits! Modern GPUs come with two gigabytes of on-board memory. That's enough to store 268 million 64-bit pixels. Do you have a display with 268 million pixels?

If so, I envy you.
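(In case you want to check my math, here's the back-of-the-envelope version:)

```ts
// 2 GiB of VRAM at 8 bytes (64 bits) per pixel:
const pixels = (2 * 2 ** 30) / 8; // 268,435,456 -- roughly 268 million pixels
```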

I received some great feedback from readers. Most importantly, it was pointed out that some low-cost LCD monitors down-sample to an even worse color depth such as 18-bit or lower. In fact, some readers were not able to see the difference in the colors above precisely because their monitor removed the difference by down-sampling.

I imagine them leaning in real close and saying, "What the? Am I being trolled here?" No, you weren't. Well, not by me. You have been trolled by your monitor's manufacturer.

That's really a shame. It's a shame that monitor manufacturers have considered 24-bit color a premium feature. And it's a shame that so many people live with even worse color depth, and in some cases, don't even know it!
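To see why the demo can vanish entirely on such a panel, consider the crudest form of down-sampling, truncation to 6 bits per channel (real 18-bit panels usually add their own dithering on top, so this is only a sketch):

```ts
// An 18-bit panel keeps 6 bits per channel: 64 levels instead of 256.
const to6bit = (v: number) => v >> 2; // drop the two least significant bits
console.log(to6bit(0x48)); // 18
console.log(to6bit(0x49)); // 18 -- both demo grays land on the same level
```

Both sides of the gradient collapse to one panel level, so there is literally nothing left to see.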

Florian Bosch made a bunch of great points in an e-mail, chief among them that I was being overzealous when I claimed that high-DPI displays have made aliasing problems a relic of the past. Several contexts remain where aliasing is a big problem, such as 3D rendering. It's a good point, and I've tempered the language above in response.