Boring technology

I love technology. But in this series on technology that sucks, I use hyperbole for entertainment and to point out areas that have stalled out. Stalled technological progress sucks.

I don't read much TechCrunch, but Michael Arrington's rant caught my eye while skimming Hacker News headlines. Mr. Arrington says he's bored with technology in 2013.

I've used similar hyperbole: technology sucks. Exaggeration aside, I don't hate technology, and I'd guess neither does Arrington, but I don't like much of what we're doing with technology today. In many areas of interest to me, technology has become stagnant. Areas where I'd most like to see innovation are being neglected in favor of areas I personally find quite silly.

Everyone has said their piece about Instagram and its acquisition, so I'll keep this bit short. Instagram's acquisition, its very existence and tremendous popularity, altogether embodies 2012 and the few years leading up to it. Where I want greater fidelity of digital imagery (deep color, higher-resolution displays, migration away from lossy compression), a culture that welcomes and rewards Instagram passively disagrees with me. It would be hard to create something that I find less appealing than Instagram without being outright malicious.

I'm sure they've innovated in some technical areas as their scale grew, so even R&D in areas I find wholly uninteresting helps propel technology forward in some general fashion. But I can't help but characterize the whole enterprise as a misallocation of resources.

That's emotion, though. Cognitively, I don't actually think that. Consumer reaction to Instagram and many other similarly boring-to-me services and technologies has been very positive, so clearly the allocation of resources has been proper from the perspective of the market. It's simply that my opinion represents a minority view with very little influence.

It's therefore fascinating to witness someone with strong influence—Mr. Arrington—put into words thoughts that are vaguely similar.

Arrington makes this excellent point:

Yeah, the iPhone and Android are great. But seriously, look at the top headline grabbers in tech news in 2012. Apple. Google. Facebook. Microsoft. Christ. It might as well still be 2007.

The technology press is obsessed with fashionable "teams," be they companies, sub-industries (social, mobile, cloud), or even the processes by which new ideas and firms are to be accepted. There even seems to be a process to disrupting things now.

It's easy to point to the science-fiction standards that evade us still, such as flying cars and replicators, but these high-profile examples represent hard problems that probably require steady progress across many disciplines (in other words, lots of time and effort). Among other things, flying cars may need a few more Teslas to lead the way.

I'd rather look past those and discuss simpler areas that have been neglected or side-lined.

Technology serves the person

Apple decreed this the Post-PC era. And indeed, the personal element of computing is on hiatus.

The ideal I want: virtualized, private, end-to-end encrypted networks that connect my devices and hold my data and applications, with ingress and egress points managed by me and my family (e.g., for sharing photos), and a federation of trusted peers providing encrypted backup capacity.

For example: I want my cell phone to be on my private data network, always.
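As a rough sketch of what that could have looked like with tools that already existed around then, here is a minimal OpenVPN client profile for a phone pointed at a home gateway. The hostname, port, and certificate file names are placeholders of my own, not anyone's recommended setup:

    # Minimal OpenVPN client profile for a phone (illustrative; names are placeholders)
    client
    dev tun
    proto udp
    remote vpn.example.home 1194
    resolv-retry infinite
    nobind
    persist-key
    persist-tun
    # Per-device certificate and key, issued by a personal certificate authority
    ca ca.crt
    cert phone.crt
    key phone.key
    cipher AES-256-CBC
    # Route all traffic through the personal network, not just home addresses
    redirect-gateway def1
    verb 3

The pieces to do this have existed for a decade; what never arrived is the packaging that would let a family actually run it without effort.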

In some ways this is putting protocols over products, but that's too reductive. If you continue reducing it, it's "the web," after all. So to clarify: what I would have preferred is a maturation of the web as it stood 5-10 years ago.

I would have preferred that embryonic efforts of the early 2000s, such as personal VPNs, had been more fully embraced and matured to the point that they were simple to set up and use, easy to upgrade, and gave their owners personal control.

What we have instead is a world where social trumps personal. Obviously, social isn't bad, but it serves as all-too-easy a wedge to give users pseudo-personal solutions that are not, in fact, personal. Instead of storing my photos on network-connected devices I own and control, I select a provider for this service. Instead of storing my music on my devices, I select a provider for that service too. It seems superficially personal, but it's more about aggregation, pattern analysis, user friction, and monetization—be it plain or fumbling.

Clearly, asking users to manage their own digital world while promising that it will be easy is a difficult problem. It's much harder than just saying, "this is a tough problem, why not let us handle this whole photo storage and sharing thing for you, top to bottom?" In other words, it's obvious that building a set of federation protocols and implementations that equal Facebook's ease of use is considerably more difficult than building Facebook itself.

Still, I don't think it's simply a matter of difficulty. It's now a matter of preference by vendors. I find my Nokia Lumia 920 doesn't even join my Windows HomeGroup, meaning it doesn't play my music, but rather Microsoft's music via the "Xbox Music" application. It would have been relatively effortless for Microsoft to make Windows Phone 8 join a Windows 7/8 home network. But better (for Microsoft, I guess?) to convince me to use the Microsoft cloud.

Right?

Convenience trumps security, control, privacy, and personality. It's easier to build for convenience in lieu of the others. But eventually, I'd like to see all of these goals realized, together.

We want personal continuity

I can't speak for everyone (and given what I said earlier, I shouldn't even try). Still, I suspect that most of us clamor for more continuity from our technology.

Today, we achieve a measure of continuity from what we now call cloud service providers. Web-based e-mail beats out traditional e-mail for many of us because it's available everywhere. Amazon Cloud Player beats personal storage of music because it's available everywhere (well, assuming you fully adopt it and upload your entire library). Evernote trumps OneNote because it's available everywhere.

Trouble is, continuity still largely sucks:

  • Every additional device we add to our life is another potential inconsistency. Different applications, different data, different capabilities.
  • Because of that inconsistency, we seek out cloud providers to synchronize the data available across our various devices. Either we attempt to find a best-of-breed offering in each space and suffer the consequences of vendor strife (Evernote vs Gmail vs Amazon Cloud Player vs SkyDrive vs ...), or we concede domination by a single provider (Gmail, Google Docs, Google Drive, ...).

Rather than continuity we have fragmentation. We solve that with synchronization, but synchronization sucks. So sometimes we don't bother with synchronization and instead select Let-Me-Handle-That-For-You (see above), which sucks for different reasons.

Life wasn't perfect when we were responsible for our own files and applications, but I'd rather see my documents as things ("files") that I control, and choose applications to work with them. Once selected, I want to use the same application in every context.

Comparing 2013's cloud services with 2003's personal data management is a little unfair. I'd ask that if you disagree, you at least consider the alternate universe where personal data management progressed uninterrupted for the past decade, and the R&D that went into major vendors' cloud products was diverted instead into personal network and data management. Maybe we'd be in a better place—all things considered—than we are today. I, for one, would likely be happier.

I've described a theoretical application and computing model I've called personal application omnipresence where continuity is paramount.

Innovate on the desktop

I've beaten this horse too many times. But the basic gist is this: large, high-resolution displays are marginalized. Stop this nonsense. The rest would fall into place around large displays, which would be the first step forward on the desktop in years.

The desktop PC isn't neglected by users because they somehow don't do work at desks, but because the desktop PC's technological progress has stalled.

Stop tolerating “good enough”

In a recent e-mail conversation with a reader, I illustrated my frustration with these data.

Consumer bandwidth available to me ("available" meaning affordable for a consumer in my community within Los Angeles):

Year    Bandwidth    Versus previous decade
1990    14.4 Kbps
2000    1.5 Mbps     100x
2010    ~15 Mbps     10x

I wanted 150Mbps by 2010.

Storage capacity available to me:

Year    Capacity     Versus previous decade
1990    ~200 MB
2000    ~100 GB      500x
2010    ~2 TB        20x

I would have expected mainstream desktop storage to be approximately 20TB by today.
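To make the comparison concrete, here is the arithmetic behind those two tables as a small Python sketch. The figures come straight from the tables above; the projections are simply what the 1990s growth rates would have implied had they held:

    # Decade-over-decade growth implied by the tables above, and what the
    # 1990s pace would have meant for 2010 had it continued.
    bandwidth = {1990: 14.4e3, 2000: 1.5e6, 2010: 15e6}  # bits per second
    storage = {1990: 200e6, 2000: 100e9, 2010: 2e12}     # bytes

    def decade_growth(series):
        years = sorted(series)
        return {later: series[later] / series[earlier]
                for earlier, later in zip(years, years[1:])}

    print(decade_growth(bandwidth))  # {2000: ~100x, 2010: 10x}
    print(decade_growth(storage))    # {2000: 500x, 2010: 20x}

    # Had the 1990s rates held for another decade:
    print(bandwidth[2000] * 100)  # 150 Mbps by 2010, the figure I wanted
    print(storage[2000] * 500)    # 50 TB by 2010, versus the ~2 TB we got

Nothing here is precise; the point is only how sharply both growth rates fell in the second decade.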

I contend the incentives to innovate on bandwidth, storage, and other dimensions of computing progress have diminished because demand has diminished. In both bandwidth and storage, we've reached a point where a majority of users consider the service "good enough."

Trouble is, it's not good enough to deliver uncompressed (or substantially less-compressed) truly high-resolution video to the home, or to enable any number of other innovative uses of higher bandwidth, higher capacity, and so on. That's why the slow-down in progress is disappointing: we don't even know what we're missing out on.