I say it over and over again. The Achilles heel of most of the innovation we are seeing in computing is how broadband develops to keep pace with the demands we are placing on the pipes. To be honest, I’m not overly optimistic that the pipes will ever be big enough to keep up. The cost of improving networks is apparently so high that the broadband providers, even while upgrading their systems, can’t keep up with current demand. They haven’t figured out how to deal with the current reality, much less any future potential. Apple and AT&T’s iPhone story is going to be an interesting look back years from now when it comes to examining those issues.
So, when I read this piece on Read/Write/Web claiming that in eight years we’ll be measuring video content on the network in exabytes (that’s a billion gigabytes if you’re counting), I have to shake my head and wonder if we’re talking 8 years, 18, or 80. Sure, technological advances happen in huge jumps, but at the current content-versus-delivery pace I can’t see it happening that quickly. Then again, maybe I’m just being overly cynical.
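For what it’s worth, the “billion gigabytes” aside checks out. A quick sketch of the unit conversion, assuming decimal (SI) prefixes rather than the binary (exbibyte/gibibyte) ones:

```python
# Decimal (SI) unit definitions — an assumption; binary prefixes would differ.
BYTES_PER_GIGABYTE = 10**9   # 1 GB
BYTES_PER_EXABYTE = 10**18   # 1 EB

# How many gigabytes fit in one exabyte?
gb_per_eb = BYTES_PER_EXABYTE // BYTES_PER_GIGABYTE
print(f"{gb_per_eb:,} GB per EB")  # 1,000,000,000 GB per EB — one billion
```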
With the current pipes we have, broadband providers are already looking for ways to cap or charge for how much data is used. This smacks right up against business models like online backup services, LiveMesh, and others that require always-on connections to really take advantage of them. So, if we’re talking exabytes of video coursing through the network, it isn’t just the width of the pipes that will have to change; how the flow through those pipes is paid for will matter just as much.
I’d love to know your thoughts here.