stating the obvious
Some of you may think I'm nuts, but one day I actually want to go to Comdex.
I can only imagine the wonder of it all... Bright lights! Big names! New products! Earth-moving keynotes! Craps tables! OK, maybe it's just that I've never actually been to Vegas, but something about the mixture of neon, consumer electronics, cheap booze, gambling and Elvis impersonators makes me thrilled to be an American.
Instead, I'm stuck with the web-based dregs...slowly sipping the news headlines (read: press releases) off the wire. The real pundits have been saying for years that nothing exciting happens at Comdex anymore, but I'm convinced they say that just to keep the fake pundits away. I mean, something big's gotta be happening when Bill Gates and Masayoshi Son sit down to hear a speech by Andy Grove. Something really big, right?
Like something on the order of 10 GHz?
The paranoid one is at it again. At Comdex on Monday, Grove was doing what he does best -- proclaiming visions of silicon glory, with a 200-fold increase in processing power by the year 2011, fitting 1 billion transistors on a chip and flying through 100,000 million instructions per second. Not one to put the cart before the horse, however, Grove admits that they have to produce this chip...not just for the consumer's sake, but to keep their company alive. "In the future, [chip] plants are going to cost $10 billion to build, we have to figure out how to keep these plants filled. Intel has to help lead development of technology to find the compelling PC technologies which would keep these humongous plants humming along building millions and millions of chips," said Grove.
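For what it's worth, Grove's 200-fold figure lines up with a straightforward Moore's-law extrapolation. A back-of-the-envelope check (my own arithmetic; the doubling periods are my assumption, not anything Grove stated):

```python
# Sanity check on Grove's 1996 Comdex projection of a ~200-fold
# increase in processing power by 2011 -- 15 years out.
# Doubling periods below are assumed, not quoted from the keynote.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Multiplicative growth after `years`, doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

years = 2011 - 1996  # 15 years from the keynote to the prediction date

# Doubling every 24 months lands almost exactly on Grove's number:
print(round(growth_factor(years, 2.0)))   # 181 -- roughly the "200-fold" claim
# The more aggressive 18-month doubling would overshoot considerably:
print(round(growth_factor(years, 1.5)))   # 1024
```

So the prediction assumes the slower, two-year doubling cadence holds for another decade and a half.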
What are we going to do with all this processing power, you ask? Well, Grove has an interesting answer...and one that we need to pay attention to. In perfect paranoid form, Grove reports: "My paranoia is that some day the [PC] cycle may sputter and applications may get boring. This is a battle for the eyeballs of the consumer."
For Grove, going after the eyeballs of the consumer means video. And not just TV-quality video on your PC, but an experience that's better than television. A three-dimensional "lifelike," networked, interactive experience.
Sounds wonderful, if you're into that sort of thing. But there's a catch. Some sort of logic would dictate that for us to be living, working, breathing and (most importantly) consuming in this 3D, interactive, lifelike, networked world by the year 2011, we'd have some sort of primitive version of it here, now, on the information superhypeway we call the Internet. I mean, heck...I may not have a 10-gigahertz chip, but my 133 MHz Dell at least lets me imagine what's possible.
So, where's the nascent digital economy that will eventually bloom into the full-blown digital nirvana that Intel needs in order to survive? Shouldn't we be seeing the beginnings of that world here and now? Where are the digital products? Sure, you can order things online. Books. Records. Flowers. But what have you had delivered digitally to your desktop? A subscription to the Wall Street Journal? An occasional piece of software? Does this constitute a digital economy? Where's the "convergence" we were promised?
The answer is plain and simple, really. The digital economy is being held up by the lack of widely used and trusted copyright protection schemes. Imagine you're a software provider. For you, the net is just another distribution network, albeit with a fairly thin pipe to the consumer. But thanks to the zealous efforts of the Software Publishers Association, you can sell software on the 'net without worrying too much about piracy. Now imagine that you're the Wall Street Journal. You're not terribly worried about protecting your content, because it has limited value outside of your database and search engine. Sure, an article or two may get lifted from wsj.com and posted on an email list, but all that's doing is building your brand. Pretty nifty, huh?
But imagine for a moment that you're Disney. And you sell images. Images of lions and mice and spotted dogs. To you, convergence means that your images can be reproduced digitally, all over the planet, at the touch of a button. Sure, Negroponte may argue that bits are bits are bits, but some bits are different than others. The key difference between Microsoft bits and Disney bits is that the core value of Microsoft bits isn't splayed all over your monitor in 16 million colors. You can take screenshot after screenshot of Microsoft Excel and it won't do you any good -- you won't have a copy of excel.exe. But take screenshot after screenshot of The Lion King, at say, 24 frames per second, and by golly, you might just have something of value there.
The thing is, Grove is absolutely dead on. For companies like Intel to make it over the long haul, they need to keep building plants to keep building chips to keep driving more advanced applications. And increasingly, those applications will have to be entertainment-based, since there's only so much processing power you need to throw at a spreadsheet or a word processor, or even a web browser, for that matter. The key for Grove is eyeballs; he said so himself. And what's he competing with for eyeballs? Television. But without adequate copyright protection and payment schemes, you can bet your bottom dollar that the major media outlets aren't going to start digitizing their corporate assets and plastering them all over the Internet. And a standard for protecting intellectual property isn't going to emerge without a critical mass of users.
Usually, I'd say that the consumer is the one that's missing breakfast in this chicken and egg scenario. But you know what? Consumers have television. The only folks that are aching for the convergence are the technology companies...