The only way to save the PC is to kill it

The assumption that PCs will always be around seems intuitive. However, I'm having doubts. When I attended Comdex in November 1999 and saw the latest and greatest PCs, I realized how similar the new machines are to those of previous years, even to the first PCs of 1981. Although the PC business evokes an image of constant innovation, this image is an illusion. The field is stagnant in many ways, and this stagnation might be paving the way for big changes.

Between 1981 and 1987, PCs changed rapidly—moving from hybrid 8-bit and 16-bit 8088 CPUs to purely 32-bit 386 CPUs, from strictly textual displays to VGA, and from 160KB floppy storage to ubiquitous hard disks and 1.44MB 3.5" disks. But not much has changed since 1987. The standard disk is still 1.44MB, and VGA is still the baseline video mode. The state-of-the-art Intel CPU (i.e., the Pentium III processor) is a revved-up 386 with an enhanced instruction set. Intel introduced the 32-bit 386 in 1985, and 15 years later we're still waiting for the 64-bit future that the company promised with its Itanium (formerly code-named Merced) chip. And although internal CPU clock rates appear to have increased markedly, the external clock rate—the speed at which the CPU talks to the rest of the system—hasn't kept pace. The 386's external clock rate reached 33MHz about 10 years ago, and the so-called 500MHz Pentium III processor's external clock rate is only three times that, at 100MHz.

This reduced pace of innovation isn't entirely Intel's fault. When a company designs a new generation of CPUs, the most important feature the new CPUs must have isn't faster speed, a bigger data path, or a beefier instruction set; it's backward compatibility. And having to keep CPUs backward compatible limits Intel's innovation options more as time goes on.

Being a prisoner of past market success isn't a new phenomenon in the computer business; for example, this scenario hurt IBM terribly in the 1970s. The 360/370/390 line of mainframes was so popular that IBM didn't dare change too much with each model, except to increase processor speed. But IBM realized that just doing the same thing repeatedly (i.e., creating ever-faster code museums whose prime directive was to run programs written years ago) would doom the company to a shrinking market share. Thus, IBM initiated the Future Systems project, which was supposed to create a superior processor architecture, although at the price of backward compatibility. Customers responded to this offering with one voice, saying "no thanks" to a better but incompatible system. IBM subsequently scrapped the Future Systems project.

Fortunately, this story has a happy ending. IBM eventually revived the Future Systems project with the popular AS/400 series of computers, thus creating a new market for itself (in addition to the 360/370/390-type systems). But the customers who now use the AS/400 line of computers wouldn't have those computers if IBM had tried to simply evolve its mainframes into AS/400s. The company had to create the AS/400 through a revolution rather than an evolution.

The Wintel PC standard has served desktop users as well as the 360/370/390 standard has served businesses. But the Wintel standard has limited our options for innovation on the desktop. Soon we might grow so tired of those limitations that we're willing to accept the cost and trouble of adopting a new platform—a revolution instead of an evolution.

Suggesting that the PC's days are numbered might seem ludicrous today, but change happens quickly in this business. The Internet went from geeks-only to open access in a staggeringly short amount of time, and 3Com has sold a lot of PalmPilots recently. You might start to wonder whether holding on to your Dell, Gateway, or Compaq stock is such a good idea.