Utility computing is one of the hottest buzzwords currently circulating through the IT community, and quite possibly one of the most misused. Like Microsoft's ill-defined .NET initiative a couple of years ago, utility computing is hidden in a cloud of confusion. (For an update on Microsoft's recent clarification of .NET, see "Is .NET Dead?" November 2003, http://www.winnetmag.com, InstantDoc ID 40464.) Everyone you talk to has a different idea of what utility computing means, and those competing ideas make it difficult to determine whether utility computing is a myth or a viable reality.
Recently, I talked with representatives of several large companies (IBM, HP, Microsoft, and others) about their perspective on utility computing. Those discussions helped me identify two visions of utility computing.
At One Level, Nothing New
Several companies view utility computing at its most fundamental level. In that framework, the goal is simply to make computing resources as available and reliable as the services that you get from your water and electric companies. This concept posits a computing environment (i.e., hardware, network, and software) consisting of commodities that you can depend on just as you depend on illumination when you switch on a light. Under this definition of utility computing, you should be able to expect the services that your computing environment provides to be available at all times without fail.
This broad definition covers just about anything (e.g., using a UPS) that makes your computing resources more available. When described this way, utility computing is a reality, but it isn't anything new. All vendors strive to make their offerings reliable, and improvements in reliability, although certainly welcome, are well within the current computing paradigm. Many vendors like this description and use it simply to associate their products with the latest buzzwords in the ongoing game of marketing one-upmanship.
At Another Level, Beyond Current Capabilities
A more ambitious objective for utility computing exists—one that companies such as IBM and HP embrace as the goal for enterprise-level organizations. This conceptualization of utility computing moves beyond mere availability to include shifting resources dynamically across platforms so that capacity efficiently matches computing demand.
The basic goal of this vision of utility computing is to make more efficient use of the computing resources that you have. In this conceptualization, as more users require an application, the application can seamlessly move to a more powerful platform. A network management component would maximize bandwidth by dynamically and intelligently reconfiguring all computing resources—even the network itself—on the fly at the switch level.
Although the overall concept and goal of this vision are clear, achieving the goal will prove quite difficult. Above all else, this idea of utility computing requires a sophisticated management infrastructure. This infrastructure needs to monitor network resources (including applications) and share network usage information with an environment management layer that has knowledge of the rules that you set up to control dynamic reconfiguration. IBM and HP have started down this path with their current management offerings, but the goal is still far off.
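To make the idea concrete, here's a rough sketch in miniature of the kind of rule-driven reallocation this management infrastructure would perform. This is purely illustrative, not any vendor's actual design: the Platform, App, and EnvironmentManager names, the abstract "compute units," and the best-fit placement rule are all invented for the example.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Platform:
    name: str
    capacity: int              # abstract compute units this platform offers
    load: int = 0              # units currently reserved

    def headroom(self) -> int:
        return self.capacity - self.load

@dataclass
class App:
    name: str
    demand: int                # compute units the app currently requires
    host: Optional[Platform] = None
    allocated: int = 0         # units reserved on the current host

class EnvironmentManager:
    """Monitors demand and applies one simple rule: best-fit placement,
    migrating an app to a bigger platform when demand outgrows its host."""

    def __init__(self, platforms: List[Platform]):
        self.platforms = platforms

    def place(self, app: App) -> None:
        if app.host is not None:           # release the old reservation
            app.host.load -= app.allocated
            app.host = None
        fits = [p for p in self.platforms if p.headroom() >= app.demand]
        if not fits:
            raise RuntimeError(f"no platform can host {app.name}")
        target = min(fits, key=Platform.headroom)   # best fit
        target.load += app.demand
        app.host, app.allocated = target, app.demand

    def rebalance(self, apps: List[App]) -> None:
        # Re-place any app whose demand no longer matches its reservation.
        for app in apps:
            if app.host is None or app.demand != app.allocated:
                self.place(app)

small, big = Platform("small", 10), Platform("big", 100)
manager = EnvironmentManager([small, big])
crm = App("crm", demand=8)
manager.place(crm)          # best fit: lands on "small"
crm.demand = 40             # more users; demand outgrows "small"
manager.rebalance([crm])    # migrated to "big" on the fly
```

A real implementation would, of course, have to do this across heterogeneous hardware and live network switches rather than in-memory objects, which is precisely why the goal remains far off.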
Currently, Microsoft is only talking about this concept of utility computing. The Dynamic Systems Initiative (DSI), which Microsoft announced at this year's Windows Hardware Engineering Conference (WinHEC) in May, speaks to enabling the network to automatically allocate and optimize hardware according to computing demands. But DSI is currently more a statement of future direction than an actual product offering. Microsoft must make more progress with its system management tools before it can really be a player in the utility computing market.
A Future Technology
This ambitious vision of utility computing is a radical step beyond the computing capabilities that are available today: Most companies are still struggling in the early stages of attempting to implement basic network management and monitoring functions. In most cases, hardware and software platforms are too heterogeneous and function too independently to readily lend themselves to this much control and automatic interaction with management platforms. Today—advertising campaigns notwithstanding—this sophisticated definition of utility computing is closer to being a myth than it is to being a reality.
True utility computing remains a distant goal. But the IT computing landscape has a way of changing quickly, and utility computing is an emerging technology trend that bears watching.