A couple of months ago, I editorialized about the power-saving benefits of thin-client devices. I might have been mistaken, however, because at the time I didn't own a power meter, and I based my estimates of power draw on manufacturers' electricity ratings rather than on readings of actual power use. Many of you liked the article, but some of you questioned whether PCs and thin-client devices show any appreciable difference in power use. One reader told me flatly that such a difference doesn't exist (without providing any actual numbers, unfortunately).

Because I couldn't find hard numbers that reflect actual device-power use rather than power ratings, a couple of cohorts and I decided to run some tests. Using power meters, we compared the power use of Windows terminals (and their supporting servers and hardware) to that of PCs in a live environment. You can check out our results (free!) online at NCD's Web site; you'll see why we conducted the study and how we performed it, and you'll get a complete look at our results and conclusions. In the meantime, here's a capsule summary of our findings.

Terminals Use One-Seventh the Power of PCs Doing the Same Job
We measured the power draw of PCs and Windows terminals (without monitors) running applications from a terminal server. The PCs drew a constant load of about 69 watts; the Windows terminals drew about 10 watts.
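The heading's one-seventh figure follows directly from those two measurements. Here's a quick sketch of the arithmetic; the wattages are our measured values, but the round-the-clock duty cycle is an illustrative assumption:

```python
# Measured constant draw, without monitors (from our tests)
pc_watts = 69
terminal_watts = 10

# The ratio behind the "one-seventh" headline
ratio = pc_watts / terminal_watts  # ~6.9

# Illustrative annual energy per device, assuming 24x7 operation
hours_per_year = 24 * 365  # 8,760 hours
pc_kwh_per_year = pc_watts * hours_per_year / 1000              # ~604 kWh
terminal_kwh_per_year = terminal_watts * hours_per_year / 1000  # ~88 kWh

print(round(ratio, 1), round(pc_kwh_per_year), round(terminal_kwh_per_year))
# prints: 6.9 604 88
```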

Monitors Suck Power
Not surprisingly, the lion's share of power use came from CRT monitors, which sucked down about 85 watts each. Sadly, we didn't have any LCD panels to test (anyone want to give me one?), but we did test a prototype Windows terminal with an integrated LCD panel. That machine, display and all, drew about 29 watts.

Less Powerful PCs Don't Significantly Decrease Power Use
We ran the live tests with older computers (because that's what the company we tested used), but I performed a lab test on a 1GHz computer with 256MB of RAM to see how much power it drew. The monster desktop used a little less than 70 watts of power when it ran applications from a terminal server and 85 watts when it ran a compute-intensive application locally. In other words, newer PCs don't use more power than older ones, nor does using an old PC save on power costs.

Server-Based Computing Saves Power
One reader asserted that the power draw of terminal servers made up for any power savings realized from using Windows terminals. We checked and discovered that this statement isn't true. A fully loaded Windows terminal server does use more power than even a powerful desktop PC. However, because the terminal server let us run client devices that use a fraction of the power of PC desktops, we found that we broke even with a fairly small number of clients, and that the larger the network (and the higher the local power rates), the more money we saved.
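To see how quickly the client-side savings overtake the server's overhead, consider a back-of-the-envelope break-even calculation. The client wattages are our measured values, but the 300-watt server figure below is purely a hypothetical stand-in, not a number from our tests:

```python
import math

# Measured client draws, without monitors (from our tests)
pc_watts = 69
terminal_watts = 10

# Hypothetical terminal-server draw -- an assumed figure, not a measured one
assumed_server_watts = 300

# Each terminal saves this much over a PC; the server is a fixed overhead
savings_per_client = pc_watts - terminal_watts  # 59 watts

# Smallest client count at which terminals plus server beat an all-PC setup
break_even_clients = math.ceil(assumed_server_watts / savings_per_client)
print(break_even_clients)  # 6 clients under these assumed numbers
```

Past that point, every additional terminal is pure savings, which is why the advantage grows with the size of the network.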

Knowing Your Power Costs Can Help You Calculate TCO
If you don't get anything else out of this column, remember this point: the power component of Total Cost of Ownership (TCO) is easily measurable. TCO is a somewhat hackneyed concept because it can be difficult to measure and isn't always helpful, but this piece of it you can apply directly to the bottom line when you decide which kind of client to use: it's less expensive to power Windows terminals than PCs.
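Measuring that power component takes nothing more than a wattage reading, an electricity rate, and a duty cycle. Here's a minimal sketch using our measured client wattages; the rate and the 24x7 duty cycle are assumptions you'd replace with your own figures:

```python
ASSUMED_RATE_PER_KWH = 0.10  # assumed electricity rate in $/kWh -- use your local rate
HOURS_PER_YEAR = 24 * 365    # assumes devices run around the clock

def annual_power_cost(watts, rate=ASSUMED_RATE_PER_KWH, hours=HOURS_PER_YEAR):
    """Annual electricity cost, in dollars, for a device with a constant draw."""
    return watts / 1000 * hours * rate

# Our measured client draws, without monitors
pc_cost = annual_power_cost(69)        # ~$60 per PC per year
terminal_cost = annual_power_cost(10)  # ~$9 per terminal per year
print(round(pc_cost, 2), round(terminal_cost, 2))  # prints: 60.44 8.76
```

Multiply the difference by the number of seats and you have a hard dollar figure to drop into any TCO comparison.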

You might have good reasons to use PCs as thin clients; they let you run applications locally (most Windows terminals don't), and you can use the machines you already own rather than invest in new ones. But if you're teetering on the edge and don't have any compelling reason to stick with PCs, consider our findings about power usage. These numbers aren't politically correct fluff; there IS a serious difference in power use.