During the last 30 years, the balance of power in the enterprise has shifted. When I first began working as a programmer (mind you, this was when Led Zeppelin was still touring), the power belonged to the high priests of the mainframe. Users had terminals, and their applications were centrally controlled and maintained. We know what's happened since then: Individual users now have, in aggregate, a lot more computing power than the mainframe clergy ever dreamed of, and they aren't afraid to use it.

By its very nature, however, messaging still involves centralized servers, and the performance and integrity of those servers is obviously important to the administrators and users who depend on them. This theme sounds like a return to the old days, and in some ways it is. But today's users are much more sophisticated and savvy, and they have much higher expectations. This situation leads to conversations such as this:
User: Outlook is slow.
Admin: How slow is it?
User: Well, you know, it's slow. It used to be faster.
Admin: When did it start slowing down?
User: I don't know, but it definitely used to be faster.
Admin:

Monitoring the overall performance of your Exchange server is relatively easy. By watching half a dozen counters in Performance Monitor (such as the disk queue length, the number of remote procedure call--RPC--operations per unit time, and the number of page faults per second) or by using the synthetic alerting features in Microsoft Operations Manager (MOM) and other management tools, you can get a good view of how your servers are performing. The data you gather with these tools tells you only how the server is doing overall; it tells you nothing about particular users, and that's often what you most want to know. Aggregate data is still useful, though: with careful tweaking, you can simulate realistic user loads with a tool such as LoadSim or the Exchange Server Stress and Performance (ESP) tool, which often gives you insight into performance bottlenecks on your server.
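To make the idea concrete, here's a minimal sketch of the kind of threshold check a monitoring tool performs over those counters. The counter values and warning thresholds below are invented for illustration, not Microsoft guidance; on a real server the samples would come from Performance Monitor rather than a hard-coded dictionary.

```python
# Illustrative only: the sample values and thresholds are assumptions.
# Each entry maps a counter name to its most recent observed value.
samples = {
    "Avg. Disk Queue Length": 3.4,
    "RPC Operations/sec": 1250.0,
    "Page Faults/sec": 820.0,
}

# Hypothetical warning thresholds for a quick server-wide health check.
thresholds = {
    "Avg. Disk Queue Length": 2.0,
    "RPC Operations/sec": 5000.0,
    "Page Faults/sec": 1000.0,
}

def health_check(samples, thresholds):
    """Return the counters whose current value exceeds their threshold."""
    return [name for name, value in samples.items()
            if value > thresholds[name]]

print(health_check(samples, thresholds))
```

Note what this check can and can't tell you: with these sample numbers it flags only the disk queue, which says the server as a whole is under I/O pressure, but nothing about which user or application is causing it.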

You can certainly find ways to get some level of performance data for individual users; for example, you can use a network monitor or similar tools to watch the network traffic between the client and the server. Doing so isn't always feasible, however, especially if the user who reported a slowdown is outside your corporate network. You can also monitor the client directly to see whether other operations are slow, too--what's reported as an Outlook slowdown might well be caused by another application or by other problems that have nothing to do with messaging. (My favorite example of this was when a well-known application for managing MP3 players had a bug that forced all other application threads spawned after its launch to a lower priority, slowing down every other application on the machine.)
Because Exchange maintains an independent session for each user, I've always assumed that it must have some amount of performance data that could be exposed but isn't. Exchange can indeed gather per-user performance data, but until now we haven't had any supportable way of getting to it. With last week's release of the Exchange Server User Monitor (ExMon) tool, that situation has changed. ExMon, which is part of the most recent Microsoft Web release package of Exchange tools, collects and exposes performance and resource-consumption data for individual user sessions. Now you can answer questions such as how much CPU time is being used to handle individual user requests; in this context, the "user" could be any application that logs on to the Information Store, including virus scanners, backup programs, and similar tools.
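To see why per-session data is so much more useful than server-wide counters, here's a sketch of the kind of aggregation ExMon makes possible. The session records, user names, and CPU figures below are entirely hypothetical; they just illustrate rolling per-session resource consumption up to per-user totals, including service accounts like a virus scanner.

```python
from collections import defaultdict

# Hypothetical per-session records of the general kind ExMon exposes:
# each session is attributed to a "user" (which may be a service account
# for a virus scanner or backup program) plus the CPU time it consumed.
sessions = [
    {"user": "alice",       "cpu_ms": 420},
    {"user": "bob",         "cpu_ms": 1310},
    {"user": "svc-virscan", "cpu_ms": 9800},
    {"user": "alice",       "cpu_ms": 130},
]

def cpu_by_user(sessions):
    """Sum CPU time across all sessions belonging to each user."""
    totals = defaultdict(int)
    for s in sessions:
        totals[s["user"]] += s["cpu_ms"]
    # Sort heaviest consumers first--the order you'd want when triaging
    # a "the server is slow" report.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for user, ms in cpu_by_user(sessions):
    print(f"{user}: {ms} ms")
```

In this made-up data set, the server-wide counters would show heavy CPU use, but only the per-session view reveals that the top consumer is the virus-scanning service account rather than any human user.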
Next week I'll describe how ExMon works and what you can do with it. Until then, you can download the tool from the Exchange tools Web site at http://download.microsoft.com/download/3/5/b/35b64e21-44b6-4d30-b4d2-3a41f3635e7e/Exmon.msi . ExMon requires Exchange 2003 Service Pack 1 (SP1) or Exchange 2000 SP3. The tool includes documentation and a set of release notes that will tide you over until next week.