
I'm launching a Web site and trying to determine what kind of computer I need. Do guidelines or references exist to help me with this decision?

This question is common and difficult. No formulaic answer exists because a great deal depends on the specifics of your circumstances. How many simultaneous users are you expecting? What's your average page size? Are you using dynamic content? Do you need Secure Sockets Layer (SSL)? All these questions are important, and their answers will shape how big and fast your IIS server hardware must be. Rather than try to create a formula, I'll provide some general guidelines that can help.

Perhaps the most underappreciated aspect of IIS is its ability to deliver content quickly when running on modest servers. I continually run across sites that are delivering substantial volumes with smaller computers than you might imagine. For example, I recently encountered an IIS server delivering 1 million page views a month on a machine with a 500MHz Pentium II processor and 256MB of RAM. This machine connected to a separate Microsoft SQL Server machine that handled its database queries.

Bear in mind that even 1 million hits daily isn't a particularly heavy load when it's distributed evenly over 24 hours. (It works out to about 11.5 hits per second.) If you're delivering an average of 10KB per hit, you need a computer that can deliver 115KBps of content, which is no problem for a basic Pentium system. This example presumes static HTML; you need more processing power to deliver that much data through Active Server Pages (ASP). Also, remember that traffic is rarely distributed evenly over a 24-hour period; peak traffic on public Web servers occurs around noon. It's the old 80/20 rule at its finest—80 percent of your traffic will occur within 4 hours every day.
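To make the arithmetic concrete, here's a rough back-of-envelope sketch in VBScript. The figures are simply the example numbers above, not measurements from any real server, and the 4-hour peak window is the 80/20 assumption.

' Back-of-envelope throughput estimate (run with cscript).
Const HITS_PER_DAY = 1000000   ' example figure, not a measurement
Const KB_PER_HIT = 10          ' average content delivered per hit

avgHitsPerSec = HITS_PER_DAY / 86400                  ' about 11.5 hits per second
avgKBps = avgHitsPerSec * KB_PER_HIT                  ' about 115KBps on average

' 80/20 rule: 80 percent of the day's traffic in a 4-hour window.
peakHitsPerSec = (HITS_PER_DAY * 0.8) / (4 * 3600)    ' about 55.6 hits per second
peakKBps = peakHitsPerSec * KB_PER_HIT                ' about 556KBps at peak

WScript.Echo "Average: " & Round(avgKBps) & "KBps   Peak: " & Round(peakKBps) & "KBps"

Notice that the peak figure is roughly five times the average, which is why you should size for the busy window rather than for the 24-hour average.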

The point here is that most sites don't need quad-Xeon CPUs, 1GB of RAM, and Ultra SCSI RAID 10 arrays to deliver significant volumes of content. Each situation differs. Clearly, if you have complex applications to deliver to tens of thousands of users, you need more horsepower than a site serving straight HTML does.

By far, the most significant performance characteristic of an IIS server is the amount of RAM available, so be liberal in your estimates of RAM. If you oversize, the extra memory costs you little in overhead, and you can grow into it. In contrast, if you add CPUs you don't need, your system can spend more time managing the additional complexity of extra CPUs than benefiting from the additional computing capacity. Available RAM makes such a difference because the more RAM you have, the more files IIS can cache, and caching makes all the difference in delivery time.

To take performance further, your system will obviously benefit from fast hard disks and CPUs. Multiple-CPU machines are common for heavy-duty Web servers, and hardware- or software-load-balanced clusters of Web servers deliver the maximum performance you can achieve; such clusters are the current state of the art for high-volume e-commerce sites. You also need reliable network bandwidth to match your megaservers, or you've done little more than put a jet engine on a go-cart—a lot of power, but no vehicle.

I would be remiss if I didn't point out that IIS 5.0 is significantly faster than IIS 4.0, so a quick way to gain performance improvements without necessarily upgrading your hardware is to upgrade to IIS 5.0. (For information about upgrading, see my article "Upgrading to IIS 5" at http://www.iisanswers.com.)

The best you can do is stress-test your servers in as realistic an environment as possible. Such testing can turn up deficiencies of scale that aren't obvious under smaller loads. The biggest problems occur when your hardware and OS are fully tuned, only for you to discover that your Web application simply can't handle the loads you need it to bear. This problem can occur for several reasons, such as the use of COM components that aren't fully optimized for multithreading (as in Microsoft Visual Basic—VB—components) or designs that are unnecessarily complex and inefficient. When you encounter such bottlenecks, you can do little other than start recoding your software. For this reason, stress testing is essential. (See Related Reading for stress-testing resources.)

I'm running two Windows NT 4.0 servers: Server A is the PDC, and server B runs IIS and is a member of the domain that the PDC manages. Server A has a directory in which I keep ASP scripts. On that folder, NTFS permissions let a user on the PDC and a local user on server B access the files. When I try to use Microsoft Internet Explorer (IE) to access the ASP scripts, the local account authenticates correctly, but the PDC account fails and repeatedly asks for a logon. I've tried both NT Challenge/Response authentication and Basic authentication without success. Why am I encountering this problem?

Various authentication methods require various user rights. NT Challenge/Response authentication requires that users have the Access this computer from the network right on the Web server. Basic authentication requires that users have the Log on locally right on the Web server.

You might need to enter the username as domain\username instead of just username. You can set IIS up to log on to a specified domain or to all trusted domains. For more information, see the Microsoft article "How to Authenticate a User Against All Trusted Domains" (http://www.microsoft.com/technet/support/kb.asp?id=168908) and Chapter 8 of the Microsoft Internet Information Server 4.0 Resource Kit.
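If you go the specified-domain route for Basic authentication, one way to do it is to set the DefaultLogonDomain metabase property. Here's a minimal sketch that assumes the adsutil.vbs administration script that ships with IIS and the default Web site (instance 1); the domain name is a placeholder for your own:

REM Set the domain that Basic authentication assumes when users type a bare username.
cscript adsutil.vbs set w3svc/1/root/DefaultLogonDomain "MYDOMAIN"

Users can then log on with just a username, and IIS supplies the domain you specified.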

I want to use the ASP Response.Expires method to control how frequently a client updates my Web pages. In each page, I've inserted the code

<%
Response.Expires = 0
Response.AddHeader "pragma", "no-cache"
Response.CacheControl = "no-cache"
%>

which forces the client to refresh and reload the page. However, this method doesn't seem to work reliably. Sometimes, IE 5.0 sends the message Web Page has expired. Other times, IE refreshes the page as it should. How can I ensure that this method functions consistently?

Use Response.Expires = -1 instead of Response.Expires = 0. This change will fix the immediate problem, but you might need to take additional steps.
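Applied to the snippet from your question, the change looks like this:

<%
' -1 expires the page a minute in the past, which browsers honor more reliably than 0.
Response.Expires = -1
Response.AddHeader "pragma", "no-cache"
Response.CacheControl = "no-cache"
%>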

In case you're unfamiliar with this subject, the Response.Expires method lets you specify when the Web page containing the Expires value expires and needs to be refreshed (i.e., downloaded again from the Web server). The topic can be rather complex because a Web page can be cached in several places. First, by default IIS caches Internet Server API (ISAPI) applications, which means that IIS caches your ASP pages. Second, if you have a proxy or caching server, that server caches the pages that users request. Finally, the client browser caches pages. To make matters more complex, Netscape Navigator and IE behave differently with regard to caching.

Odd things can happen when you don't manage caching. Users can lose items they've ordered while shopping on a site, or deleted items can reappear. To further complicate matters, caching or proxy servers can interfere with dynamic content.

Caching has clear benefits—most notably, a server can deliver pages much more quickly from a cache. However, dynamic Web pages work against caching technologies because the pages can be different every time the server delivers them. Pages with stock quotes, rotating banner ads, and order forms are all examples of pages that need to be reloaded every time a user calls them.

The tools you have for controlling caching are limited. A good article on the subject is Phil Paxton's "Cache No More" (http://www.learnasp.com/learn/cachenomore.asp), which suggests adding the code in Listing 1 to the top of your ASP scripts. The Response.Expires method causes the page to expire in 60 seconds. Don't use 0 because although it's supposed to work, it doesn't; some developers set this value to a negative number, such as -1, which also works. The setting also informs IIS not to cache the page.

The Response.ExpiresAbsolute method sets the page to expire 24 hours ago. This step is redundant because you've already used the Response.Expires method, but you're trying to make sure the page expires, so use everything you've got. You must place the Response statements before the <HTML> tag; in general, make them the first lines of any ASP file that uses them. For more information, see the Microsoft articles "How to Use Pragma: No-cache with IIS and IE" (http://support.microsoft.com/support/kb/articles/q165/1/50.asp) and "Control Your Cache, Speed Up Your Site" (http://www.microsoft.com/technet/ie/technote/ie5cache.asp).
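I don't reproduce Listing 1 here, but a block along the lines the article describes looks roughly like the following sketch (my reconstruction, not Paxton's exact listing). Note that Response.Expires counts minutes, so a value of 1 works out to about 60 seconds:

<%
' Expire the page roughly 60 seconds after delivery (Expires is in minutes).
Response.Expires = 1
' Belt and suspenders: also set an absolute expiration 24 hours in the past.
Response.ExpiresAbsolute = Now() - 1
' Ask HTTP/1.0 proxies and browsers not to cache the page.
Response.AddHeader "pragma", "no-cache"
' Ask HTTP/1.1 caches not to cache the page.
Response.CacheControl = "no-cache"
%>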

When I try to create a virtual Web site and point it to a mapped drive, the process fails. IIS wants a Uniform Naming Convention (UNC)-only pathname. Can I avoid using a UNC pathname?

IIS generally requires a UNC pathname (e.g., \\servername\share) when accessing a remote store of files. In some cases, you can map a pathname to a drive letter and use that drive letter to access the remote system, but Microsoft discourages this practice for a couple of reasons. The chief reason is that Windows NT stores drive mappings on a per-user basis as part of the user profile under the HKEY_CURRENT_USER registry key. As a result, if you log on to the IIS server as a different user than the one who created the mapping, the mapping won't be available. This limitation is one reason virtual directories exist.

By using a virtual directory to map to the network location, you can control the mapping and the user account that IIS uses to access the remote location. This approach gives you a centralized point of administration, whereas a drive mapping that someone could change on the server could break your Web access. For more information, see the Microsoft article "Using Mapped Drives with IIS" (http://support.microsoft.com/support/kb/articles/q257/1/74.asp).
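As a sketch of what that looks like from the command line (the site number, virtual directory name, server, share, and account are all hypothetical), you can point an existing virtual directory at a UNC path and supply the connection account with the adsutil.vbs administration script:

REM Point the virtual directory at the remote share.
cscript adsutil.vbs set w3svc/1/root/content/Path "\\fileserver\webcontent"
REM Supply the account IIS uses to connect to the share.
cscript adsutil.vbs set w3svc/1/root/content/UNCUserName "DOMAIN\webaccess"
cscript adsutil.vbs set w3svc/1/root/content/UNCPassword "password"

Because the account and password live in the metabase rather than in someone's user profile, every administrator sees the same configuration.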

Related Reading
Stress testing is an important part of tuning the performance of your servers. Here are a few resources that can help you:

  • Microsoft Web Application Stress tool (http://webtool.rte.microsoft.com)

  • Building a Windows 2000 Test Lab (http://www.microsoft.com/technet/win2000/dguide/chapt-4.asp)

  • Using Transaction Cost Analysis for Site Capacity Planning (http://www.microsoft.com/siteserver/ssrk/docs/rk_tcaplan.doc)

  • Windows 2000 Server Deployment and Planning Guide, from the Microsoft Windows 2000 Resource Kit (http://www.microsoft.com/technet/win2000/dguide/default.asp)

  • Excerpts from Patrick Killelea, Web Performance Tuning: Speeding Up the Web (O'Reilly & Associates, 1998): http://patrick.net/index.html