Log Parser is a free command-line utility that harnesses the power of SQL querying and database mining to search through the log files that Windows Server 2003, Exchange Server, SQL Server, and ISA Server 2004 generate. If you're running Microsoft Internet Information Services (IIS) 6.0 on Windows 2003, you can use Log Parser to perform security auditing to monitor anomalies in the logs that might signal that an intruder is trying to compromise your Web server. Log Parser won't come straight out and tell you that an intruder has successfully attacked your server, but if you ask the right questions, it can provide you with some pretty conclusive evidence. I'll show you how to use Log Parser to generate statistics from IIS 6.0, check error status codes, locate unusual verbs in HTTP requests, and trace Web site activity.

Before we begin, you might want to read previously published Windows IT Security articles that will help you understand some of Log Parser's operating fundamentals. Although you don't need to read any of these articles to follow what I'm going to walk you through, you might find the background useful for a deeper understanding of Log Parser. In particular, see "LogParser," May 2004, InstantDoc ID 42174; "Access Denied: Using Log Parser to Audit Domain Logons," July 2004, InstantDoc ID 42812; "Targeting Failed Logons," September 2004, InstantDoc ID 43450; and "Filter for Security," October 2004, InstantDoc ID 43827. The ultimate resource for Log Parser is Gabriele Giuseppini and Mark Burnett, Microsoft Log Parser Toolkit (Syngress, 2005). Ready to go? Let's get started.

Generating Data
Before you can begin to analyze the IIS log files, you need to make sure you have the right data to analyze. To generate that data, configure the audit policy on your Web server and enable IIS logging. At a minimum, configure the policy to audit the following events for success and failure: logon, object access, and privilege use. Auditing logon events lets you see how accounts are being used to access information over the network via the Web server. Enabling auditing of object access on the appropriate folders of your Web site lets you build a record of access to the objects. Be careful about auditing object access: Overzealous auditing can degrade performance by overloading the server. Auditing privilege use can help you track down intruders who use privilege escalation to gain elevated access rights. I don't cover privilege escalation per se in this article, but you'll get an idea of how to use Log Parser to tease relevant privilege escalation details out of the audit log.

For IIS logging, select the W3C Extended log file format in IIS 6.0 and log as many fields as possible. Although Log Parser can use any of the IIS log formats, the W3C Extended log file format can record the greatest variety of information. Having the largest amount of information possible is useful when you need to look for unusual details as you investigate an auditing anomaly.

Configure logging by selecting the Enable Logging check box in the Web site properties. Use the drop-down menu to select the W3C Extended log file format, then click the Logging Properties button. On the Advanced tab, make sure that a check mark exists in each field's check box.
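If you prefer to verify this setting from a script, the IIS metabase stores the selected logging fields in the LogExtFileFlags property, which you can inspect with the adsutil.vbs administration script. This is a sketch, assuming the script's default location and the default Web site's identifier of 1; adjust both for your system:

cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs
  GET W3SVC/1/LogExtFileFlags

The command returns a list of flag names corresponding to the check boxes on the Advanced tab, so you can confirm at a glance which fields a site is logging.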

Log Parser Query Files and Output
To write Log Parser commands, I prefer to put the SQL queries inside a text file, then pass that file as an argument to the Log Parser executable. The advantage of this approach is that it lets me save complex queries so that I can reuse them. Writing a query in Notepad rather than directly at the command prompt also makes syntax corrections easier. To use a query file with Log Parser, use the following format, where query.sql contains your Log Parser SQL query:

logparser file:query.sql
  -o:DATAGRID -RTP:-1

I also prefer to output in a DataGrid, rather than directly to the command prompt. DataGrid formatting is helpful if you find command-line output confusing. The -RTP:-1 switch specifies that all output be displayed in the window, rather than only the first 10 rows of output, which is the default display limit. Figure 1 shows DataGrid output for a simple query.

Log Parser has many output options besides DataGrid. If the Microsoft Office Web Components are installed on your system, you can output Log Parser results to formats ranging from bar graphs to pie charts. In this article, I refer to DataGrid exclusively because I think it's the most accessible format for new Log Parser users.
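When you want a permanent record rather than an onscreen grid, you can also write results to a file. As a sketch, the CSV output format sends comma-separated results to the console, which you can redirect to a file; report.csv is a name I've chosen for illustration:

logparser file:query.sql
  -o:CSV > report.csv

You can then open the resulting file in Microsoft Excel or import it into a database for longer-term trend analysis.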

Important Log File Fields
You need to be familiar with several important W3C Extended log file format fields when performing a security audit of your IIS logs. Although I recommend that you configure the IIS logs to record all possible data, the fields that Table 1 shows are the best entry points into the logs. Logging all possible data, even though you're likely to use only a few fields, is a simple precaution: Any field can contain something that turns out to be extremely useful in determining what an intruder has done when compromising your IIS server. You can find a full list of W3C Extended log file format fields by examining the extended logging options on the Advanced tab of the Logging Properties dialog in the Web site properties.

Pulling Basic Statistics from IIS 6.0
You can use Log Parser to quickly generate basic statistics for IIS 6.0 Web sites. The following query lists the targeted site's most-requested files:

---PAGEHITS.SQL---
SELECT cs-uri-stem, COUNT(*)
FROM <77038354>
GROUP BY cs-uri-stem
ORDER BY COUNT(*) DESC  
---PAGEHITS.SQL---

where <77038354> represents the site's unique identifier. The default Web site on IIS 6.0 always has an identifier of <1>. Unlike earlier versions of IIS, IIS 6.0 doesn't number additional sites sequentially, so you need to do some exploring to find a site's number if you aren't using the default Web site. If you run multiple Web sites, you can locate this number either by looking in the Microsoft Management Console (MMC) IIS Manager snap-in or by running the Iisweb /query command from the command prompt. With its powerful query engine and varied output formats, Log Parser is an excellent tool for performing detailed analyses of FTP, WWW, and SMTP traffic on your IIS server.

Finding Anomalies
One way to start looking for anomalies in your event logs is to check error status codes. Certain status codes, particularly those between 400 and 424, can indicate that some type of attack has occurred and that evidence of it exists within your Web server's logs. The following query will provide you with a list of error codes by IP address:

---STATUSIPADDRESS.SQL---
SELECT c-ip, cs-uri-stem,
  sc-status, sc-substatus,
  COUNT(*)
FROM <77038354>
WHERE (sc-status BETWEEN 400
  AND 424)
GROUP BY c-ip, sc-status,
  sc-substatus, cs-uri-stem
ORDER BY COUNT(*) DESC  
---STATUSIPADDRESS.SQL---

This query's output will provide a list of recorded errors between 400 (Bad Request) and 424 (Failed Dependency). These errors signal that something hasn't gone right, and although they don't necessarily mean that your server has been hacked, they do mean that you need to investigate the issue further. Once you have an idea of the frequency of unique error status codes, you can begin planning the best way to use Log Parser to drill down into the logs for more information.
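For example, suppose the 404 (Not Found) errors dominate the output. A follow-up query such as the following, my own variation on the preceding script, shows which nonexistent resources are being requested most often, a pattern typical of scanners probing for known vulnerable file names:

---STATUS404.SQL---
SELECT cs-uri-stem, COUNT(*)
FROM <77038354>
WHERE sc-status = 404
GROUP BY cs-uri-stem
ORDER BY COUNT(*) DESC
---STATUS404.SQL---

Requests for administrative scripts or files that have never existed on your site are a strong hint that someone is probing rather than browsing.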

Identifying Unusual HTTP Requests
You can often find evidence of attacks on a Web site by looking for unusual verbs in HTTP requests. In general, most legitimate HTTP requests use either POST or GET. The following script will output any verb recorded in the W3C Extended log file format that isn't POST or GET, as well as the request retrieved from the Web server, the status code of the request, and the requesting client's IP address.

---UNUSUALVERBS.SQL---
SELECT c-ip, cs-method,
  cs-uri-stem, sc-status,
  sc-substatus, COUNT(*)
FROM <1>
WHERE (cs-method NOT IN
  ('POST';'GET'))
GROUP BY c-ip, cs-method,
  cs-uri-stem, sc-status,
  sc-substatus
ORDER BY COUNT(*) DESC  
---UNUSUALVERBS.SQL---

Unusual verbs can be innocuous, but they can also be an anomaly that requires further investigation. Multiple instances of the same verb from many IP addresses are more likely to be innocuous than are multiple instances of a verb from the same IP address. Multiple verbs from the same IP address that no other Web site visitors have used are a red flag that you shouldn't ignore.
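To make that distinction quickly, you can count how many distinct client IP addresses have used each unusual verb; a verb that only a single address has used deserves a closer look. This query is my own sketch, using the same default-site identifier as the preceding script:

---VERBSPREAD.SQL---
SELECT cs-method,
  COUNT(DISTINCT c-ip)
FROM <1>
WHERE (cs-method NOT IN
  ('POST';'GET'))
GROUP BY cs-method
ORDER BY COUNT(DISTINCT c-ip)
---VERBSPREAD.SQL---

Verbs with a count of 1 rise to the top of the output, which gives you a short list of candidates to investigate.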

Tracing Web Site Activity
If you detect an anomaly linked to a particular IP address, that IP address becomes a leverage point you can use to dig deeper into the logs. To do so, use a query similar to this one:

---TRACKIP.SQL---
SELECT cs-username,
  cs-uri-stem, sc-status,
  sc-substatus
FROM <1>
WHERE c-ip = 'w.x.y.z'
---TRACKIP.SQL---

Before running the script, substitute the IP address you are interested in for w.x.y.z. Also make certain that the Web site identifier is correct so that you aren't looking at the wrong logs. The results of this output can give you further clues about what a suspicious visitor was doing when visiting your Web server.
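To reconstruct a timeline of a suspicious visitor's session, you can extend the query to combine the date and time fields with Log Parser's TO_TIMESTAMP function and sort chronologically. This is a sketch; as before, substitute your own IP address and site identifier:

---TRACKIPTIME.SQL---
SELECT TO_TIMESTAMP(date, time),
  cs-method, cs-uri-stem,
  sc-status
FROM <1>
WHERE c-ip = 'w.x.y.z'
ORDER BY TO_TIMESTAMP(date, time)
---TRACKIPTIME.SQL---

Reading the requests in order often reveals the progression of an attack, from initial probing to the request that finally succeeded.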

Get to Work
I've only scratched the surface of Log Parser's ability to perform a security audit on IIS 6.0. If you've followed the examples and begun to put them to use on your Web server, you'll be able to generate your own queries to dig deeper into your logs, rooting out anomalies and putting a stop to suspicious activity.