Q: I can't connect to my SQL Server unless I first map a drive to the server. Is this problem unusual?
A: Chances are you're just having a problem with Windows NT security and authentication. Sometimes you need to log on and connect to an NT server before you can connect to the SQL Server. This procedure makes sense if you think of SQL Server as one of several resources available on the NT box.
Whether you need to log on to NT depends on which SQL Server Network Library (Net-Lib) you're using. The four most common Net-Libs are Named Pipes, Multiprotocol, TCP/IP Sockets, and NWLink IPX/SPX. The first two use interprocess communication (IPC) mechanisms that require you to log on to NT before you can access SQL Server; the latter two, which are socket-based, do not. We don't mean this explanation to be a primer on NT user account security, but in general, you can connect to a SQL Server via the Named Pipes or Multiprotocol Net-Libs if you can connect to a network share on the server.
Let's assume you're logged on to your local NT workstation, MyBox, using the SQLGuy local account. You don't have domain security in place, and you have SQL Server running on an NT server, DBBox. You can't connect to the SQL Server from MyBox unless you can first log on to the NT server DBBox. Logging on to the server requires having a SQLGuy account with matching passwords on both machines or enabling the Guest account on DBBox.
You can connect to the SQL Server if you first establish a connection to DBBox by logging on with a different set of credentials from the ones you're using on the local machine. You can easily do so by mapping a network drive to any share on DBBox and specifying a valid username, such as DBBox\Administrator, in the Connect As text box. Now you can connect to SQL Server because you have an open network connection to the remote server running your database. Just remember that Named Pipes and Multiprotocol both require the current user to first log on to NT.
TCP and IPX, the two sockets Net-Libs, don't have the same restriction; both protocols let users connect to SQL Server without first logging on to NT. Connecting to SQL Server over a socket-based connection is like connecting to Internet Information Server (IIS) running on a public NT Internet server. By default, you don't need an NT account to connect to SQL Server because of the way TCP/IP sockets are implemented.
To summarize, the Named Pipes and Multiprotocol Net-Libs both require you to log on to NT before connecting to SQL Server, but SQL Server doesn't care how you accomplish that logon. The TCP/IP and IPX/SPX socket-based Net-Libs don't require you to log on to NT first.
Be aware that, by default, SQL Server uses the Named Pipes protocol. To change the default for your client machine, use the Client Configuration utility that loads when you install the SQL Server tools and utilities on your machine.
Q: I'd like to monitor my transaction logs and take corrective action before they fill up, but I don't want to use Windows NT Performance Monitor (Perfmon). Is there any way to track log space used directly from SQL Server?
A: DBCC PERFLOG is an undocumented command that gives you the information you're looking for. The problem is that you must capture the output so that you can monitor it in a Transact-SQL (T-SQL) batch. SQL Server 6.5 extends the INSERT statement in an interesting way that lets you capture the output easily. First, look at the simple output from DBCC PERFLOG shown in Table 1.
Ahh, so near and yet so far. All the necessary information is right there, but SQL Server doesn't offer a direct way to extract the Log Size and Log Space Used information. The task would be simple if Database Consistency Checker (DBCC) provided a direct way to populate a table with information from PERFLOG (do you hear us, Microsoft?). Fortunately, however, we can populate the table by taking advantage of SQL Server 6.5's ability to insert the result set of a stored procedure into a table as illustrated in the following:
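Here's a sketch of the approach. The table, procedure, and column names are our own, and the column list is approximate; adapt them to the DBCC PERFLOG output you see in Table 1.

```sql
-- Wrapper procedure whose result set is the DBCC PERFLOG output.
CREATE PROCEDURE sp_RunPerfLog
AS
    DBCC PERFLOG
GO

-- Tracking table whose columns match what DBCC PERFLOG returns
-- (approximate; check your output from Table 1).
CREATE TABLE PerfLogHistory
(
    DatabaseName varchar(30),
    LogSize      float,   -- log size in MB
    LogSpaceUsed float,   -- percentage of the log in use
    Status       int
)
GO

-- SQL Server 6.5 lets INSERT capture a stored procedure's result set.
INSERT PerfLogHistory
    EXEC sp_RunPerfLog
```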
The process is simple. First, we create a wrapper procedure to run DBCC PERFLOG. Although we can't insert DBCC PERFLOG output directly into our tracking table, we can insert the result set from a stored procedure that just happens to call our elusive PERFLOG command.
You can add an extra column called PerfLogTime and populate it with the current system time to make the information useful for trend-tracking. Once the information is in the table, you can use regular SQL statements to retrieve the Log Size and Utilization information and take whatever action you want (e.g., set up an alert).
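Assuming a tracking table (here called PerfLogHistory, a name of our own choosing) that holds the DBCC PERFLOG columns plus a PerfLogTime datetime column, a threshold check might look like this:

```sql
-- Hypothetical table and column names.
-- Flag databases whose most recent sample shows the log
-- more than 80 percent full.
SELECT DatabaseName, LogSize, LogSpaceUsed, PerfLogTime
FROM   PerfLogHistory p
WHERE  LogSpaceUsed > 80
  AND  PerfLogTime = (SELECT MAX(PerfLogTime)
                      FROM   PerfLogHistory
                      WHERE  DatabaseName = p.DatabaseName)
```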
Inserting results from a stored procedure into a table is a powerful technique that you can use in many situations, including with other DBCC commands. Let us know if you come up with an interesting use you'd like to share with other readers.
Q: Do I need to worry about how SQL Server handles the Year 2000 (Y2K)?
A: SQL Server has a strange view of centuries. When you use a two-digit format for the year, SQL Server interprets dates from 01/01/50 to 12/31/99 as the twentieth century and dates from 01/01/00 to 12/31/49 as the twenty-first century, regardless of the current date. You probably already know that, but do your developers and commercial applications built on SQL Server?
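You can see the cutoff for yourself by converting two-digit dates on either side of the boundary:

```sql
-- Two-digit years from 50 through 99 map to 19xx;
-- years from 00 through 49 map to 20xx.
SELECT CONVERT(datetime, '12/31/49'),  -- interpreted as Dec 31, 2049
       CONVERT(datetime, '01/01/50')   -- interpreted as Jan 1, 1950
```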
To verify Y2K compliance, you can search for several patterns in Transact-SQL (T-SQL) code (which is often embedded in front-end applications) and SQL Server tables, and change the year format from two digits to four digits. The patterns are
* Variables or constants hard-coded with two-digit years. Remember to search on all possible SQL Server date formats, such as MM/DD/YY; M/D/YY; MM-DD-YY; Mon DD, YY; DD-Mon-YY.
* Tables with char and varchar columns that store dates in two-digit format. Unfortunately, changing the format will probably require changing the column length.
* Occurrences of convert(datatype, expression, format_style), where datatype is datetime, smalldatetime, char(n), or varchar(n) and expression is of type datetime or smalldatetime or is a datetime value in character form. Smalldatetime is a less precise datatype than datetime and has a storage size of 4 bytes: one small integer for the number of days after January 1, 1900, and one small integer for the number of minutes past midnight. Data values for smalldatetime range from January 1, 1900, through June 6, 2079, with accuracy to the minute. Replace smalldatetime with the 8-byte datetime, whose valid data values range from January 1, 1753, to December 31, 9999, with accuracy of 3.33 milliseconds (one three-hundredth of a second).
* bcp load files. Make sure they have four-digit years.
And of course, test, test, test.
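As an example of the convert() pattern, the two-digit style codes have four-digit counterparts (add 100 to the style number), so fixing a conversion can be as simple as changing the style:

```sql
DECLARE @d datetime
SELECT @d = GETDATE()

SELECT CONVERT(char(8),  @d, 1),   -- style 1:   mm/dd/yy (two-digit year)
       CONVERT(char(10), @d, 101)  -- style 101: mm/dd/yyyy (four-digit year)
```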
Q: In your July column, you explain how to move a database from one server to another, but you don't describe how to move the passwords. How can I protect my data?
A: Check out Brian's May article, "How Safe Are Your SQL Server Passwords?" and the tip in our August column about moving logons from one server to another. The problem is that SQL Enterprise Manager's transfer-database option doesn't let you move logons from one server to another because it transfers logon passwords as NULL. The inability to transfer passwords is not only annoying but also a potential security hole, because accounts on the new server won't have any password protection. To circumvent this problem, you can grab the encrypted passwords from one server and update the password column in syslogins on another server, as shown in Listing 1. Sp_TransferPasswords is a simple stored procedure (no error checking, use at your own risk) that copies encrypted passwords from one server to another. (Our tests are limited to similar servers--same version, sort order, character set, and hardware platform. The procedure pwdencrypt might use different algorithms on different server types, so you're on your own if your servers are dissimilar.)
To use sp_TransferPasswords, you must configure the two servers as remote servers for each other and then run the stored procedure on the target server where you want to update password information. The name of the source server containing the correct passwords is a mandatory parameter.
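In outline, such a procedure pulls name/password pairs from the source server into a temp table and then updates the local syslogins. The sketch below is our own rough approximation, not Listing 1 itself; sp_GetPasswords is a hypothetical helper you'd create on the source server, the column sizes are guesses, and there's no error checking:

```sql
-- Assumes the servers are already configured as remote servers for
-- each other, and that updating system tables is enabled:
--   sp_configure 'allow updates', 1
CREATE PROCEDURE sp_TransferPasswords
    @SourceServer varchar(30)
AS
    CREATE TABLE #pwd (name varchar(30), password varbinary(255))

    -- sp_GetPasswords is a hypothetical helper on the source server:
    --   SELECT name, password FROM master..syslogins
    EXEC ('INSERT #pwd EXEC ' + @SourceServer + '.master.dbo.sp_GetPasswords')

    -- Copy the encrypted passwords into the local syslogins.
    UPDATE master..syslogins
    SET    password = p.password
    FROM   master..syslogins s, #pwd p
    WHERE  s.name = p.name
```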
Q: Is there an undocumented RegistryRebuild option? If so, what does it do?
A: Knowledge Base article Q157805 (http://www.microsoft.com/kb/default.asp) documents BUG #15944: "The RegistryRebuild option of SQL Server Setup is no longer documented." The article shows how the option can save you a complete reinstallation if you have damaged, restored, or inadvertently modified the Windows NT Registry. You might need to rebuild the Registry for SQL Server if
* You recently restored the system Registry after a failure and the key for SQL Server does not exist
* You reinstalled NT and SQL Server already exists
* You restored SQL Server from a backup after you reinstalled NT
* You receive NT error 1060 when you try to start SQL Server
Setup won't go through an entire install, but will rebuild the necessary Registry keys if you type
Setup /t RegistryRebuild = On
Make sure you type the setup command exactly as shown. Otherwise, the regular setup program might run instead and overwrite your existing master database. Be aware that this example assumes you've installed SQL Server at c:\mssql. Otherwise, you'll need to change the path.
Q: I'm trying to integrate Visual SourceSafe and SQL. I have numerous Web servers throughout the country that use Cold Fusion to access SQL tables, stored procedures, triggers, and so on. Each Web server contains the same SQL objects and HTML pages, but they all contain different (site-specific) data within the SQL tables. I have defined all the HTML pages to SourceSafe, and I can easily copy updates to the various servers with a Get Current command.
What's the best way to get updated SQL objects to the various servers? I don't think I can use SQL replication because the tables contain different data. Also, the servers are connected over a WAN, and I want to be able to control when updates occur. Currently, I use object transfers within SQL to get the objects to the various servers, but I have no history of changes (one of the beauties of SourceSafe), and I have no way to easily compare the two servers to ensure that they're identical.
One method I used involved generating SQL scripts and storing those within SourceSafe, updating the scripts on the target servers, and then regenerating the various objects on the target servers from the updated scripts. This method seems to work, but my developers complain about the inconvenience. Do you have any better ideas?
A: If there's a better way, we don't know about it. Some objects, such as triggers and stored procedures, are fairly easy to deal with because you can store the scripts in a version control system. You can also store your table definitions. The real problem is how to regenerate downstream tables without losing your data.
As you know, your solution is tedious and too manual. Has anyone discovered a better solution? Let us know, because this process is a big issue that many people struggle with.
Cool Utilities in the BackOffice Resource Kit
We've recommended Microsoft BackOffice Resource Kit, Part 2 (http://mspress.microsoft.com/mspress/Books/Des/1264.htm) a couple of times, but we haven't gone into detail. This resource kit includes the Microsoft SQL Server Resource Guide, which lists the eight Microsoft SQL Server Open Database Connectivity (ODBC) drivers, and a CD-ROM packed with the following cool utilities:
Filltabl. Filltabl is a DB-Library-based Win32 console utility that you can use to generate a large amount of data for a table. This useful utility queries the system catalog tables to determine the structure of the target table, then generates pseudo-random data and loads it via bcp. One caveat: the utility might generate duplicate data; in other words, you might not be able to load data into a table with a unique index. That problem is serious, but Microsoft provides the source code so that you can enhance the utility to fit your needs. Naturally, if you do a good job fixing the problem, let us know and we'll post your code.
Sp_help_set_options. SQL Server 6.5 introduced @@options, a new global variable that lets you determine which Set options are in effect for a given connection. Unfortunately, @@options is a single integer that encodes the current Set options as a bit mask, which is ugly to look at. Sp_help_set_options unscrambles the bit mask and explains in plain English which options are set.
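To give the flavor of what the procedure does, here's how you'd test a single bit by hand. The bit assignments vary by release; the value 2 for IMPLICIT_TRANSACTIONS is our assumption, so check your documentation:

```sql
-- AND the bit mask against the bit you're interested in.
-- (2 = IMPLICIT_TRANSACTIONS is an assumed bit position.)
SELECT 'IMPLICIT_TRANSACTIONS' =
       CASE WHEN @@options & 2 <> 0 THEN 'ON' ELSE 'OFF' END
```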
Sp_help_sourcetext. This procedure deals with the formatting problems that occur if you try to use sp_helptext to print out the text associated with a stored procedure. Sp_help_sourcetext is the new, improved version of sp_helptext. It lets you print the underlying object definition of views, triggers, procedures, rules, and defaults in an easy-to-read format--similar to what you see when you view an object definition in SQL Enterprise Manager.
Sp_drop_fkeys. This nifty procedure drops all foreign key constraints for a given table. This utility can be a big time-saver in a development environment if you're regularly dropping and re-creating entire databases, because as you well know, you can't drop a table that has a referencing foreign key.
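For example, you can't drop a referenced table until the referencing constraint is gone (the table and constraint names here are hypothetical):

```sql
-- Customers can't be dropped while Orders still references it,
-- so the foreign key has to go first.
ALTER TABLE Orders
    DROP CONSTRAINT FK_Orders_Customers
GO
DROP TABLE Customers
```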
SQL Load Simulator. This utility is one of the best in the BackOffice Resource Kit. It's a 32-bit C program that uses up to 64 concurrent threads to run a Transact-SQL (T-SQL) file, so you can simulate up to 64 ODBC clients from one workstation. This simulation is a great way to build simple drivers for stress-testing your application. Build a T-SQL file with SQLTrace; then stress the system with LoadSim.
SQL Server Object Comparison. This tool does a passable job of comparing databases or individual objects to see whether their structures match. It's not as powerful as some utilities from third parties, but, hey, it's free!
SQL Trend Analyzer. SQL Trend Analyzer is a series of scripts and Excel workbooks that put a nice GUI wrapper around SQLTrace results. SQLTrace is an incredibly valuable utility that ships with SQL Server; it lets you capture the T-SQL stream your clients are passing to SQL Server. The trend analyzer lets you view the output with some predefined Excel charts and reports that can make finding performance bottlenecks easier for the novice performance tuner.