[Editor's Note: Share your NT discoveries, comments, problems, solutions, and experiences with products and reach out to other Windows NT Magazine readers (including Microsoft). Email your contributions (400 words or less) to email@example.com. Please include your phone number. We will edit submissions for style, grammar, and length. If we print your letter, you'll get $100.]
Managing Shares with ShareUI
Instead of using Server Manager or File Manager (winfile.exe) from my workstation to manage network shares on Windows NT file servers, I use the Microsoft Windows NT Server 4.0 Resource Kit's ShareUI utility. You can use this utility only if you're an administrator on the server or a member of a group in the server's Administrators local group. The utility consists of shareui.inf and shareui.dll. Installing the utility is quick and easy: You simply right-click shareui.inf and select Install. The utility then creates a Shared Directories folder in the My Computer folder that lists all the local shares.
To manage remote shares, select Run from the Start menu, enter

\\servername

(where servername is the name of the remote server) in the Open box, and click OK or press Enter. A window opens that shows nonhidden shares, the Printers folder, and the Shared Directories folder. Double-click the Shared Directories folder to open a window that shows all shared directories, including hidden shares. From this window, double-click a share to view its properties. You can then change the share name or path. To manage the share's permissions, click Permissions. To create a new share, right-click in the window and select New, Share.
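As an aside, you can also create and remove shares from the command line with NT's built-in Net Share command; the share name and path below are examples, not part of the original tip:

```
rem Share C:\Docs as docs$ (the trailing $ hides the share
rem from casual browsing in Network Neighborhood).
net share docs$=C:\Docs /remark:"Document share"

rem Remove the share when you no longer need it.
net share docs$ /delete
```

This approach is handy in batch files, although you still need the GUI (or the Resource Kit's rmtshare.exe) to set share permissions.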
Don't Schedule Lunch on a Server
I'm a senior systems administrator for a company that builds custom document-management and imaging systems. I maintain and monitor a large network that includes two Digital Equipment AlphaServer 4100s for document imaging and storage. Each system has dual 533MHz processors and 2GB of memory; combined, the systems provide 1TB of data storage. Both systems run Windows NT 4.0 with Service Pack 3 (SP3). The PDC runs two instances of Oracle8 and Computer Associates' ARCserve 6.5.
On January 25, I checked the systems and backups as usual and found no problems. I verified that my Oracle backups had run on the PDC over the weekend. I then opened the Date/Time applet in Control Panel to schedule a lunch meeting with a business associate for the following month. I selected February from the drop-down menu. I kept the Date/Time applet open for about 5 minutes while we agreed on a date. During our discussion, I heard the DLT mini-libraries loading and unloading tapes, but I didn't give this occurrence a second thought. I finished my conversation and clicked Cancel to close the Date/Time applet.
Soon after, my users called to tell me they couldn't connect to Oracle. I tried to open Oracle via SQL*Plus, to no avail. Upon further investigation, I found that both instances of Oracle were down.
My jukeboxes were still pounding away on tapes. I didn't have any backups scheduled, so I checked ARCserve to see which backups were running. To my surprise, all the backups were running, including the Oracle backup job, which shuts down Oracle before it starts. I wasn't sure why the backups were running, but I knew Oracle was causing the problem. I stopped the Oracle backup and tried to restart Oracle. Neither of my Oracle instances would start.
Then it hit me: Perhaps when I had the Date/Time applet open, the applet automatically changed the system date and time. I opened the applet and selected February from the drop-down menu. I left the applet open, without clicking OK, Cancel, or Apply. I then opened a command prompt and typed

date /t
Voila! The system date read February 25. When I had the Date/Time applet open earlier, the system thought the date was February 25. ARCserve noticed this date change and thought the system hadn't performed a backup in a month. Thus, ARCserve shut down Oracle and proceeded with a backup, registering modification dates of February 25 on the database files. At first I thought this bug might be Alpha-specific. However, I tested the x86 NT 4.0 SP3 workstations that I support and found the same problem.
I had to do a cold restore to fix the problem and get my databases back online. I spent approximately 6 hours and lost about 3 hours' worth of user data. The moral of this story is that you shouldn't view the Date/Time applet's calendar on a mission-critical server to schedule a lunch date. I suggest that you use a paper calendar instead.
Renaming a Domain
In the January 1999 Reader to Reader, Haluk Yildirim discussed renaming a domain. He explained how to move a PDC from one domain to another by renaming the domain at the domain controllers (i.e., PDCs and BDCs). When I attempted this method, the installation process prompted me to specify the role the server would play (i.e., PDC, BDC, or member server). You must specify the first server as a PDC for the installation process to create a SID for the domain. Second and subsequent domain controllers in the same domain must obtain their SIDs from that domain's PDC. Thus, all of a domain's controllers share one SID. Renaming a domain doesn't change the SID; only reinstallation changes the SID. Therefore, I find it unlikely that you can move domain controllers to other domains simply by changing the domain name. I'm wondering whether other readers have attempted this procedure and whether it worked.
Update Your LMHOSTS and HOSTS Files Remotely
As an alternative to replication, you can use the Microsoft Windows NT Server 4.0 Resource Kit's robocopy.exe utility to update your LMHOSTS or HOSTS files remotely. First, create a local folder (e.g., Hosts) and share it as updates$. Next, copy your LMHOSTS and HOSTS files into the folder. Make your modifications. Then, create a text file (e.g., enterprise.txt) that includes all your servers' names. Finally, run the batch file in Listing 1.
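Listing 1 isn't reproduced here, but a minimal sketch of such a batch file might look like the following. The C:\Hosts folder and enterprise.txt server list come from the steps above; the destination path assumes each server's %systemroot% is reachable through the default admin$ share:

```
@echo off
rem For each server named (one per line) in enterprise.txt,
rem push the master LMHOSTS and HOSTS files to the server's
rem %systemroot%\system32\drivers\etc directory via admin$.
for /F %%S in (C:\Hosts\enterprise.txt) do robocopy C:\Hosts \\%%S\admin$\system32\drivers\etc lmhosts hosts
```

Because robocopy copies only files whose timestamps or sizes have changed, you can run the batch file as often as you like without generating needless network traffic.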
Shorten Your Logon Time
When you log on, Windows NT automatically starts all the programs in your Startup folder. (To see which programs this folder includes on your system, select Programs, Startup from the Start menu.) Unfortunately, NT launches these programs concurrently, without waiting for one to finish before launching the next. If you have several programs in your Startup folder, you'll experience a lot of disk thrashing at logon. To decrease your logon time, replace the .lnk files in your Startup folder with the batch file in Listing 2. The Sleep command is a Microsoft Windows NT Server 4.0 Resource Kit utility that tells the system to wait for a specified number of seconds before launching an application. NT will then launch your programs sequentially, and you'll experience less disk activity and a quicker logon sequence than before.
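Listing 2 isn't shown here, but a batch file along these lines would do the job; the program paths and delays are illustrative, not from the original listing:

```
@echo off
rem Launch each startup program in turn, waiting a few seconds
rem between launches so the programs don't compete for the disk.
rem Start returns immediately, so the Sleep calls pace the sequence.
start "" "C:\Program Files\Microsoft Office\Office\outlook.exe"
sleep 15
start "" "C:\winnt\system32\findfast.exe"
sleep 15
start "" "C:\winnt\system32\notepad.exe"
```

Replace the .lnk files in your Startup folder with a single shortcut to this batch file, and adjust the Sleep intervals to match how long each program takes to settle down.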
Excluding Directories in Roaming Profiles
My company uses roaming profiles. To save space, we use system policies to remove profiles from the local machine when users log off. We wanted to reduce logon and logoff time and thus didn't want the Temporary Internet Files folder to copy to the server, but we couldn't figure out how to prevent this action.
The Microsoft article "How to Prevent Certain Folders from Uploading to Central Profile" (http://support.microsoft.com/support/kb/articles/q188/6/92.asp) explains that the HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\Winlogon Registry entry sets the policy for copying the Temporary Internet Files folder. This Registry entry includes a REG_SZ value of ExcludeProfileDirs that lists the folders to exclude. However, the Registry entry didn't exist on our local machines.
I searched through the winnt.adm policy file and found that the line in the policy file that controls this Registry entry's location is HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\System. Although this Registry entry existed on the workstations, and the entry included the ExcludeProfileDirs parameter that listed the Temporary Internet Files directory, the directory continued copying to the logon server.
I made a copy of the winnt.adm file, and I modified the file to reflect the HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\Winlogon Registry entry. Then, I used the policy editor to prevent the Temporary Internet Files folder from copying with the profile. After the Netlogon folder replicated, I logged on to verify that the Temporary Internet Files folder was present on the workstation but not on the server. When I logged off again, the folder still didn't copy to the server.
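For reference, the value that the modified winnt.adm template writes can also be applied directly with a .reg file. The folder list below is only an example; entries are relative to the profile root and separated by semicolons:

```
REGEDIT4

[HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\Winlogon]
"ExcludeProfileDirs"="Temporary Internet Files;Temp"
```

Applying the value through a policy template, as described above, is still preferable because the policy reapplies at every logon.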
This fix works with any directory off the user profile root. The policy is available in Service Pack 4 (SP4), but I couldn't find it in the SP3 winnt.adm file.
Securing Public Workstations
I often think the Windows NT 4.0 community of developers has forgotten those of us who manage public machines and machines that untrusted users have access to. The problems don't lie specifically with the OS, because a knowledgeable NT administrator can, with a certain amount of granularity, adapt the OS to users' needs. For example, an administrator can place ACLs on every file, directory, and Registry entry.
The problem lies at the application level: Developers of NT products often seem unaware that nonadministrative users run their programs. Thus, administrators who manage public workstations must leave their machines less secure than they'd like so that poorly written programs can run properly. Developers need to think carefully about the privileges their programs require, organize the information their programs store on disk and in the Registry, and make their programs fail gracefully when they encounter permissions problems.
UNIX has supported multiple users for a long time, and the UNIX community doesn't have the same problems that NT's development community has. Perhaps NT developers need to look at UNIX applications for tips. For example, if the UNIX text editor Editing MACroS (Emacs) required write access to the /usr/bin directory, most UNIX administrators would laugh and not install the program. However, Microsoft Office 97 requires access to important Windows directories such as C:\winnt and C:\winnt\system32, access that administrators would rather not grant but must grant for Office 97 to run. (For more information about this requirement, see the Microsoft article "OFF97: Security Requirements When Using NTFS Partitions" at http://support.microsoft.com/support/kb/articles/q169/3/87.asp.)
To work around some of the problems involved in securing public machines, administrators can use the Microsoft Windows NT Server 4.0 Resource Kit Installation Monitor suite of programs to trace a program's execution and determine which files the program is failing to open (perhaps because of permissions-related problems). Although Installation Monitor is intended for installation programs (e.g., setup.exe), clever administrators can apply the utility to other types of programs. I've found the ability to trace programs invaluable in tracking down permissions-related problems on public machines.
Thin Is In
The July 1998 Lab Guys column, "Thin Client or Fat Head?", prompted me to submit my 2 cents about thin clients. I work for MCI WorldCom and have been evaluating a thin-client solution for the company's telemarketers, who currently use Windows NT workstations with severely limited profiles and policies. The telemarketers don't do much CPU-intensive work, but they need modern machines to support the OS. In addition, the company typically leases machines and thus has a continuous influx of new technology to support. On the back end, the company has a minimum of three servers per domain (a PDC, a BDC, and an applications server) and uses Microsoft Systems Management Server (SMS) for remote management.
I believe thin clients are the solution for this type of environment, but I've been picky about the architecture I recommend. First, I'm recommending that the company purchase dedicated Windows terminals (e.g., NCD and Wyse machines) to use as thin clients rather than lease PCs. Dedicated terminals provide several benefits, including built-in remote management via the Windows CE kernel and special server software. These terminals let you easily upgrade the OS with simple utilities that are more reliable than SMS is. Thin-client machines are also relatively inexpensive and will be viable longer than PCs are (4 years as opposed to 2). Second, I'm attempting to use a pure RDP solution for cost-effectiveness and ease of use. Third, I'm considering Cubix load-balancing software to add server failover protection and to balance the load across multiple servers. On the back end, I'm recommending the Compaq ProLiant 6500 server with quad Xeon processors. This machine can handle more than 200 users with the company's typical software load. Thus, two or three servers are sufficient for the telemarketing center's user load. Fewer servers result in decreased costs.
The most important benefits that thin clients provide are ease of maintenance and ease of support. Thin clients are simple to maintain: If a terminal goes bad, you can replace it cheaply and easily, and because the terminal holds no local software, you lose no time restoring data. If you keep extra terminals on hand, you can reduce downtime from hours to mere minutes. Thin clients are also simple to support. Today, for example, I constantly have to update the unattended installation for the company's 10,000 remote PCs and configure SMS to upgrade software packages, and the local operations staff must install the packages that SMS pushes out to local PCs, a task that requires numerous hours of off-peak work. Thin clients would sharply reduce this workload. In addition, thin clients decrease the number of highly technical onsite support staff we need. Currently, a knowledgeable technical staff performs routine server maintenance and helps local users solve software and hardware problems. By using thin clients, we could employ fewer technical people on site.
If I had my way, I'd put thin clients on the desktops, and I'd make most applications Web-based. This type of arrangement would make performance better on thin clients than on fat PCs, and it would simplify maintenance and architecture development.