Back up your network data in realtime

Protecting your data has become more challenging as a result of two trends: the drastic increase in the amount of data that you need to protect and the 24 x 7 schedule of most shops, which allows little or no time for offline backups. To address this situation, backup software vendors have been scrambling to develop new strategies and solutions that leverage technological advances such as Storage Area Networks (SANs) and new tape technology. LiveVault (formerly Network Integrity) offers a unique solution, LiveVault 2.4, that lets administrators who have little or no window of offline time back up their network's data.

LiveVault markets its solution as a realtime backup and storage system for Windows 2000 and Windows NT servers. LiveVault claims to offer around-the-clock backup functionality with little impact on your available network bandwidth. In addition, LiveVault claims that its software's ability to record changes in data as they occur lets you restore the most recent versions of files, whereas traditional backup software lets you restore only the version that was saved at the last scheduled backup. LiveVault also includes a configurable, multi-tier archiving capability that lets you maintain multiple versions of files and store them on nearline devices or offsite tape sets for disaster recovery.

LiveVault's realtime-backup ability relies on the Microsoft file-system filter API, which is the same kernel-level file-system filter that several antivirus software applications use to monitor activity at the file-system level. As a result, LiveVault can have compatibility problems with some open-file managers, replication agents, quota-management utilities, and antivirus software. LiveVault provides compatibility information in the software's documentation and on the company's Web site.

LiveVault installs its Agent service, which uses the file-system filter, on each client. This service records and journals a file's bit-level changes as they occur, then sends the journal to the server, which reconciles the changes with the most recently backed-up version of the file. When a user changes a large amount of data in a file, LiveVault calculates the most efficient way to transfer the data to the server and might opt to send the entire file rather than the journaled changes. Similarly, if a user adds a new file to the client, LiveVault transfers the entire file to the server.
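The journal-versus-full-file decision comes down to which option moves fewer bytes over the wire. The sketch below illustrates that trade-off; the function name, block size, and per-entry overhead are invented for illustration and are not LiveVault's actual algorithm or parameters.

```python
# Illustrative sketch only: choose between shipping journaled block
# changes and shipping the whole file, whichever costs fewer bytes.
def plan_transfer(file_size, changed_blocks, block_size=4096, entry_overhead=64):
    """Return 'journal' or 'full-file' for an existing file.

    file_size      -- size of the file in bytes
    changed_blocks -- number of blocks the journal recorded as modified
    block_size     -- assumed journaling granularity (hypothetical)
    entry_overhead -- assumed per-entry journal metadata cost (hypothetical)
    """
    journal_bytes = changed_blocks * (block_size + entry_overhead)
    return "journal" if journal_bytes < file_size else "full-file"
```

A brand-new file has no prior server-side version to patch, so it always goes over whole, which matches the behavior the review describes.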

The LiveVault Backup Service, which runs on the server, reconciles and catalogs changes in client-side data and maintains versions of files according to user-defined retention schedules. This service relies on an automated three-tier storage system that reconciles changes to active files that are loaded in a disk cache. The server component then commits changed files to onsite tapes that LiveVault stores within a tape library or offsite for archiving and disaster recovery. The service also performs tasks such as tape-library management, backup policy maintenance, job logging and reporting, alert notification, and catalog maintenance.

To test LiveVault, I dedicated a Win2K Server system to host the LiveVault Backup Service and connected the server to a small network of six client machines running either Win2K or NT. The server was a Dell PowerEdge 4400 with dual 800MHz Xeon processors, 2GB of RAM, and a SCSI RAID controller with 10 Quantum 9.1GB 10,000rpm Ultra160 hard disks. I attached to this server an Advanced Digital Information Corporation (ADIC) FastStor DLT7000 tape library connected through an Adaptec AHA2940 SE SCSI host adapter.

For data storage, three of the six client machines each used a SCSI adapter with one attached Seagate 9.1GB 10,000rpm Cheetah hard disk. Two of the six clients each used a 10GB Ultra ATA IDE hard disk, and the remaining client used four 10,000rpm Cheetah hard disks and acted as a Microsoft Exchange Server 5.5 system hosting a little more than 24GB in its information stores. At its center, the network used a 16-port 100Mbps switch, and each client connected at 100Mbps full duplex.

Per LiveVault's recommendation, I visited the company's Web site to find disk space requirements for the LiveVault Backup Service's disk cache. The site provides a disk-space calculator that asks for the total amount of data that you'll be backing up, the overall size of databases, and the size of the largest single database file. To be safe, I overestimated the total amount of data I'd be backing up, and the calculator returned a figure of 45GB. Thus armed, I configured the Dell server with a seven-disk 50GB RAID 5 volume for the disk cache, set aside one SCSI disk for the OS page file, and used the remaining two disks as a mirrored system drive. I then installed Win2K Server.

Before installing LiveVault on the server, I needed to configure the tape library. Although LiveVault is compatible with a long list of tape libraries, the software relies on native Win2K or NT vendor-supplied drivers. The LiveVault Backup Service doesn't offer any support for advanced library features (e.g., barcode readers). For LiveVault to recognize my server's ADIC FastStor tape library, I had to set the library to emulate an ADIC Scalar 448. In addition, I needed to remove all tapes from the library before I installed LiveVault so that the software's backup service could perform an initial inventory. After I completed these tasks, I began the LiveVault installation process.

LiveVault comes on one CD-ROM that contains the base code for the LiveVault Backup Service, Replica Service, Agent service, and Enterprise Monitor, a Web-based tool for monitoring multiple LiveVault servers. For each component that you've purchased, the company sends you a unique license code, which you enter during installation. LiveVault doesn't include tools for deploying LiveVault Agent. The company does provide Microsoft Systems Management Server (SMS) package definition files and tips on its Web site for scripting remote installations.

The LiveVault Backup Service requires a 450MHz Pentium II processor or better, 128MB or more of RAM, and ample disk storage for the disk cache. During testing, I discovered that LiveVault is hungry for both RAM and CPU cycles. I suggest a minimum of 256MB of RAM on the server on which you run LiveVault. In addition, multiple high-speed disks are a necessity in all but the smallest environments. LiveVault's technical support staff explained that one LiveVault server can support as much as 250GB of protected data from 64 Agent systems.

The installation process was fairly simple. I chose the custom installation option and installed the LiveVault Backup Service and the Management Console. Next, I chose where to put LiveVault's files. The LiveVault Backup Service generates a high disk-I/O load; as a result, LiveVault recommends that you keep the disk cache on separate disks from its journals and the proprietary database that LiveVault uses for its catalog. During setup, choose wisely where to put LiveVault's files; moving files later requires you to modify the system's registry. I put the disk cache on the 50GB RAID 5 partition, and I put the LiveVault program files, database, and journals on the mirrored system drive.

At the end of the setup process, LiveVault runs a Health Check utility that checks the system configuration for optimum performance. This utility recommended that I change the server's network settings to optimize throughput for network applications, which I did. Health Check also reviews several other setup items, including whether your server has a tape library. LiveVault saves the utility's comprehensive results to a log file in the LiveVault program files directory.

After installation, I checked for new services and found the LiveVault Backup Service and Rogue Wave Software's NobleNet RPC TCP/IP portmapper. LiveVault uses only TCP/IP and requires a portmapper to allocate dedicated sockets for communication with Agent systems.

Installing LiveVault's Agent service on the other machines in my network was nearly identical to installing the LiveVault Backup Service on the server. One difference is that the Agent service doesn't require a disk cache. By default, setup places the database and journals on the drive that has the most available space.

I chose to install the Management Console on an Agent system to test the console's ability to remotely administer the LiveVault server. Because LiveVault operates through a file-system filter API, the software doesn't require a special Agent system for database servers. Thus, installing the Agent service on my Exchange Server system was identical to installing the service on the other clients in my network. However, LiveVault doesn't use Exchange APIs for information store backup, so the software doesn't truncate Exchange transaction logs. This limitation means that you must use circular logging to prevent transaction logs from overrunning the system's hard disk.

To test LiveVault's realtime backup capabilities, I needed to emulate a dynamic environment in which data continually changes and to evaluate how LiveVault performed at saving multiple versions of dynamic files over a period of time. To create this environment, I used a dedicated file server to run a script file that copied 15 groups of files (about 10MB each) to each Agent system at 20-minute intervals. At each interval, all the files were new, except for an updated version of a Microsoft Word document, which overwrote the previous version of the document to simulate changes users might typically make in a file over the course of a day. To add some randomness, I manually modified several specific Word documents throughout the test cycle, noting the changes I made to the documents.

To test how LiveVault backed up dynamic open files, I configured Performance Monitor to log data on each client throughout the 5-hour test cycle. On the Exchange Server system, I used the MailStorm utility to send mail to distribution lists (DLs) throughout the test cycle. With these test methods in place, I could restore an old version of a file from some point during the test cycle. I could then verify that the restored file reflected the changes that had been made up until that point in the test cycle or compare the restored file against changes in the current version. The results would prove whether LiveVault was performing as advertised.

At the server, I launched the LiveVault Management Console, which lists the Agent systems in the left pane and the Backup Service logs in the right pane, as Figure 1 shows. Using the console to configure my tests was intuitive, and the online Help was excellent. However, I experienced some problems initially. LiveVault uses a remote procedure call (RPC) process to discover Agent systems on the network. This process was frequently slow and didn't detect all the Agent systems until I prompted several refreshes. With the process this slow on my small single-domain network, I suspect the process might have similar or greater problems in larger networks. To work around this sluggish behavior, I manually entered into the console all the Agent systems on my network. If you use this solution, you need to shut off RPC discovery to avoid duplicate Agent entries.

To back up the Agent systems in my network, I had to create a policy for each machine. A wizard steps you through the policy creation process. Agent policies determine many aspects of LiveVault's behavior. Within a policy, you can configure characteristics such as file filters, bandwidth usage, registry and system-state backup intervals, database consistency detection, and the minimum frequency at which LiveVault saves open files to tape. (Figure 2 shows the bandwidth-usage options.)

You configure global settings, which affect how all policies function, at the server level. LiveVault's global settings include tape recycling and retention schedules and global file filters, which are useful for preventing backup of useless data (e.g., temporary directories). I used the default settings, which let each Agent consume all available bandwidth, for most of the policies. When the policy configurations were complete, the backup jobs launched immediately.
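Per-Agent bandwidth limits of this kind are commonly implemented as a token bucket: the sender may transmit only as many bytes as it has tokens, and tokens refill at the configured rate. The class below is a generic sketch of that idea, not LiveVault's mechanism.

```python
# Generic token-bucket rate limiter (illustrative; not LiveVault's code).
import time

class TokenBucket:
    def __init__(self, rate_bytes_per_sec, capacity):
        self.rate = rate_bytes_per_sec      # refill rate
        self.capacity = capacity            # maximum burst size in bytes
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def consume(self, nbytes):
        """Refill based on elapsed time, then try to spend nbytes of tokens.

        Returns True if the send may proceed now; False means the caller
        should wait and retry, which is what throttles the transfer rate.
        """
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False
```

With a limiter like this in the Agent's send path, leaving the limit at "unlimited" (a huge rate) reproduces the default behavior described above.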

LiveVault's first task was to synchronize the files on each client. This process uses the file-system filter to perform a block-level copy of the data you're protecting and send the data to the server. The Agent system captures and journals all changes to the data that occur during this process. After the copy process is complete, LiveVault uses the journal to update the copied data to a current state. This process can place a significant load on the network, so you might be wise to schedule the initial synchronization for off-peak hours.
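The synchronization idea can be boiled down to a few lines: the block-level copy yields a possibly stale snapshot, and replaying the journal of writes captured during the copy brings it current. This is purely an illustration of the concept, not LiveVault's implementation.

```python
# Conceptual sketch: bring a mid-copy snapshot up to date by replaying
# the journal of (offset, data) writes captured while the copy ran.
def synchronize(snapshot, journal):
    """snapshot -- bytes as they were copied (may be stale)
    journal  -- iterable of (offset, data) writes recorded during the copy
    Returns the current state of the data.
    """
    target = bytearray(snapshot)
    for offset, data in journal:
        target[offset:offset + len(data)] = data
    return bytes(target)
```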

For the first several hours of the synchronization process, the server showed a lot of activity, which I monitored through the server Management Console's Job Monitor object and the Tape Library Monitor, as Figure 3 shows. Alternatively, you can use alerts to keep tabs on LiveVault's activity. You can tell LiveVault to send SNMP traps or email, or use Net Send to notify you about specified conditions.

After the initial synchronization, I randomly restored a couple of files to check the integrity of the backed-up data and ensure that LiveVault had transferred the data correctly. I then began the first 5-hour test cycle and watched the activity.

Midway through the first cycle, I began receiving alert notifications that the tape library needed attention. This notification marked the first of many times that I needed to address tape library problems, such as cleaning the drive and inserting blank tapes. Although these tasks are fairly routine for tape-library maintenance, LiveVault doesn't provide functionality to make these tasks easier.

Also, LiveVault's attempts to make tape management ultrasimple result in limited flexibility in library management. As long as things are running smoothly and you follow the instructions of the software's wizards, tape management is a breeze. However, if you have preferences for how the software labels your tapes, which slots they occupy, or what data they contain, you're out of luck. This lack of flexibility became apparent when LiveVault attempted to label a bad tape. The tape wasn't usable, but LiveVault's catalog still recorded the tape as active. Consequently, LiveVault held up queued jobs, requesting that I insert the bad tape. Deleting the tape wasn't an option, and several attempts to mark it as lost caused the LiveVault Backup Service to hang. A few reboots later, I was finally able to mark the tape as lost and move on.

After the test cycle was complete, I continued to monitor LiveVault's activity while I tested restore jobs and the product's ability to save multiple file versions. The Restore Wizard simplifies the restoration process, but I was disappointed that LiveVault's catalog contained fewer file versions than I'd expected. After more testing and working with LiveVault's technical support staff, I discovered that the combination of a one-drive tape library, the large amount of data being backed up, and the mostly default policy settings was creating long queues for the tape drive. Although LiveVault did a good job of backing up the most recent versions of files to the disk cache, many versions that would ordinarily have gone to tape didn't because of the long queue. (The ability to prioritize queued items isn't a LiveVault feature.) Nonetheless, I was able to restore the few versions in the catalog without a problem.

My first attempt to restore the 16GB Exchange Server priv.edb file failed as a result of disk space constraints on the server. Initially, I had a hard time believing that I'd consumed 50GB of disk space, but then I discovered that LiveVault reads restore jobs into the disk cache before sending the jobs to the target machine. As a result, I was loading a 16GB file into disk cache alongside 24GB of current Exchange Server data and several gigabytes from the other Agent systems. To work around this problem, I disabled the policies for the Exchange Server system to let the disk cache clear, then I restored priv.edb without problems. I asked LiveVault's technical support staff about the disk cache sizing problem; they acknowledged that the disk-space calculator has some inaccuracies and explained that the company's sales engineers usually discuss server sizing requirements with customers before a purchase.

After I modified the Exchange Server system's backup policies, I ran another set of tests in which LiveVault wrote backups to tape once per day rather than four times per day. This modification decreased the load sufficiently to let LiveVault save multiple document versions. I then performed restores of several document versions, including versions of Sysmon logs that had been open throughout the test cycle. Figure 4 shows the time-slice restore versions that the Restore Wizard offered. Each restore accurately reflected the correct version. I also experimented with bandwidth throttling on some of the Agent systems. Reviewing their Sysmon logs after a test cycle, I found that the Agent systems transferred data within the limits I set. Finally, I successfully restored the system state on two of my Win2K Agent systems.


Although I ran into a few glitches, LiveVault's core functionality met my expectations. By backing up in realtime the fairly small amount of data that actually changes, LiveVault takes a unique and promising approach to backups. Yet this product has a lot of room for improvement (e.g., more flexibility and functionality with tape-library management).

In addition, this product requires ample hardware headroom. An undersized disk cache or tape library will cripple performance. However, a properly scaled LiveVault installation will provide up-to-the-minute backups of your crucial data. If you need that type of functionality, LiveVault is unquestionably worth the price.

LiveVault 2.4
Contact: LiveVault * 508-460-6670
Price: $3600 per server running LiveVault Backup Service; $1300 to $2850 per client running LiveVault Agent
Decision Summary
Pros: Reliable; intuitive and easy-to-use UI; useful online Help and logging and alerting features; valuable fundamental capabilities
Cons: Inflexible tape-library management; lacks the facility to manage drive cleaning; unreliable discovery service that relies on Microsoft remote procedure call discovery; doesn't offer job prioritization