Transfer a large DB across slow WAN

Applicability:

SQL Server 2000: Tested

SQL Server 2005: Tested

SQL Server 2008: Not Tested

SQL Server 2008R2: Not Tested

SQL Server 2012: Not Tested

Credits:

Author: ChillyDBA

Date: 23 Jul 2010

Description

Occasionally, a DBA will be met with a challenge which on the surface seems straightforward, but turns into a seemingly never-ending series of obstacles to be overcome. I had one such experience several years ago when a request came in to refresh the Development Data Warehouse from the Production version. Simple. Yeah, right......

    • The DEV server was located in the building

    • The PROD server was located 300 miles away in a managed hosting facility.

    • The only direct connection was a 100Mb WAN

    • The PROD DB was 150+ GB

    • SQL Server 2005.

oh, and.....

    • The company was a financial institution and (rightly) concerned about secure data transfer, especially as the DW, by its very nature, contained the full customer base and their financial records.

    • The company had a very strict set of processes and change controls

    • The WAN was critical to internal business processes during the working day.

    • No direct connection to PROD SQL Server on port 1433 (or any other port) other than via a 2-hop Remote Desktop connection or through business applications.

So, the main problems were DB size, security and network bandwidth. This is how we overcame them:

Compression:

We weren't yet able to utilise SQL 2008, as this had not been implemented in the production environment. Even installing a trial copy wasn't practical or possible due to change control procedures. The only option here was to use 3rd party backup software. Red-Gate SQL Backup Pro was already in use in the DEV environment but not in PROD, as the hosting company had their own backup software. A licence was therefore purchased for PROD.

This reduced the 150GB DB to around 15GB of backup file.

Security:

This was a breeze, as SQL Backup Pro had an encrypted backup facility.

Network bandwidth

Crunch time. Even a 15GB file was quite a substantial undertaking given the bandwidth restrictions.

Initial attempts to copy the file during the quiet hours met with failures, mainly out-of-memory errors, as the server didn't have enough spare capacity to transfer a file of that size - quite a feat considering the server had 64GB RAM, but that's another story.

I must admit that, due to the lengthy process so far (almost 8 weeks to get the software installed on the PROD server and run the copy trials), the task took on a life of its own. We even named it after an infamous carrier pigeon.

It seemed at that point that a network copy was out of the question. The next option considered was an external hard drive, with an option of either someone driving to get it, or of having it securely couriered. The latter option was prohibitively expensive ($5000), so a volunteer was identified. All we had to do was get the file copied to a USB hard drive. Easy! Except that the data centre had operators rather than technicians, so some very foolproof instructions had to be written on how to do the backup and file copy - memory fails me as to why we couldn't do this over RDP, but try to imagine the most illogical reason and you'd probably come close. Anyway, after writing an SSIS package (complete with a user input dialog box), we had a workable process that required nothing but a button push and a single line of data entry.

The story goes on, and on... including one attempt where the operator managed to break free of the foolproof instructions and failed to actually copy the file to the USB drive, resulting in an empty drive being retrieved. It doesn't matter how well instructions are written, someone will always manage to interpret them wrongly. Writing user instructions can be a full-time job for an entire department in larger companies, and they still manage to get it wrong sometimes.

In the end though, we did get a copy of the DB to DEV, but it took almost 6 months.

Doh!

Sometimes it becomes apparent, mostly in hindsight, that you have been too close to a problem and should really have stood back and regrouped a little.

Shortly after we finally completed the task, it occurred to me that striping the backup might produce files small enough to be copied across the network. Trials showed that 1GB could easily be achieved, and would not jeopardise operational needs.

Striping the backup produced 32 files of around 600MB each, meaning that with compression we had enough flexibility in the process to cope with almost 80% growth in source data before the copy became a problem again.
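Striping also makes the copy itself more forgiving: a dropped connection costs one ~600MB stripe rather than the whole 15GB file. As a hedged illustration (this is not part of the original process; the file naming, retry policy and size check are my assumptions), a small script can copy each stripe independently with retries:

```python
import shutil
import time
from pathlib import Path

def copy_stripes(src_dir, dst_dir, base_name, count, retries=3, delay=0):
    """Copy <base_name>_01.sqb .. <base_name>_NN.sqb from src_dir to dst_dir.

    Each stripe is retried independently, so a flaky WAN link only costs
    one stripe at a time, not the whole backup set. Returns the copied paths.
    """
    src_dir, dst_dir = Path(src_dir), Path(dst_dir)
    dst_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for i in range(1, count + 1):
        name = f"{base_name}_{i:02d}.sqb"
        src, dst = src_dir / name, dst_dir / name
        for attempt in range(1, retries + 1):
            try:
                shutil.copy2(src, dst)
                # Cheap sanity check: the copy must match the source size.
                if dst.stat().st_size != src.stat().st_size:
                    raise OSError(f"size mismatch for {name}")
                copied.append(dst)
                break
            except OSError:
                if attempt == retries:
                    raise
                time.sleep(delay)
    return copied
```

On Windows, robocopy in restartable mode would do much the same job; the point is simply that 32 small, individually retryable units survive a lossy WAN far better than one monolithic file.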

I therefore developed a set of code and instructions for future implementations (see below).

Of course, SQL 2008 and later now include native backup compression (and Transparent Data Encryption, which also protects backups of encrypted databases), so the need for a 3rd party tool is reduced, but the principles still remain valid.
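As a rough native equivalent (a sketch only, not the commands we actually ran; the database name, logical file names and paths are illustrative placeholders), a copy-only, compressed, striped backup and its restore on SQL Server 2008 or later would look something like this. Note that native backup encryption (BACKUP ... WITH ENCRYPTION) did not arrive until SQL Server 2014; before that, at-rest protection means TDE or a 3rd party tool.

```sql
-- Native striped, compressed, copy-only backup (SQL Server 2008+;
-- backup compression was Enterprise-only until 2008 R2 Standard).
BACKUP DATABASE [MyDW]
TO  DISK = N'D:\Backups\MyDW_01.bak',
    DISK = N'D:\Backups\MyDW_02.bak',
    DISK = N'D:\Backups\MyDW_03.bak',
    DISK = N'D:\Backups\MyDW_04.bak'   -- add stripes as needed, e.g. up to 32
WITH COPY_ONLY, COMPRESSION, CHECKSUM, STATS = 10;

-- Restore by listing every stripe of the media set:
RESTORE DATABASE [MyDW]
FROM DISK = N'D:\Backups\MyDW_01.bak',
     DISK = N'D:\Backups\MyDW_02.bak',
     DISK = N'D:\Backups\MyDW_03.bak',
     DISK = N'D:\Backups\MyDW_04.bak'
WITH MOVE N'MyDW_Data' TO N'E:\Data\MyDW.mdf',
     MOVE N'MyDW_Log'  TO N'F:\Logs\MyDW_log.ldf',
     RECOVERY;
```

As with the Red-Gate version, COPY_ONLY keeps the ad-hoc backup from disturbing the existing backup/log sequence.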

Code

Secure Backup with striping and compression:

-- PARAMETERS
-- <DATABASENAME> - name of the single database to be backed up
-- <FILELOCATION> - folder to which the backup files will be written
-- <PASSWORD> - replace this with the password to be used - must
--              comply with MDM security guidelines delivered as part of CR-112
-- <AUTO> - can be left untouched for an auto-generated file name or
--          replaced with a fixed file name

-- SERVER DETAILS
-- To be run in a SQL query window connected to the required
-- PRODUCTION SQL Server in Markham

DECLARE @exitcode int
DECLARE @sqlerrorcode int
EXECUTE master..sqlbackup N'-SQL
"BACKUP DATABASE [<DATABASENAME>] TO DISK = ''<FILELOCATION>\<AUTO>.sqb''
WITH COMPRESSION = 3,
FILECOUNT = 32,
COPY_ONLY,
KEYSIZE = 256,
PASSWORD = ''<PASSWORD>''"',
@exitcode OUT,
@sqlerrorcode OUT
IF (@exitcode >= 500) OR (@sqlerrorcode <> 0)
BEGIN
RAISERROR ('SQL Backup failed with exit code: %d SQL error code: %d', 16, 1, @exitcode, @sqlerrorcode)
END

--NOTE: Replace all <parameters> with operational data
-- **** Can only be run on servers with Red-Gate SQL Backup Pro
--      6.2 or greater installed ****
-- **** This backup will not interfere with or change the
--      existing backup/log sequence ****

Restore from a striped, compressed and encrypted backup (Red-Gate SQL Backup 6.2):

-- PARAMETERS
-- <DATABASENAME> - name of the single database to be restored
-- <DB_DATA_FILE_LOGICAL_NAME> - logical name of the database data file
-- <DB_LOG_FILE_LOGICAL_NAME> - logical name of the database log file
-- <RESTORE_FILE_BASE_NAME> - the assigned base file name for the
--                            32-file backup set
-- <RESTORE_LOCATION> - the folder where the backup files are located
-- <PASSWORD> - replace this with the password to be used - must
--              comply with MDM security guidelines delivered as part of CR-112

-- SERVER DETAILS
-- To be run in a SQL query window connected to the required
-- DEVELOPMENT SQL Server

DECLARE @exitcode int
DECLARE @sqlerrorcode int
EXECUTE master..sqlbackup
'-SQL "RESTORE DATABASE [<DATABASENAME>]
FROM
DISK = ''<RESTORE_LOCATION>\<RESTORE_FILE_BASE_NAME>_01.sqb'',
DISK = ''<RESTORE_LOCATION>\<RESTORE_FILE_BASE_NAME>_02.sqb'',
DISK = ''<RESTORE_LOCATION>\<RESTORE_FILE_BASE_NAME>_03.sqb'',
...
DISK = ''<RESTORE_LOCATION>\<RESTORE_FILE_BASE_NAME>_32.sqb''
WITH PASSWORD = ''<PASSWORD>'',
MOVE ''<DB_DATA_FILE_LOGICAL_NAME>'' TO ''<RESTORE_LOCATION>\<DATABASENAME>.mdf'',
... (repeat as necessary for multiple data files)
MOVE ''<DB_LOG_FILE_LOGICAL_NAME>'' TO ''<RESTORE_LOCATION>\<DATABASENAME>_log.ldf'',
... (repeat as necessary for multiple log files)
RECOVERY,
ORPHAN_CHECK"',
@exitcode OUT,
@sqlerrorcode OUT
IF (@exitcode >= 500) OR (@sqlerrorcode <> 0)
BEGIN
RAISERROR ('SQL Restore failed with exit code: %d SQL error code: %d', 16, 1, @exitcode, @sqlerrorcode)
END

--NOTE: Replace all <parameters> with operational data
-- **** Can only be run on servers with Red-Gate SQL Backup Pro
--      6.2 or greater installed ****
-- **** This restore will not interfere with or change the
--      existing backup/log sequence ****