Moving a SQL Server database to another server on a schedule – without using replication

Recently, I had the need to copy a set of databases from a dozen remote servers to a central server, restore them, and have it happen automatically, with no intervention from me at all. Replication wouldn’t work for the following reasons:

  1. Many tables didn’t have primary keys, so merge replication was out (even though this was only one-way replication)
  2. The size of the databases (28GB in one instance) and the quality/speed of the WAN removed the log shipping option
  3. There’s too much activity to consider any kind of live replication

Given our restrictions, we decided to go the following route. On each remote server, we set up a batch file that did the following (a rough sketch of the batch file follows the list):

  1. Use OSQL to back up the databases in question to a folder
  2. Run 7-Zip from the command line to compress each backup into its own archive. To make the automated restore easier later, each archive was named after the database we wanted it restored as on the central server (for example, Site1ProdDB was backed up to Site1ProdDB.BAK, then compressed to Site1ProdDB.7z)
  3. Delete the BAK files
  4. Rename the archives from *.7z to *.7zz (this is important – I’ll explain why in the server part)
  5. Upload the archives to a folder on our central collection server using the Windows command-line FTP client and a script file
  6. Once the upload is complete, use the FTP session to rename the archives on the central server from *.7zz back to *.7z
  7. Delete the local *.7zz files
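
To make that concrete, here’s a rough sketch of the client side as a single batch file, plus the script it feeds to the Windows FTP client. It covers one database only, and the server name, folders, and FTP credentials are placeholders – the attached ZIP contains the actual scripts.

    REM ship-backup.bat - client-side sketch; paths, names, and credentials
    REM below are placeholders, not the exact contents of the attached scripts.
    SET DB=Site1ProdDB
    SET WORK=D:\Backups

    REM 1. Back up the database locally with OSQL (trusted connection)
    osql -E -S localhost -Q "BACKUP DATABASE [%DB%] TO DISK = '%WORK%\%DB%.BAK' WITH INIT"

    REM 2. Compress the backup into its own archive, named after the database
    "C:\Program Files\7-Zip\7z.exe" a "%WORK%\%DB%.7z" "%WORK%\%DB%.BAK"

    REM 3. Delete the uncompressed backup
    DEL "%WORK%\%DB%.BAK"

    REM 4. Rename to .7zz so the server-side job ignores the file while it uploads
    REN "%WORK%\%DB%.7z" "%DB%.7zz"

    REM 5/6. Upload via scripted FTP; the script renames the file back to .7z
    REM on the central server once the transfer has finished
    ftp -n -s:upload.ftp

    REM 7. Delete the local copy
    DEL "%WORK%\%DB%.7zz"

The upload.ftp script looks something like this:

    open central-server.example.com
    user backupftp SomePassword
    binary
    cd incoming
    put D:\Backups\Site1ProdDB.7zz Site1ProdDB.7zz
    rename Site1ProdDB.7zz Site1ProdDB.7z
    bye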

That’s it for the client – the BAT file was scheduled as a SQL Agent job so that we could kick it off remotely from any site we wanted, or run it on a schedule. Then, we put a BAT file on the server that did the following (again, a sketch follows the list):

  1. Check folder for files that match *.7z
  2. For each one found, do the following:
    1. Extract it to a “Staging” folder
    2. Delete the 7z file for that archive
    3. Use OSQL to restore the database from the extracted backup file
    4. Use OSQL to run a script that changes the DB owner, adds some user permissions, and generally does some housekeeping on the database
    5. Use an SMTP tool to send an email notice that the backup has been restored
  3. Repeat step 2 for every .7z file in the folder
  4. As a second step in the SQL Agent job, run “MoveLog.bat” (included below) to finish rotating the logs – it ensures that only logs with meaningful information are kept
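
Spelled out for a single database, the restore side looks roughly like the sketch below – the folder names, the housekeeping script name (PostRestore.sql), and the mail tool shown are placeholders, not the exact contents of the attached files.

    REM restore-backup.bat - server-side sketch for one database
    SET DB=Site1ProdDB
    SET DROP=D:\FTPRoot\incoming
    SET STAGE=D:\Staging

    REM Skip this database if no new archive has arrived
    IF NOT EXIST "%DROP%\%DB%.7z" GOTO :EOF

    REM 1. Extract the archive into the staging folder
    "C:\Program Files\7-Zip\7z.exe" e "%DROP%\%DB%.7z" -o%STAGE% -y

    REM 2. Delete the archive so the next run doesn't pick it up again
    DEL "%DROP%\%DB%.7z"

    REM 3. Restore over the existing copy (assumes the file layout matches;
    REM you may need WITH MOVE, or to drop active connections first)
    osql -E -S localhost -Q "RESTORE DATABASE [%DB%] FROM DISK = '%STAGE%\%DB%.BAK' WITH REPLACE"

    REM 4. Housekeeping - change the owner, add permissions, and so on
    osql -E -S localhost -d %DB% -i "%STAGE%\PostRestore.sql"

    REM 5. Notify - blat is just one example of a command-line SMTP mailer
    blat -to dba@example.com -f sqlagent@example.com -server smtp.example.com -subject "%DB% restored" -body "Backup restored successfully"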

The server BAT process can run as often as desired – in our case, we run it every 30 minutes, so each backup is picked up and restored shortly after it arrives. That’s where the renaming on the client side comes into play: if the files were uploaded as Database.7z, the server process could try to pick them up while they were still being transferred via FTP, and shenanigans would ensue. Because they’re uploaded as *.7zz and only renamed to *.7z once the transfer has finished, they become available for restore on the server side the moment they’re complete – and never sooner.

As I said before, I scheduled both the client (source) and the server (restore/destination) processes as SQL Agent jobs – the Windows Task Scheduler is too cumbersome to work with remotely, and kicking the jobs off on demand was a pain. With SQL Agent, they can be started on demand, and I get an email notification as soon as the databases have been successfully restored.

I’ve attached the files below, and I welcome any feedback you have or any improvements that can be made – I’m happy to give you credit and post a new version here. Specifically, I’m interested in feedback about how to make this process more dynamic – I know batch scripting supports FOR loops and wildcards, but I was unable to make them work properly with OSQL, so I’d appreciate any input there. Enjoy!
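
For anyone who wants to experiment with that, here’s roughly the shape a loop-driven restore might take – strictly a sketch with placeholder paths (which must not contain spaces as written), and not one of the attached files:

    REM restore-all.bat - sketch: restore a database named after each .7z found
    SET DROP=D:\FTPRoot\incoming
    SET STAGE=D:\Staging

    FOR %%F IN (%DROP%\*.7z) DO (
        REM %%~nF is the file name without path or extension, e.g. Site1ProdDB
        "C:\Program Files\7-Zip\7z.exe" e "%%F" -o%STAGE% -y
        DEL "%%F"
        osql -E -S localhost -Q "RESTORE DATABASE [%%~nF] FROM DISK = '%STAGE%\%%~nF.BAK' WITH REPLACE"
        osql -E -S localhost -d %%~nF -i "%STAGE%\PostRestore.sql"
    )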

Download the ZIP archive containing the files for this post

5 thoughts on “Moving a SQL Server database to another server on a schedule – without using replication”

  1. Hi, this is a great article. But may I ask you a question? Since you have the experience, you may have the answer for me.

    We have a similar situation, but we don’t need to copy the whole database – we only need to copy certain tables’ data back and forth between the client and master databases. Do you have any suggestions on how to approach this?

    Thanks a lot if you can help me out!

    Hannah

    1. Though the process I’ve written here takes a backup of the entire database, it could be modified to export a particular table (or take any other steps, I suppose):

      1. Instead of taking a backup of the database, execute the necessary T-SQL (or use a tool like bcp – see the sketch after this list) to export your table to a CSV file. Repeat for each table you want to ship.
      2. Zip the files as I do in this process and ship them to your remote server
      3. Alter the process on the destination server to extract and import the files, rather than restoring the database.
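
      As a rough illustration of the export/import step, bcp handles the CSV side well – the table, database, and file names here are made up for the example:

          REM On the source server: export one table to a CSV file
          bcp "Site1ProdDB.dbo.Orders" out "D:\Backups\Orders.csv" -c -t"," -S localhost -T

          REM On the destination server, once the file has been shipped and extracted
          bcp "CentralDB.dbo.Orders" in "D:\Staging\Orders.csv" -c -t"," -S localhost -T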

      If you need some help modifying the process, I can put some steps together and update this article with the details, but it should be possible to do what you’re looking for.

      This process essentially recreates snapshot replication, which is obviously easier to use, but in cases where you can’t use replication for some reason, this will meet the need.

  2. Hi Ryan,

    Thanks a lot for your reply.

    Here are my requirements:

    We have many sites (above 70, and growing) which will need to work offline at times (no internet, no network). So we need to install some sort of lightweight local database on these computers so they can easily work offline. These databases will only have the few tables they need, and the sites need to read and write data to them daily while offline.

    We also have a master SQL Server 2008 database that needs to sync data with the above client databases daily, once they are back on the network.

    So the process would be something like this (table1 and table2 have the same structure in the client and master DBs):

    • Client DB: export table1 to file1 in the folder named “Client” on the shared network drive; then delete all data from table1 and table2; then import data into table2 from file2 in the folder named “Server” on the shared network drive.

    • Master DB: export table2 data to file2 in the folder named “Server” on the shared network drive; import table1 data from file1, which is in the “Client” folder on the network drive.

    So my questions are:
    1. What kind of lightweight database should we install on the client machines – SQL Server Express/CE/Lite or something else – so that we can sync the client and master databases easily?

    2. How should we sync the client and master database data automatically each day? In other words, how can we implement the process above? I know these databases don’t have the SQL Server Agent service to schedule jobs to run daily.

    I did some research online and found that linked servers or an SSIS package might work for my situation, but I think your approach is the best I’ve found so far. The only thing is that the lightweight client database may not be able to run automatic jobs to export the data to a CSV file.

    So what do you think is the best way to solve my situation? Thanks again for your help!

    Hannah

  3. Hi Ryan,

    I am not able to access your website, and I am also not able to open the hyperlink given in your blog.

    1. Thanks for letting me know! I recently moved to a new host and one of my domains wasn’t set up correctly, causing many of the links to be broken. The link to the Microsoft documentation on command line FTP had moved as well, so I’ve also updated that.

      I appreciate your help.
