Export from SQL Server to XLS and email results

Sometimes you want to take some query results and export them directly to an XLS file – here’s how you can set that up in SQL Server. The biggest caveat is that you need to run it from an x86 instance of SQL Server – the x64 instance won’t have access to the Jet driver needed to write the Excel file (Microsoft.Jet.OLEDB.4.0), while the x86 instance will. In fact, we maintain an older x86 instance of SQL Server for random processes like this that need it – x64 is better in almost every case, but we can’t seem to completely ditch x86… 🙂

I use a stored proc that I call from a SQL Agent Job, which works great. The actual process is a bit awkward – for starters, you’ll need access to xp_cmdshell. SQL Server can’t create a new Excel file from scratch, so you have to keep a blank Excel file around, make a copy of it, and then insert into the copy to get your final result.
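
If xp_cmdshell isn’t already enabled on the instance, it has to be switched on first, since it’s off by default. Something along these lines, run once as a sysadmin, should take care of it:

-- xp_cmdshell is disabled by default; run this once as a sysadmin to enable it
exec sp_configure 'show advanced options', 1
RECONFIGURE
exec sp_configure 'xp_cmdshell', 1
RECONFIGURE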

That said, here’s the code to generate the XLS file from your query results:

SELECT Column1, Column2, Column3, Column4
  INTO ##YourTempTable
  FROM SomeOtherTable

DECLARE @Folder VARCHAR(255), @DocumentBlank VARCHAR(255), @DocumentLong VARCHAR(255)

SET @Folder = 'C:\Temp\'
SET @DocumentBlank = 'Your Document - Blank'
SET @DocumentLong = 'Your Document - ' + CONVERT(VARCHAR(10), GETDATE(), 120)

DECLARE @CMD NVARCHAR(4000)
SET @CMD = 'COPY "' + @Folder + @DocumentBlank + '.xls" "' + @Folder + @DocumentLong + '.xls"'
exec master..xp_cmdshell @CMD

-- Export the Excel sheet
SET @CMD = 'insert into OPENROWSET(''Microsoft.Jet.OLEDB.4.0'',
	''Excel 8.0;Database=' + @Folder + @DocumentLong + '.xls;'',
	''SELECT * FROM [Sheet1$]'')
	select Column1, Column2, Column3, Column4 from ##YourTempTable'

exec sp_executesql @CMD
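
One related note: the OPENROWSET call above runs as an ad hoc distributed query, so if the instance blocks those, you may also need to enable the Ad Hoc Distributed Queries option the same way:

-- OPENROWSET against the Jet provider may also need ad hoc queries enabled
exec sp_configure 'show advanced options', 1
RECONFIGURE
exec sp_configure 'Ad Hoc Distributed Queries', 1
RECONFIGURE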

Once that’s exported, you can just set up the email process using sp_send_dbmail and attach the file you just generated:

DECLARE @Body VARCHAR(2000), @Attachments VARCHAR(255)

SET @Attachments = @Folder + @DocumentLong + '.xls'
SET @Body = 'Your file has been generated for ' + CONVERT(VARCHAR(10), GETDATE(), 120)

exec msdb..sp_send_dbmail @profile_name = 'YourMailProfile',
	@recipients = 'Recipients@YourDomain.biz',
	@subject = 'Your file is ready',
	@body = @Body,
	@file_attachments = @Attachments

Moving a SQL Server database to another server on a schedule – without using replication

Recently, I had the need to copy a set of databases from a dozen remote servers to a central server, restore them, and have it happen automatically, with no intervention from me at all. Replication wouldn’t work for the following reasons:

  1. Many tables didn’t have primary keys, so merge replication was out (even though this was only one-way replication)
  2. The size of the databases (28GB in one instance) and the quality/speed of the WAN removed the log shipping option
  3. There’s too much activity to consider any kind of live replication

Given our restrictions, we decided to go the following route. On the remote server, we set up a batch file that did the following (a rough sketch of this batch file follows the list):

  1. Use OSQL to back up the databases in question to a folder
  2. Run 7Zip from the command line to compress the backups into separate archives. To support the automatic restore later, each archive was named after the database it would be restored as on the central server (for example, Site1ProdDB was backed up to Site1ProdDB.BAK, then compressed to Site1ProdDB.7z)
  3. Delete the BAK files
  4. Rename the archives from *.7z to *.7zz (this is important – I’ll explain why in the server part)
  5. FTP the renamed archives to a folder on our central collection server, using the Windows command-line FTP tool with a script file
  6. Once the FTP is complete, rename the uploaded copies on the central collection server back from *.7zz to *.7z
  7. Delete the local *.7zz files

That’s it for the client – the BAT file was scheduled as a SQL Agent job so that we could kick it off remotely from any site we wanted, or set it up on a schedule. Then, we put a BAT file on the server that did the following (again, a rough sketch follows the list):

  1. Check folder for files that match *.7z
  2. For each one found, do the following:
    1. Extract it to a “Staging” folder
    2. Delete the 7z file for that archive
    3. Use OSQL to restore the file from the command line
    4. Use OSQL to run a script that changes the DB owner, adds some user permissions, and generally does some housework on the database
    5. Use an SMTP tool to send an email notice that the backup has been restored
  3. Repeat step 2 for every .7z file in the folder
  4. As a second step in the SQL Agent job, run “MoveLog.bat” (included below) to finish rotating the logs – it ensures that only logs with meaningful information are kept
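
Here’s a similarly rough sketch of the per-database block on the server side. Since I couldn’t get FOR loops and wildcards to play nicely with OSQL, this version handles a single named database, and the folder names, the PostRestore.sql housekeeping script, and the 7-Zip path are again placeholders:

@echo off
REM Rough sketch of the server-side steps for one database; every name and
REM path below is a placeholder

set INCOMING=C:\Incoming
set STAGING=C:\Staging
set DBNAME=Site1ProdDB

REM Only completed uploads match *.7z; in-progress uploads are still *.7zz
if exist "%INCOMING%\%DBNAME%.7z" (

    REM 1. Extract the archive into the staging folder
    "C:\Program Files\7-Zip\7z.exe" e "%INCOMING%\%DBNAME%.7z" -o"%STAGING%" -y

    REM 2. Delete the archive now that it has been extracted
    del "%INCOMING%\%DBNAME%.7z"

    REM 3. Restore the database; add MOVE clauses if the data and log file
    REM paths differ on this server
    osql -E -Q "RESTORE DATABASE [%DBNAME%] FROM DISK = '%STAGING%\%DBNAME%.BAK' WITH REPLACE"

    REM 4. Run the housekeeping script to fix the owner, permissions, etc.
    osql -E -d %DBNAME% -i "%STAGING%\PostRestore.sql"

    REM 5. Send the notification email here with your SMTP tool of choice
)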

The server BAT process can run as often as desired – in our case, we run it every 30 minutes, so the backup will be picked up and restored as soon as it’s available. That’s where the rename from the client side comes into play: If the files were named Database.7z, then the server process would attempt to pick them up while they’re being uploaded via FTP, and shenanigans would ensue. By renaming them when they’re done uploading, they become immediately available for restoring on the server side.

As I said before, I scheduled both the client (source) and the server (restore/destination) process as SQL Agent jobs – the Windows scheduler is too cumbersome to work with remotely, and kicking them off on demand was a pain. With the SQL Agent, they can be started on demand, and then I get an email notification as soon as they’ve been successfully restored.

I’ve attached the files below, and I welcome any feedback that you have or any improvements that can be made – I’m happy to give you credit and post a new version here. Specifically, I’m interested in any feedback about how to make this process more dynamic – I know BAT scripting supports FOR EACH and wildcards, but I was unable to make them work properly with OSQL, so I’d appreciate any input there. Enjoy!

Download the ZIP archive containing the files for this post