
Automating the Retrieval of SQL Backups from Windows Azure BLOB Storage

If your database server is in Azure, then it makes sense to do backups into Azure too. SQL Server 2014 supports backups to the cloud, and does so particularly well with Managed Backup. Once your backups are safely in an Azure BLOB, then what? Mike Wood takes up the story.

SQL Server 2014 provides new ways to handle backups, especially when it comes to using the cloud as a storage location. One enhancement is the ability to back up and restore using a URL endpoint, which makes storing your backups with various providers much easier. Building on top of that feature, Microsoft also offers Managed Backup, which coordinates backups to Windows Azure. You can use it from an on-premises server as well as from SQL Servers running on Windows Azure VMs in the cloud. It’s a great way to keep a backup stored off site, or in another region, for disaster recovery purposes.

If you use Azure for your URL endpoint, or if you configure the Managed Backup feature, the result is that your backup is saved in Windows Azure BLOB storage (if you are unfamiliar with it, see the An Introduction to Windows Azure BLOB Storage article on Simple Talk). There have been several blog posts and MSDN tutorials on setting up a backup like this (see the summary below for some links), as well as on how to restore these databases, but what if you just want to get a copy of that backup file?

There are a variety of reasons why you may want to pull this file directly. For example, the Managed Backup feature has a retention policy you can set, but it has a maximum of 30 days at the time this article was written; you may need your backups for longer. Another example would be pulling a copy as part of a continuous dev/test integration workflow. Finally, you may wish to keep the files locally for auditing or compliance reasons, or simply as another aspect of your disaster recovery plans.

Whatever your reason for needing the file, rest assured there are multiple ways to retrieve it.

Using Tools

When SQL Server completes the backup operation, the resulting file is stored in Windows Azure BLOB storage. This means that we can get at that file with any of the tools useful for working with BLOB storage. Below I’m using the free Cerebrata Azure Explorer to view a backup file.


If you need to manage the backup files, or really any set of files, in your storage account, a BLOB storage tool like this will make your life a lot easier, and almost all of them let you download files. A tool like this works well if you need to browse your backups, sort out a specific problem, or deal with one-off operations against a file; however, if you are going to automate the regular retrieval of these backups, you’ll want to look at a scripting solution.

Using PowerShell

Once a backup file is in BLOB storage, it is really no different from any other file stored there. We can use PowerShell to automate the process of pulling down our recent backups. To get started, you’ll want to install the Windows Azure PowerShell cmdlets, which you can do using the Web Platform Installer.

Once the Windows Azure PowerShell cmdlets are installed, the following script can be used to retrieve files from BLOB storage:
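The original code listing did not survive on this page, so here is a minimal sketch of such a script based on the description that follows. It assumes the classic Azure PowerShell module (New-AzureStorageContext, Get-AzureStorageBlob, Get-AzureStorageBlobContent); the variable names and paths are placeholders, not the author’s originals:

```powershell
# Update these values for your environment (placeholder values shown).
$storageAccountName = "<YourStorageAccountName>"
$sourceContainer    = "<YourBackupContainerName>"
$destinationFolder  = "C:\SQLBackups"
$storageKeyFile     = "C:\Scripts\storageKeyFile.txt"

# Read the encrypted storage account key and convert it back to plain text.
# The key file was created with ConvertFrom-SecureString (covered later).
$secureKey  = Get-Content $storageKeyFile | ConvertTo-SecureString
$bstr       = [Runtime.InteropServices.Marshal]::SecureStringToBSTR($secureKey)
$storageKey = [Runtime.InteropServices.Marshal]::PtrToStringAuto($bstr)

# Create a storage context to identify the account and authenticate requests.
$context = New-AzureStorageContext -StorageAccountName $storageAccountName `
                                   -StorageAccountKey $storageKey

# List the BLOBs in the container, keep only those modified since yesterday,
# and download each one, overwriting any local copy that already exists.
Get-AzureStorageBlob -Container $sourceContainer -Context $context |
    Where-Object { $_.LastModified -gt (Get-Date).AddDays(-1) } |
    Get-AzureStorageBlobContent -Destination $destinationFolder `
                                -Context $context -Force
```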

To execute this script, update the variables at the top with your own values. Use the same storage account name you used to set up your Managed Backup or credential in SQL Server, and make sure to replace the angle brackets as well. Provide the source container name where the backup files live and an existing local destination to which you wish to pull the files. Note that storageKeyFile.txt stores the actual storage account key, but in an encrypted form. We’ll cover how to generate that file later in the article.

The script creates a storage context object using the account name and key, which indicates the storage account we wish to access and authenticates our requests. Next, the script lists all BLOBs within the source container using the Get-AzureStorageBlob cmdlet, which pulls back the list of BLOBs along with a little information about each. The script then filters that list down to only those BLOBs with a last-modified time later than yesterday. Finally, that list is piped to the Get-AzureStorageBlobContent cmdlet, which actually downloads the files. The -Force switch overwrites any local files that already exist.

The filtering by date is somewhat arbitrary, and in truth may result in missed files if the script fails to run one day. It is better either to keep track of the last time the script ran and use that specific date in the comparison, or to determine the best way to decide which files to pull for your scenario.
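As a sketch of the first approach, you could persist the time of the last successful run in a small text file (the file name here is an assumption) and filter against that instead of a fixed one-day window:

```powershell
$lastRunFile = "C:\Scripts\lastRun.txt"
$runStart    = Get-Date

# Use the recorded time of the last successful run, or fall back to
# yesterday if the script has never run before.
$lastRun = if (Test-Path $lastRunFile) {
    [DateTime]::Parse((Get-Content $lastRunFile))
} else {
    (Get-Date).AddDays(-1)
}

# ...filter BLOBs with Where-Object { $_.LastModified -gt $lastRun }
# and download them as described above...

# Record this run's start time only once the downloads have succeeded,
# so a failed run is retried over the same window next time.
$runStart.ToString("o") | Out-File $lastRunFile
```

Recording the run’s start time, rather than its end time, ensures that files uploaded while the script was running are not skipped on the next pass.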

Now that the files are local, you can incorporate them into any process you wish.

A Comment about Security

The script sample above stored the account key, which is a secret, in a key file. If someone were to get hold of the key, they could do just about anything they wanted with the storage account, short of deleting the account itself. Obviously, just storing the key in a file doesn’t secure it, but the contents of that file are actually encrypted.

To generate the key file, log in to the machine that will execute this script as the user account that will perform the automated task, i.e., the one that will run the scheduled task. Run the following PowerShell and provide the storage account key when prompted.
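This listing was also lost from the page; a minimal sketch, assuming the same key file path used earlier, would look like this:

```powershell
# Prompt for the storage account key; -AsSecureString keeps the key
# from being echoed to the console or recorded in history as plain text.
$secureKey = Read-Host "Enter the storage account key" -AsSecureString

# Encrypt the key with the Windows Data Protection API (the default when
# no -Key parameter is supplied) and write it out to the key file.
$secureKey | ConvertFrom-SecureString | Out-File "C:\Scripts\storageKeyFile.txt"
```

Because DPAPI ties the encryption to the current user and machine, this must be run on the machine, and as the account, that will execute the scheduled task.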

The account key is accepted as a SecureString object and then written out to a key file using the ConvertFrom-SecureString cmdlet. When no -Key parameter is provided to this cmdlet, the Windows Data Protection API is used, which means the string cannot be decrypted except by the same user on the same machine. With this approach we do not need to embed the account key in the script itself. If anyone got hold of the script, they would also need the key file and would have to execute the script as the account that originally encoded the key file. That said, if someone is already able to access files owned by another user and execute operations as that user, your ability to stop them is very limited.

As with any script that contains secrets or needs to be secure, be aware of what the script contains and what could be done with it if someone else found it.


To summarize, if you back up your SQL Server database to Windows Azure BLOB storage, you have many options for managing those files, including the ability to automate their retrieval using PowerShell.

To learn a little more about Managed Backup, you can read Grant Fritchey’s blog post on How to Set up Managed Backups in SQL Server 2014, as well as the MSDN documentation.
