SQL 2014 Storage Solutions – Azure Blob Storage

SQL Server 2014 supports database files both on-prem and in the cloud.  This opens up a series of new possibilities that will assuredly spark debate.  Here we will dig into a few of the new scenarios and potential use cases.

SQL Server running in Azure – Files in Azure Blob Storage

The scenario where you have your SQL Server running in Azure is likely the most common, or at least the first scenario that comes to mind for this feature.  Running your database with files in Azure Blob Storage provides the kind of dynamic disk allocation that suits agile development and unknown growth patterns.  Handing storage maintenance concerns off to Azure frees IT and development to focus on their applications, which is where most developers would choose to spend their effort anyway.

SQL Server running On-Prem – Files in Azure Blob Storage

Running a database on-prem with files in Azure Blob Storage has the same managed-storage benefits, but also enables additional high availability (HA) capabilities.  For example, if the server running your database happens to crash, the database files in Azure are (potentially) accessible from anywhere, and the database can be attached to a new instance – on-prem or in Azure.  Additionally, keeping all of the data and log files in Azure Blob Storage enables AlwaysOn Availability Groups to transition between on-prem and Azure.
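
For example, after a crash the cloud-resident files could be re-attached from another instance with something like the following sketch (it assumes the files and the container credential described later in this article are already in place):

CREATE DATABASE [Test_CloudFiles]
ON
(
	FILENAME = N'https://blogdemostorage.blob.core.windows.net/data/Test_CloudFiles.mdf'
),
(
	FILENAME = N'https://blogdemostorage.blob.core.windows.net/data/Test_CloudFiles_log.ldf'
)
FOR ATTACH;
GO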

SQL Server running On-Prem – Hybrid (Data in Azure, Log On-Prem)

This scenario seems to be about tiered data storage.  The setup here is an on-prem database with an on-prem log and data files in Azure Blob Storage.  The tiered-storage piece comes into play because there would more than likely also be data on-prem.  The primary reason to put files in Azure in this setup is that they are accessed infrequently and can tolerate some latency.  An example of this is a partitioned table, whereby old partitions that are not written to and rarely read are moved into a filegroup in Azure.  The active partition and log remain on-prem to maintain read and write performance.
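
As a rough sketch of what this could look like (the database, filegroup, and file names here are hypothetical; the storage URL assumes the container and credential set up later in this article):

-- Hypothetical example: add an Azure-backed filegroup for cold partitions
ALTER DATABASE [SalesDB] ADD FILEGROUP [ArchiveFG]
GO

ALTER DATABASE [SalesDB]
ADD FILE
(
	NAME = N'SalesDB_Archive',
	FILENAME = N'https://blogdemostorage.blob.core.windows.net/data/SalesDB_Archive.ndf',
	SIZE = 4096KB,
	FILEGROWTH = 1024KB
)
TO FILEGROUP [ArchiveFG]
GO

-- Old, read-mostly partitions can then be mapped to ArchiveFG through the
-- partition scheme, while the active partition and the log stay on-prem.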

One problem with this scenario is that loss of connectivity to the files in Azure will leave the database In Recovery. 

Creating Database Files in Azure Blob Storage

The setup here is similar to steps 1 – 3 of backing up a SQL Server database to Azure Blob Storage (covered later in this article).

Setting up your storage

Step 1: Create account

Creating an Azure account is quick and easy, and there is currently a one-month trial available at http://www.windowsazure.com.  Once you request your account, it may take up to 24 hours for it to be provisioned, so keep in mind that you will not be able to begin testing immediately.

Step 2: Add storage

Now that you have your account, you will need to add your storage.  This is quick and easy, with the only steps being to name your storage, pick a location, and choose whether your storage should be geo-redundant or locally redundant.  Note that while backup files can be stored geo-redundantly, data and log files do not currently support geo-redundancy.

[Screenshot: Create Storage Account]

Step 3: Create container

In order to save any files into your storage, you will need to add a container.  To do this, click on the name of your newly created storage.  From here, you can monitor and change the configuration of your storage.  Click on the navigation for “Containers”.  In the “New Container” dialog, pick a name that reflects the files the container will hold.  When we set up a container for backups (see later in this article), we called it “sqlbackups”.  For our purposes here, we can just call this one “data”.  You also have the choice of access types – generally speaking you will leave this at the default of “Private”.

Now that you have a container created, you will see that the container has a URL.  This is the URL that you will use to point your database files at this container.  The format of this URL is [storage name].blob.core.windows.net/[container].  While the URL shows http://, https:// will also work.

[Screenshot: Storage Container – Data]

Creating the database with files in the cloud

Step 1:  Creating a credential

As with backing up a database to Azure Blob Storage, a credential needs to be in place for accessing your cloud resources.  However, this credential uses different inputs, which was not very intuitive to me.  Notes on this credential:

1.      The name of the credential must be the URL of the container (you would need two credentials if you had a data container and a log container)

2.      The IDENTITY must be set to “SHARED ACCESS SIGNATURE”

3.      The SECRET is not one of the two automatically generated access keys in the storage account, as it would be for creating a backup credential.  Rather, it is generated outside of the Azure portal using the Azure API.  For this, you will either need to write some C# to generate a secret, or use one of the Storage Explorers to generate one.  If you want to go the code route, the Azure Storage client library documentation covers it.  Otherwise, here is how to use a storage explorer:

First, you need to get your hands on a storage explorer.  From your storage account quick start guide, click on Storage Explorers.  In this example I am using the Azure Storage Explorer, which is available on CodePlex.

[Screenshot: Azure Storage Explorer – Add Account]

Click on the container you will put your files in (“data” in this case), and then click “Security”.  On the “Shared Access Signature” tab, we will generate a secret to use in the credential.

[Screenshot: Generate Shared Access Signature]

Make sure that you check the box for “write”, otherwise your credential will not have access to create the files in your container.  When you click “Generate Signature”, you will get a string that looks like this:

https://blogdemostorage.blob.core.windows.net/data?st=2014-01-13T03%3A16%3A31Z&se=2020-01-13T04%3A16%3A31Z&sr=c&sp=rw&sig=H6c7exHNUpVMY63bBh4nE5Epve5g6oP1xOkW0ybjQTQ%3D

The portion from “sr=” to the end of the string is what we will use for the SECRET in the credential.  You can use everything from “st=” to the end instead, but that will put a time-based restriction on the access.

Now that we have generated a new SECRET, let’s create that credential!

USE MASTER
GO
CREATE CREDENTIAL [https://blogdemostorage.blob.core.windows.net/data]
WITH
	IDENTITY = 'SHARED ACCESS SIGNATURE',
	SECRET = 'sr=c&sp=rw&sig=H6c7exHNUpVMY63bBh4nE5Epve5g6oP1xOkW0ybjQTQ%3D'
GO

Step 2:  Create the database with files in Azure Blob Storage

All Cloud files

CREATE DATABASE [Test_CloudFiles]
	CONTAINMENT = NONE
ON PRIMARY
(
	NAME = N'Test_CloudFiles',
	FILENAME = N'https://blogdemostorage.blob.core.windows.net/data/Test_CloudFiles.mdf',
	SIZE = 4096KB,
	FILEGROWTH = 1024KB
)
LOG ON
(
	NAME = N'Test_CloudFiles_log',
	FILENAME = N'https://blogdemostorage.blob.core.windows.net/data/Test_CloudFiles_log.ldf',
	SIZE = 1024KB,
	FILEGROWTH = 10%
)
GO

On-Prem log, cloud data

CREATE DATABASE [Test_HybridFiles]
	CONTAINMENT = NONE
ON PRIMARY
(
	NAME = N'Test_HybridFiles',
	FILENAME = N'https://blogdemostorage.blob.core.windows.net/data/Test_HybridFiles.mdf',
	SIZE = 4096KB,
	FILEGROWTH = 1024KB
)
LOG ON
(
	NAME = N'Test_HybridFiles_log',
	FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\Test_HybridFiles_log.ldf',
	SIZE = 1024KB,
	FILEGROWTH = 10%
)
GO

Testing the performance of the databases

For a quick performance test, I threw together a script that sequentially inserts 250 integers into a table. 

	create table dbo.test
		(
		id int primary key clustered
		)

	declare @x int = 0

	while (@x < 250)
	begin
		insert into dbo.test(id)
		values(@x)

		set @x = @x + 1
	end

I then ran this script against each database.
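
If you want to reproduce the timings, one simple approach (a sketch; it assumes you drop and recreate dbo.test between runs) is to capture the elapsed time around the loop:

	declare @start datetime2 = sysdatetime()
	declare @x int = 0

	while (@x < 250)
	begin
		insert into dbo.test(id)
		values(@x)

		set @x = @x + 1
	end

	select datediff(second, @start, sysdatetime()) as elapsed_seconds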

On-Prem Data / On-Prem Log  – 0 seconds

Cloud Data / On-Prem Log – 0 seconds

Cloud Data / Cloud Log – 18 seconds

Even on my internet connection, I expected this to be pretty snappy in each scenario.  When working with Azure Blob Storage, it does seem to take some usage to get the disks warmed up.  It appears that a local log brings the performance back in line, even if your only data file is in Azure Blob Storage.  Presumably this is because the log buffer flush to the transaction log occurs locally.  When I have a bit more time, perhaps I will research whether Log Writer waits are occurring when the log is in Blob Storage.
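
In the meantime, a quick way to check is to look at the cumulative WRITELOG waits in the wait stats DMV after a run, for example:

	select wait_type,
		waiting_tasks_count,
		wait_time_ms,
		signal_wait_time_ms
	from sys.dm_os_wait_stats
	where wait_type = 'WRITELOG'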

Using Azure Blob Storage for SQL Server Backups

With the release of SQL Server 2012 SP1 CU2, backing up databases directly to Azure Blob Storage is presented as an option for managing backups.  This provides a great opportunity to leverage the cloud for off-site, geo-redundant data storage for disaster recovery that is available from any location.  This article is an overview of how to get started using the cloud as a part of your DR plan.

Setting up your storage

Steps 1 – 3 here are the same as in the “Setting up your storage” section earlier in this article: create your Azure account, add your storage, and create a container.  For backups, we can name the container “sqlbackups”; its URL will follow the same [storage name].blob.core.windows.net/[container] format.

Step 4: Manage access keys – created by default

When you are in your storage account on the dashboard, you will notice “Manage Access Keys” at the bottom of the screen.  Clicking this will take you to a screen with 2 access keys pre-generated for your storage account.  You will use these to create a credential in the next step.

[Screenshot: Manage Access Keys]

Creating your backup

Step 1: Create credential

In order to perform a backup to the cloud storage, you will need to create a credential on the SQL Server instance that you wish to take backups from.  The identity of the credential is the storage account name from Azure, and the secret is one of the access keys from step 4 above.

USE master
GO

CREATE CREDENTIAL BackupCredential
WITH
	IDENTITY = 'blogdemostorage',
	SECRET = 'i/NCk4dWxUce0U+w5kPFT9m9p2UkT5ogWFYdyKEOqRNjLt32DFuu8NO7dMa6oNMqi/EZe2k+F/SuK9EbSkY1AA=='
GO

Step 2: Back up the db

With your storage account and credential created, all that is left is to issue a backup command.  The biggest difference here is that instead of backing up TO DISK, you are backing up TO URL.  The URL specified in the command is the URL that is associated with the container that you created, with the name of the database backup file appended to it.  Authentication is handled by the credential.  Since you are transmitting this file to the cloud, it is recommended that you perform your backup with compression. 

BACKUP DATABASE AdventureWorks2012
TO URL = 'https://blogdemostorage.blob.core.windows.net/sqlbackups/AdventureWorks2012.bak'
      WITH CREDENTIAL = 'BackupCredential'
     ,COMPRESSION

After a refresh of your container in Azure, your backup file will be there.  Success!
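
If you would rather verify from SQL Server than from the portal, RESTORE VERIFYONLY should accept the same URL and credential syntax, for example:

RESTORE VERIFYONLY
FROM URL = 'https://blogdemostorage.blob.core.windows.net/sqlbackups/AdventureWorks2012.bak'
WITH CREDENTIAL = 'BackupCredential'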

You may also choose to perform a backup using the SSMS GUI.  While backup TO URL exists in SQL Server 2012 SP1 CU2 and beyond, the ability to perform a backup in this manner through the GUI does not exist prior to SQL Server 2014.  After choosing URL instead of DISK as the “Back up to:” destination, you are prompted for the file name, credential, and container.  This will automatically generate the URL for backing up the database.

[Screenshot: Back Up to URL in the SSMS GUI]

Restoring a backup from Azure Blob Storage

Restoring a backup from Azure Blob Storage to your SQL Server instance (on-prem or in Azure) is just as easy as taking the backup.  Your options are to download the file or restore directly from URL.  To download the file, simply navigate in the Azure web portal to the container with your backups and click “Download”.  You can then restore the file using commands exactly as you would with any other on-prem file.  If you choose to restore from URL, the modifications mirror those for backing up: create the same credential as you did for taking the backup, and in your restore statement specify the URL and the credential.

RESTORE DATABASE AdventureWorks2012
FROM URL = 'https://blogdemostorage.blob.core.windows.net/sqlbackups/AdventureWorks2012.bak'
WITH
	CREDENTIAL = 'BackupCredential';

As with restoring a backup from disk, you can use MOVE to put the files in a different location.
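
For example, a restore that relocates both files might look like this sketch (the local paths are hypothetical, and the logical file names should be confirmed first with RESTORE FILELISTONLY):

-- Confirm the logical file names in the backup
RESTORE FILELISTONLY
FROM URL = 'https://blogdemostorage.blob.core.windows.net/sqlbackups/AdventureWorks2012.bak'
WITH CREDENTIAL = 'BackupCredential';

-- Restore, moving the files to new (hypothetical) local paths
RESTORE DATABASE AdventureWorks2012
FROM URL = 'https://blogdemostorage.blob.core.windows.net/sqlbackups/AdventureWorks2012.bak'
WITH
	CREDENTIAL = 'BackupCredential',
	MOVE 'AdventureWorks2012_Data' TO 'D:\SQLData\AdventureWorks2012.mdf',
	MOVE 'AdventureWorks2012_Log' TO 'D:\SQLLogs\AdventureWorks2012_log.ldf';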

Adopting a cloud backup strategy: Compatibility

If you choose to use Azure Blob Storage for your backup strategy, you may have an environment with instances that are not yet on SQL Server 2012 SP1 CU2 or beyond.  This gap is covered by the SQL Server Backup to Azure Tool.  Simply install this tool, point it to a folder, and it manages the transmission of files in that folder to Azure Blob Storage.  This tool also allows you to compress and encrypt backups for transmission, where the version of SQL Server that you are running may not.