Design and Implement an Azure Storage Strategy

  • 3/11/2015
In this chapter from Exam Ref 70-532 Developing Microsoft Azure Solutions, you will learn how to implement each of the Azure Storage services, how to monitor them, and how to manage access. You’ll also learn how to work with Azure SQL Database.

Azure Storage and Azure SQL Database both play an important role in the Microsoft Azure Platform-as-a-Service (PaaS) strategy for storage. Azure Storage enables storage and retrieval of large amounts of unstructured data. You can store content files such as documents and media in the Blob service, use the Table service for NoSQL data, use the Queue service for reliable messages, and use the File service for Server Message Block (SMB) file share scenarios. Azure SQL Database provides classic relational database features as part of an elastic scale service.

Objectives in this chapter:

  • Objective 4.1: Implement Azure Storage blobs and Azure files
  • Objective 4.2: Implement Azure Storage tables
  • Objective 4.3: Implement Azure Storage queues
  • Objective 4.4: Manage access
  • Objective 4.5: Monitor storage
  • Objective 4.6: Implement SQL databases

Objective 4.1: Implement Azure Storage blobs and Azure files

Azure blob storage is the place to store unstructured data of many varieties. You can store images, video files, Word documents, lab results, and any other binary file you can think of. In addition, Azure uses blob storage extensively. For instance, when you mount extra logical drives in an Azure virtual machine (VM), the drive image is actually stored by the Blob service associated with an Azure storage account. In a storage account, you can have many containers. Containers are similar to folders in that you can use them to logically group your files. You can also set security on the entire container. Each storage account can store up to 500 terabytes of data.

All blobs can be accessed through a URL format. It looks like this:

http://<storage account name>.blob.core.windows.net/<container name>/<blob name>

The Azure File service provides an alternative to blob storage for shared storage, accessible via the SMB 2.1 protocol.

Creating a container

This section explains how to create a container and upload a file to blob storage for later reading.

Creating a container (existing portal)

To create a container in the management portal, complete the following steps:

  1. Navigate to the Containers tab for your storage account in the management portal accessed via https://manage.windowsazure.com.
  2. Click Add on the command bar. If you do not yet have a container, you can click Create A Container, as shown in Figure 4-1.

    Figure 4-1

    FIGURE 4-1 The option to create a container for a storage account that has no containers

  3. Give the container a name, and select Public Blob for the access rule, as shown in Figure 4-2.

    Figure 4-2

    FIGURE 4-2 New container dialog box

  4. The URL for the container can be found in the container list, shown in Figure 4-3. You can add additional containers by clicking Add at the bottom of the page on the Containers tab.

    Figure 4-3

    FIGURE 4-3 Containers tab with a list of containers and their URLs

Creating a container (Preview portal)

To create a container in the Preview portal, complete the following steps:

  1. Navigate to the management portal accessed via https://portal.azure.com.
  2. Click Browse on the command bar.
  3. Select Storage from the Filter By drop-down list.
  4. Select your storage account from the list on the Storage blade.
  5. Click the Containers box.
  6. On the Containers blade, click Add on the command bar.
  7. Enter a name for the container, and select Blob for the access type, as shown in Figure 4-4.

    Figure 4-4

    FIGURE 4-4 The Add A Container blade

  8. The URL for the container can be found in the container list, as shown in Figure 4-5.

    Figure 4-5

    FIGURE 4-5 Containers blade with a list of containers and URLs

Finding your account access key

To access your storage account, you need the account name that was used to build the URL to the account and the primary access key. This section covers how to find the access keys for storage accounts.

Finding your account access key (existing portal)

To find your account access key using the management portal, complete the following steps:

  1. Click the Dashboard tab for your storage account.
  2. Click Manage Keys to find the primary and secondary key for managing your account, as shown in Figure 4-6. Always use the primary key for management activities (to be discussed later in this chapter).

    Figure 4-6

    FIGURE 4-6 Manage Access Keys dialog box for a storage account

Finding your account access key (Preview portal)

To find your account access key using the Preview portal, complete the following steps:

  1. Navigate to your storage account blade.
  2. Click the Keys box on the storage account blade (see Figure 4-7).

    Figure 4-7

    FIGURE 4-7 Manage Keys blade

Uploading a blob

You can upload files to blob storage using many approaches, including the Storage Client Library, the Storage REST API, storage browsing tools, and the AzCopy command-line utility.

To upload a blob using AzCopy, complete the following steps:

  1. Download AzCopy from the Microsoft download page, and run the downloaded .msi file to install it.
  2. Open a command prompt and navigate to C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy.
  3. Create a text file in a folder that is easy to get to. Insert some random text in it.
  4. In the command window, type a command that looks like this: AzCopy /Source:c:\test /Dest:https://<your account name>.blob.core.windows.net/<container name> /DestKey:<your account key> /Pattern:*.txt.
  5. Press Enter to issue the command to transfer the file.

Reading data

You can anonymously read blob storage content directly using a browser if public access to blobs is enabled. The URL to your blob content takes this format:

  • https://<your account name>.blob.core.windows.net/<your container name>/<your path and filename>

Reading blobs via a browser

Many storage browsing tools provide a way to view the contents of your blob containers. You can also navigate to the container using the existing management portal or the Preview portal to view the list of blobs. When you browse to the blob URL, the file is downloaded and displayed in the browser according to its content type.

Reading blobs using Visual Studio

You can also use Server Explorer in Visual Studio 2013 to view the contents of your blob containers and upload or download files.

  1. Navigate to the blob storage account that you want to use.
  2. Double-click the blob storage account to open a window showing a list of blobs and providing functionality to upload or download blobs.

Changing data

You can modify the contents of a blob or delete a blob by calling the Storage REST API directly, but it is more common to do this programmatically as part of an application, for example by using the Storage Client Library.

The following steps illustrate how to update a blob programmatically. Note that this example uses a block blob. The distinction between block and page blobs is discussed in “Storing data using block and page blobs” later in this chapter.

  1. Create a C# console application.
  2. In your app.config file, create a storage configuration string and entry, replacing AccountName and AccountKey with your storage account values:

        <add key="StorageConnectionString" value="DefaultEndpointsProtocol=https;AccountName=<your account name>;AccountKey=<your account key>" />
  3. Use NuGet to obtain the Microsoft.WindowsAzure.Storage.dll. An easy way to do this is by using this command in the NuGet console:

    Install-Package WindowsAzure.Storage -Version 3.0.3
  4. Create a new console application, and add the following using statements to the top of your Program.cs file:

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Auth;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.Storage.Blob;
    using System.Configuration;
  5. Add a reference to System.Configuration. Add the following code in the main entry point:

    var storageAccount = CloudStorageAccount.Parse(
        ConfigurationManager.AppSettings["StorageConnectionString"]);
  6. Use CloudBlobClient to gain access to the containers and blobs in your Azure storage account. After it is created, you can set permissions to make it publicly available:

    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
  7. Use the CreateIfNotExists method to ensure the container exists before you interact with it, and then set its access permissions:

    CloudBlobContainer container = blobClient.GetContainerReference("files");
    container.CreateIfNotExists();
    container.SetPermissions(new BlobContainerPermissions { PublicAccess =
        BlobContainerPublicAccessType.Blob });
  8. To upload a file, use the FileStream object to access the stream, and then use the UploadFromStream method on the CloudBlockBlob class to upload the file to Azure blob storage:

    CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob");
    using (var fileStream = System.IO.File.OpenRead(@"path\myfile"))
    {
        blockBlob.UploadFromStream(fileStream);
    }
  9. To list all of the blobs, use the following code:

    foreach (IListBlobItem item in container.ListBlobs(null, false))
    {
        if (item.GetType() == typeof(CloudBlockBlob))
        {
            CloudBlockBlob blob = (CloudBlockBlob)item;
            Console.WriteLine("Block blob of length {0}: {1}",
                blob.Properties.Length, blob.Uri);
        }
        else if (item.GetType() == typeof(CloudPageBlob))
        {
            CloudPageBlob pageBlob = (CloudPageBlob)item;
            Console.WriteLine("Page blob of length {0}: {1}",
                pageBlob.Properties.Length, pageBlob.Uri);
        }
        else if (item.GetType() == typeof(CloudBlobDirectory))
        {
            CloudBlobDirectory directory = (CloudBlobDirectory)item;
            Console.WriteLine("Directory: {0}", directory.Uri);
        }
    }
  10. To download a blob, get a reference to it from the container and use the DownloadToStream method:

    CloudBlockBlob blockBlob = container.GetBlockBlobReference("photo1.jpg");
    using (var fileStream = System.IO.File.OpenWrite(@"path\myfile"))
    {
        blockBlob.DownloadToStream(fileStream);
    }
  11. To delete a blob, get a reference to the blob and call Delete():

    CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob.txt");
    blockBlob.Delete();
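Assembled end to end, the preceding steps form a small console program. The sketch below combines them, assuming the StorageConnectionString setting from step 2 is configured and that the local file paths (which are placeholders) exist:

```csharp
using System;
using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class Program
{
    static void Main()
    {
        // Parse the connection string configured in app.config.
        var storageAccount = CloudStorageAccount.Parse(
            ConfigurationManager.AppSettings["StorageConnectionString"]);
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

        // Create the container if needed and allow anonymous blob reads.
        CloudBlobContainer container = blobClient.GetContainerReference("files");
        container.CreateIfNotExists();
        container.SetPermissions(new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });

        // Upload, list, download, and finally delete a blob.
        CloudBlockBlob blob = container.GetBlockBlobReference("myblob.txt");
        using (var fileStream = System.IO.File.OpenRead(@"c:\test\myfile.txt"))
        {
            blob.UploadFromStream(fileStream);
        }

        foreach (IListBlobItem item in container.ListBlobs(null, false))
        {
            Console.WriteLine(item.Uri);
        }

        using (var fileStream = System.IO.File.OpenWrite(@"c:\test\download.txt"))
        {
            blob.DownloadToStream(fileStream);
        }

        blob.Delete();
    }
}
```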

Setting metadata on a container

Blobs and containers have metadata attached to them. There are two forms of metadata:

  • System properties metadata
  • User-defined metadata

System properties can influence how the blob behaves, while user-defined metadata is your own set of name/value pairs that your applications can use. A container has only read-only system properties, while blobs have both read-only and read-write properties.

Setting user-defined metadata

To set user-defined metadata for a container, get the container reference using GetContainerReference(), and then use the Metadata member to set values. After setting all the desired values, call SetMetadata() to persist the values, as in the following example:

CloudBlobContainer container = blobClient.GetContainerReference("files");
container.Metadata["counter"] = "100";
container.SetMetadata();

Reading user-defined metadata

To read user-defined metadata for a container, get the container reference using GetContainerReference(), call FetchAttributes() to populate the values, and then access them by key through the Metadata dictionary, as in the following example:

CloudBlobContainer container = blobClient.GetContainerReference("files");
container.FetchAttributes();
Console.WriteLine("counter value: " + container.Metadata["counter"]);

Reading system properties

To read a container’s system properties, first get a reference to the container using GetContainerReference(), call FetchAttributes(), and then use the Properties member to retrieve values. The following code illustrates accessing container system properties:

CloudBlobContainer container = blobClient.GetContainerReference("files");
container.FetchAttributes();
Console.WriteLine("LastModifiedUTC: " + container.Properties.LastModified);
Console.WriteLine("ETag: " + container.Properties.ETag);

Storing data using block and page blobs

The Azure Blob service has two different ways of storing your data: block blobs and page blobs. Block blobs are great for streaming data sequentially, like video and other files. Page blobs are great for non-sequential reads and writes, like the VHD on a hard disk mentioned in earlier chapters.

Block blobs are blobs that are divided into blocks. Each block can be up to 4 MB. When uploading large files into a block blob, you can upload one block at a time in any order you want. You can set the final order of the block blob at the end of the upload process. For large files, you can also upload blocks in parallel. Each block will have an MD5 hash used to verify transfer. You can retransmit a particular block if there’s an issue. You can also associate blocks with a blob after upload, meaning that you can upload blocks and then assemble the block blob after the fact. Any blocks you upload that aren’t committed to a blob will be deleted after a week. Block blobs can be up to 200 GB.
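Each block in a block blob is identified by a Base64-encoded block ID, and all IDs within a blob must have the same encoded length. A common convention, sketched below under that assumption, is to encode a zero-padded sequence number; these are the IDs you would pass to PutBlock for each upload and then to PutBlockList to commit the blob in its final order:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

class BlockIdDemo
{
    static void Main()
    {
        var blockIds = new List<string>();
        for (int i = 0; i < 5; i++)
        {
            // Zero-pad the sequence number so every encoded ID
            // has the same length, as the Blob service requires.
            string id = Convert.ToBase64String(
                Encoding.UTF8.GetBytes(i.ToString("d6")));
            blockIds.Add(id);
            Console.WriteLine(id);
        }
        // After uploading each block with PutBlock(id, stream, null),
        // commit the final order with PutBlockList(blockIds).
    }
}
```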

Page blobs are made up of 512-byte pages. Unlike block blobs, page blob writes are done in place and are immediately committed to the file. The maximum size of a page blob is 1 terabyte. Page blobs closely mimic how hard drives behave, and in fact, Azure VMs use them for that purpose. Most of the time, you will use block blobs.
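Because page blob writes must begin and end on 512-byte page boundaries, write buffers are typically rounded up to a whole number of pages before calling a page-write method such as WritePages. The helper below is our own illustration of the rounding arithmetic, not part of the Storage API:

```csharp
using System;

class PageAlignDemo
{
    // Round a length up to the next 512-byte page boundary.
    static long RoundUpToPage(long length)
    {
        const long PageSize = 512;
        return ((length + PageSize - 1) / PageSize) * PageSize;
    }

    static void Main()
    {
        Console.WriteLine(RoundUpToPage(1));    // 512
        Console.WriteLine(RoundUpToPage(512));  // 512
        Console.WriteLine(RoundUpToPage(513));  // 1024
    }
}
```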

Streaming data using blobs

You can stream blobs by downloading to a stream using the DownloadToStream() API method. The advantage of this is that it avoids loading the entire blob into memory, for example before saving it to a file or returning it to a web request.
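As a brief sketch (assuming container is a CloudBlobContainer obtained as in the earlier steps; the blob name and path are placeholders), streaming a download looks like this:

```csharp
// Stream the blob directly into a local file; the blob contents
// are copied through the stream rather than held in memory.
CloudBlockBlob videoBlob = container.GetBlockBlobReference("video1.mp4");
using (var fileStream = System.IO.File.OpenWrite(@"c:\downloads\video1.mp4"))
{
    videoBlob.DownloadToStream(fileStream);
}
```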

Accessing blobs securely

Secure access to blob storage implies a secure connection for data transfer and controlled access through authentication and authorization.

Azure Storage supports both HTTP and secure HTTPS requests. For data transfer security, you should always use HTTPS connections. To authorize access to content, you can authenticate to your storage account and content in three different ways:

  • Shared Key Constructed from a set of fields related to the request. Computed with the HMAC-SHA256 algorithm and encoded in Base64.
  • Shared Key Lite Similar to Shared Key, but compatible with previous versions of Azure Storage. This provides backwards compatibility with code that was written against versions prior to 19 September 2009. This allows for migration to newer versions with minimal changes.
  • Shared Access Signature Grants restricted access rights to containers and blobs. You can provide a shared access signature to users you don’t trust with your storage account key. You can give them a shared access signature that will grant them specific permissions to the resource for a specified amount of time. This is discussed in a later section.

To interact with blob storage content authenticated with the account key, you can use the Storage Client Library as illustrated in earlier sections. When you create an instance of the CloudStorageAccount using the account name and key, each call to interact with blob storage will be secured, as shown in the following code:

string accountName = "ACCOUNTNAME";
string accountKey = "ACCOUNTKEY";
CloudStorageAccount storageAccount = new CloudStorageAccount(new
StorageCredentials(accountName, accountKey), true);
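The Storage Client Library computes the Shared Key signature for you, but the underlying arithmetic can be illustrated locally. The sketch below signs a stand-in string to sign with HMAC-SHA256 using a made-up account key; the real string to sign follows a strict canonicalization format defined by the service, which is elided here:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class SharedKeyDemo
{
    static void Main()
    {
        // Account keys are Base64-encoded; a made-up zero key stands in here.
        string accountKey = Convert.ToBase64String(new byte[32]);

        // A simplified stand-in for the canonicalized request string.
        string stringToSign = "GET\n\n\n/myaccount/files/myblob";

        using (var hmac = new HMACSHA256(Convert.FromBase64String(accountKey)))
        {
            byte[] signatureBytes = hmac.ComputeHash(
                Encoding.UTF8.GetBytes(stringToSign));
            string signature = Convert.ToBase64String(signatureBytes);
            // The result travels in the Authorization header:
            // Authorization: SharedKey myaccount:<signature>
            Console.WriteLine(signature);
        }
    }
}
```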

Implementing an async blob copy

The Blob service provides a feature for asynchronously copying blobs from a source blob to a destination blob. You can run many of these requests in parallel since the operation is asynchronous. The following scenarios are supported:

  • Copying a source blob to a destination blob with a different name or URI
  • Overwriting a blob with the same blob, which means copying from the same source URI and writing to the same destination URI (this overwrites the blob, replaces metadata, and removes uncommitted blocks)
  • Copying a snapshot to a base blob, for example to promote the snapshot to restore an earlier version
  • Copying a snapshot to a new location, creating a new, writable blob (not a snapshot)

The copy operation is always the entire length of the blob; you can’t copy a range.

The following code illustrates a simple example for creating a blob and then copying it asynchronously to another destination blob:

CloudBlobContainer files = blobClient.GetContainerReference("files");
CloudBlockBlob sourceBlob = files.GetBlockBlobReference("filetocopy.txt");
sourceBlob.Properties.ContentType = "text/plain";
string sourceFileContents = "my text blob to copy";
byte[] sourceBytes = new byte[sourceFileContents.Length * sizeof(char)];
System.Buffer.BlockCopy(sourceFileContents.ToCharArray(), 0, sourceBytes, 0,
    sourceBytes.Length);
sourceBlob.UploadFromByteArray(sourceBytes, 0, sourceBytes.Length);

CloudBlockBlob blobCopy = files.GetBlockBlobReference("destinationcopy.txt");
AsyncCallback cb = new AsyncCallback(
    x => Console.WriteLine("copy completed with {0}", x.IsCompleted));
blobCopy.BeginStartCopyFromBlob(sourceBlob.Uri, cb, null);

Ideally, you pass state to the BeginStartCopyFromBlob() method so that you can track multiple parallel operations.
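For example (reusing files and sourceBlob from the preceding snippet), the state argument surfaces in the callback through IAsyncResult.AsyncState, which lets each of several parallel copies identify itself:

```csharp
// Pass the destination name as state so the callback can report
// which of many parallel copy operations completed.
CloudBlockBlob blobCopy = files.GetBlockBlobReference("destinationcopy.txt");
AsyncCallback cb = new AsyncCallback(result =>
    Console.WriteLine("copy of {0} completed", (string)result.AsyncState));
blobCopy.BeginStartCopyFromBlob(sourceBlob.Uri, cb, "destinationcopy.txt");
```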

Configuring the Content Delivery Network

The Azure Content Delivery Network (CDN) distributes content across geographic regions to edge nodes across the globe. The CDN caches publicly available objects so they are available over high-bandwidth connections close to users, allowing them to be downloaded at much lower latency. You may be familiar with using CDNs to download popular JavaScript frameworks such as jQuery and Angular.

By default, blobs have a seven-day time-to-live (TTL) at the CDN edge node. After that time elapses, the blob is refreshed from the storage account to the edge node. Blobs that are shared via CDN must support anonymous access.
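One way to influence caching behavior is to set the blob's CacheControl system property before the CDN serves it. The snippet below is a sketch, assuming an existing container reference; the one-day max-age value is just an illustration:

```csharp
// Ask downstream caches, including the CDN edge, to cache for one day.
CloudBlockBlob blob = container.GetBlockBlobReference("logo.png");
blob.Properties.CacheControl = "public, max-age=86400";
blob.SetProperties();
```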

Configuring the CDN (existing portal)

To enable the CDN for a storage account in the management portal, complete the following steps:

  1. In the management portal, click New on the navigation bar.
  2. Select App Services, CDN, Quick Create.
  3. Select the storage account that you want to add CDN support for, and click Create.
  4. Navigate to the CDN properties by selecting it from your list of CDN endpoints.
  5. To enable HTTPS support, click Enable HTTPS at the bottom of the page.
  6. To enable query string support, click Enable Query String Support at the bottom of the page.
  7. To map a custom domain to the CDN endpoint, click Manage Domains at the bottom of the page, and follow the instructions.

To access blobs via CDN, use the CDN address as follows:

http://<your CDN subdomain>.vo.msecnd.net/<your container name>/<your blob path>

If you are using HTTPS and a custom domain, address your blobs as follows:

https://<your domain>/<your container name>/<your blob path>

Configuring the CDN (Preview portal)

You currently cannot configure the CDN using the Preview portal.

Designing blob hierarchies

Blob storage has a hierarchy that involves the following aspects:

  • The storage account name, which is part of the base URI
  • The container within which you store blobs, which is also used for partitioning
  • The blob name, which can include path elements separated by a forward slash (/) to create a sense of folder structure

Using a blob naming convention that resembles a directory structure provides you with additional ways to filter your blob data directly from the name. For example, to group images by their locale to support a localization effort, complete the following steps:

  1. Create a container called images.
  2. Add English bitmaps using the convention en/bmp/*, where * is the file name.
  3. Add English JPEG files using the convention en/jpg/*, where * is the file name.
  4. Add Spanish bitmaps using the convention sp/bmp/*, where * is the file name.
  5. Add Spanish JPEG files using the convention sp/jpg/*, where * is the file name.

To retrieve all images in the container, use ListBlobs() in this way:

var list = images.ListBlobs(null, true, BlobListingDetails.All);

The output is the entire flat list of uploaded images in the container.

To filter only those with the prefix en, use this:

var list = images.ListBlobs("en", true, BlobListingDetails.All);

The output then includes only the blobs whose names begin with the en prefix.

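The prefix match in ListBlobs() operates on the blob names themselves. Its effect can be mimicked locally with plain string filtering over names that follow the convention above (a standalone illustration, not the Storage API):

```csharp
using System;
using System.Linq;

class PrefixDemo
{
    static void Main()
    {
        // Blob names following the locale/format convention.
        string[] names =
        {
            "en/bmp/logo.bmp", "en/jpg/logo.jpg",
            "sp/bmp/logo.bmp", "sp/jpg/logo.jpg"
        };

        // Equivalent in spirit to images.ListBlobs("en", true, ...).
        foreach (string name in names.Where(n => n.StartsWith("en/")))
        {
            Console.WriteLine(name);
        }
    }
}
```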
Configuring custom domains

By default, the URL for accessing the Blob service in a storage account is https://<your account name>.blob.core.windows.net. You can map your own domain or subdomain to the Blob service for your storage account so that users can reach it using the custom domain or subdomain.

Scaling Blob storage

Blobs are partitioned by container name and blob name, which means each blob can have its own partition. Blobs, therefore, can be distributed across many servers to scale access even though they are logically grouped within a container.

Working with Azure File storage

Azure File storage provides a way for applications to share storage accessible via the SMB 2.1 protocol. It is particularly useful for VMs and cloud services as a mounted share, and applications can use the File Storage API to access File storage.
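With a release of the Storage Client Library that includes File service support (4.x and later), a share can be created and populated much like a blob container. A sketch follows, with placeholder names and an assumed authenticated storageAccount:

```csharp
// Create a file share and upload a small file through the File Storage API.
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("contosofiles");
share.CreateIfNotExists();

CloudFileDirectory root = share.GetRootDirectoryReference();
CloudFile file = root.GetFileReference("readme.txt");
file.UploadText("shared via SMB and the File Storage API");
```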

Objective summary

  • A blob container has several options for access permissions. When set to Private, all access requires credentials. When set to Public Container, no credentials are required to access the container and its blobs. When set to Public Blob, only blobs can be accessed without credentials if the full URL is known.
  • To access secure containers and blobs, you can use the storage account key or a shared access signature.
  • AzCopy is a useful utility for activities such as uploading blobs, transferring blobs from one container or storage account to another, and performing these and other activities related to blob management in scripted batch operations.
  • Block blobs allow you to upload, store, and download large blobs in blocks up to 4 MB each. The size of the blob can be up to 200 GB.
  • You can use a blob naming convention akin to folder paths to create a logical hierarchy for blobs, which is useful for query operations.

Objective review

Answer the following questions to test your knowledge of the information in this objective. You can find the answers to these questions and explanations of why each answer choice is correct or incorrect in the “Answers” section at the end of this chapter.

  1. Which of the following is not true about metadata? (Choose all that apply.)

    1. Both containers and blobs have writable system properties.
    2. Blob user-defined metadata is accessed as a key value pair.
    3. System metadata can influence how the blob is stored and accessed in Azure Storage.
    4. Only blobs have metadata; containers do not.
  2. Which of the following are valid differences between page blobs and block blobs? (Choose all that apply.)

    1. Page blobs are much faster for all operations.
    2. Block blobs allow files to be uploaded and assembled later. Blocks can be resubmitted individually.
    3. Page blobs are good for all sorts of files, like video and images.
    4. Block blobs have a max size of 200 GB. Page blobs can be 1 terabyte.
  3. What are good recommendations for securing files in Blob storage? (Choose all that apply.)

    1. Always use SSL.
    2. Keep your primary and secondary keys hidden and don’t give them out.
    3. In your application, store them someplace that isn’t embedded in client-side code that users can see.
    4. Make the container publicly available.