## Overview ##
In this lab, you will learn the basics of Windows Azure Storage: how to create and configure storage accounts, and how to programmatically access the different types of storage services. Blobs, Tables, and Queues are all available as part of a Windows Azure Storage account and provide durable storage on the Windows Azure platform. These services are accessible from both inside and outside the Windows Azure platform by using the Windows Azure Storage Client SDK, or via URI using the [REST APIs](http://msdn.microsoft.com/en-us/library/dd179355.aspx).
You will learn how the following services work:
Table Storage
Table storage is a collection of row-like entities, each of which can contain up to 255 properties. Unlike tables in a database, there is no schema that enforces a certain set of values on all the rows within a table, and there is no way to represent relationships between data. Entities in Windows Azure Storage tables are more like rows within a spreadsheet application such as Excel than rows within a database such as SQL Database, in that each row can contain a different number of columns, and different data types, than the other rows in the same table.
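As a rough illustration of this flexibility (not part of the lab steps), the following sketch stores two entities with completely different properties in the same table. The table name, keys, and property names are hypothetical; it assumes the same storage client library used later in this lab.

````C#
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public static class TableFlexibilitySample
{
    // Sketch: two entities in one table, each with a different set of properties.
    public static void InsertHeterogeneousEntities(CloudStorageAccount account)
    {
        CloudTable table = account.CreateCloudTableClient().GetTableReference("Samples");
        table.CreateIfNotExists();

        var movie = new DynamicTableEntity("Movies", "the-hobbit");
        movie.Properties["Title"] = EntityProperty.GeneratePropertyForString("The Hobbit");
        movie.Properties["DurationMinutes"] = EntityProperty.GeneratePropertyForInt(169);

        var song = new DynamicTableEntity("Songs", "clocks");
        song.Properties["Artist"] = EntityProperty.GeneratePropertyForString("Coldplay");

        // Both inserts succeed even though the two entities share no columns.
        table.Execute(TableOperation.Insert(movie));
        table.Execute(TableOperation.Insert(song));
    }
}
````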
Blob Storage
Blobs provide a way to store large amounts of unstructured, binary data, such as video, audio, images, etc. One of the features of blobs is streaming content such as video or audio.
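A minimal sketch of writing and reading a blob as a stream is shown below; the container, blob, and file names are placeholders and are not part of the lab steps.

````C#
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BlobStreamingSample
{
    // Sketch: upload a local file to a block blob and read it back as a stream.
    public static void UploadAndDownload(CloudStorageAccount account)
    {
        CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("media");
        container.CreateIfNotExists();

        CloudBlockBlob blob = container.GetBlockBlobReference("clip.mp4");
        using (FileStream source = File.OpenRead(@"C:\temp\clip.mp4"))
        {
            blob.UploadFromStream(source);
        }

        using (MemoryStream destination = new MemoryStream())
        {
            blob.DownloadToStream(destination);
        }
    }
}
````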
Queue Storage
Queues provide storage for passing messages between applications. Messages stored in a queue are limited to a maximum of 64 KB in size, and are generally stored and retrieved on a first in, first out (FIFO) basis (however, FIFO is not guaranteed). Processing messages from a queue is a two-stage process: you get the message, and then delete it after it has been processed. This pattern allows you to implement guaranteed message delivery by leaving the message in the queue until it has been fully processed.
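A minimal sketch of this two-stage pattern, assuming a hypothetical queue name and the storage client library used later in the lab:

````C#
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public static class QueueProcessingSample
{
    // Sketch: get a message (stage one), process it, then delete it (stage two).
    public static void ProcessOneMessage(CloudStorageAccount account)
    {
        CloudQueue queue = account.CreateCloudQueueClient().GetQueueReference("notifications");
        queue.CreateIfNotExists();

        CloudQueueMessage message = queue.GetMessage();
        if (message != null)
        {
            // ... process the message here; if processing fails, the message
            // becomes visible again after its visibility timeout expires ...
            queue.DeleteMessage(message);
        }
    }
}
````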
### Objectives ###

In this hands-on lab, you will learn how to:

- Create a Storage Account.
- Learn the different configuration options for Geo-Replication, Monitoring, and Logging.
- Access Tables, Blobs, and Queues using the Windows Azure SDK 2.0 in an MVC Web Application.
### Prerequisites ###

The following is required to complete this hands-on lab:

- [Microsoft Visual Studio Express 2012 for Web][1]
- [Windows Azure Tools for Microsoft Visual Studio 2.0][2]
- A Windows Azure subscription - sign up for a free trial

### Setup ###

Perform the following steps to set up your environment:

- Open a Windows Explorer window and browse to the Source folder of this lab.
- Execute the Setup.cmd file with Administrator privileges to launch the setup process. This process will configure your environment and install the Visual Studio code snippets for this lab.
- If the User Account Control dialog is shown, confirm the action to proceed.

Note: Make sure you have checked all the dependencies for this lab before running the setup.

### Using the Code Snippets ###

Throughout the lab document, you will be instructed to insert code blocks. For your convenience, most of this code is provided as Visual Studio Code Snippets, which you can use from within Visual Studio 2012 to avoid having to add it manually.
## Exercises ##
This hands-on lab includes the following exercises:
- Exercise 1 - Creating a Windows Azure Storage Account
- Exercise 2 - Managing a Windows Azure Storage Account
- Exercise 3 - Understanding the Windows Azure Storage Abstractions
- Exercise 4 - Introducing SAS (Shared Access Signature)
- Exercise 5 - Updating SAS to use Stored Access Policies
Note: Each exercise is accompanied by a starting solution. These solutions are missing some code sections that are to be completed throughout each exercise and therefore will not necessarily work if you run them directly. Inside each exercise you will also find an end folder with the solution you should obtain after completing the exercises. You can use this solution as a guide if you need additional help working through the exercises.
Estimated time to complete this lab: 60 minutes.
### Exercise 1: Creating a Windows Azure Storage Account ###

Note: When you first start Visual Studio, you must select one of the predefined settings collections. Every predefined collection is designed to match a particular development style and determines window layouts, editor behavior, IntelliSense code snippets, and dialog box options. The procedures in this lab describe the actions necessary to accomplish a given task in Visual Studio when using the General Development Settings collection. If you choose a different settings collection for your development environment, there may be differences in these procedures that you need to take into account.
This exercise describes how to create a storage account in the Windows Azure Management Portal. To store files and data in the storage services in Windows Azure, you must create a storage account in the geographic region where you want to store the data.
#### Task 1 - Creating a Storage Account from Management Portal ####

Note: A storage account can contain up to 100 TB of blob, table, and queue data. You can create up to five storage accounts for each Windows Azure subscription.
In this task you will learn how to create a new Storage Account using the Management Portal.
- Navigate to http://manage.windowsazure.com using a Web browser and sign in using the Microsoft Account associated with your Windows Azure account.

  Logging in to the Management Portal

- In the menu located at the bottom, select New | Data Services | Storage | Quick Create to start creating a new Storage Account. Enter a unique name for the account and select a Region from the list. Click OK to continue.

  Creating a new storage account

- In the Storage section, you will see the Storage Account you created with a Creating status. Wait until it changes to Online before continuing with the following step.

  Storage Account created

- Click the storage account name you created. You will enter the Dashboard page, which provides information about the status of the account and the service endpoints that can be used within your applications.

  Displaying the Storage Account Dashboard
In the next exercise, you will configure the storage account: you will enable Geo-Replication, configure Monitoring and Logging, and manage the Access Keys.
### Exercise 2: Managing a Windows Azure Storage Account ###

In this exercise, you will configure the common settings for your storage account. You will manage your Access Keys, enable Geo-Replication, and configure Monitoring and Logging.
#### Task 1 - Enabling Geo-Replication ####

Geo-replication replicates the stored content to a secondary location to enable failover to that location in case of a major disaster in the primary location. The secondary location is in the same region, but is hundreds of miles from the primary location. This is the highest level of storage durability, known as geo redundant storage (GRS). Geo-replication is turned on by default.
- In the Storage Account page, click the Configure tab in the top menu.

  Configuring Storage Account

- In the Geo-Replication section, you can choose to enable or disable geo-replication.

  Enabling Geo-Replication
Note: If you turn off geo-replication, you have locally redundant storage (LRS). For locally redundant storage, account data is replicated three times within the same data center. LRS is offered at discounted rates. Be aware that if you turn off geo-replication, and you later change your mind, you will incur a one-time data cost to replicate your existing data to the secondary location.
#### Task 2 - Configuring Monitoring ####

From the Monitoring section, you can monitor your storage accounts in the Windows Azure Management Portal. For each storage service associated with the storage account (Blob, Queue, and Table), you can choose the level of monitoring - minimal or verbose - and specify the appropriate data retention policy.
- In the Configure page, go to the Monitoring section.

  Configuring Monitoring Options

- To set the monitoring level, select one of the following:

  - Minimal - Collects metrics such as ingress/egress, availability, latency, and success percentages, which are aggregated for the Blob, Table, and Queue services.
  - Verbose - In addition to the minimal metrics, this setting collects the same set of metrics for each storage operation in the Windows Azure Storage Service API. Verbose metrics enable closer analysis of issues that occur during application operations.
  - Off - Turns off monitoring. Existing monitoring data is persisted through the end of the retention period.
Note: There are costs considerations when you select monitoring. For more information, see Storage Analytics and Billing.
- To set the data retention policy, in Retention (in days), type the number of days that data should be retained, from 1 to 365. If you enter zero, no retention policy is set and it is up to you to delete the monitoring data.
Note: It is recommended to set a retention policy based on how long you want to retain storage analytics data for your account so that old and unused analytics data can be deleted by the system at no cost.
- Once Monitoring is enabled, you can customize the Dashboard to choose up to six metrics to plot on the metrics chart. There are nine available metrics for each service. To do so, go to the Dashboard page.

- In the Dashboard page, you will see the default metrics displayed on the chart. To add a different metric, click the More button to display the available metrics. Select one from the list.

  Adding Metrics to the Dashboard

  Note: You can hide metrics that are plotted on the chart by clearing the check box next to the metric header.

- By default, the chart shows trends, displaying only the current value of each metric (the Relative option at the top of the chart). To display a Y axis to see absolute values, select Absolute.

  Changing Chart values to Absolute

- To change the time range displayed on the chart, select 6 hours or 24 hours at the top of the chart.

  Changing Chart Time Ranges

- Click Monitor in the top menu. On the Monitor page, you can view the full set of metrics for your storage account.

- By default, the metrics table displays a subset of the metrics that are available for monitoring. The illustration shows the default Monitor display for a storage account with verbose monitoring configured for all three services. Click the Add Metrics button in the bottom menu.

  Adding Metrics

- In the dialog box, you can choose from a list of different types of metrics for each service. Select the metrics you want to display in the Monitor table and click OK to continue.

  Select Metrics to Monitor dialog

- The metrics you selected will be displayed in the Monitor table.

- You can delete a metric by selecting it and clicking the Delete Metric button in the bottom menu.

  Deleting a Metric
#### Task 3 - Configuring Logging ####

You can save diagnostic logs for Read Requests, Write Requests, and/or Delete Requests, and you can set the data retention policy for each of the services. In this task you will configure logging for your storage account.
- In the Configure page, go to the Logging section.

- For each service (Blob, Table, or Queue), you can configure the types of request to log: Read Requests, Write Requests, and Delete Requests. You can also configure the number of days to retain the logged data. Enter zero if you do not want to set a retention policy. If you do not set a retention policy, it is up to you to delete the logs.

  Configuring Logging Options
Note: The diagnostics logs are saved in a blob container named $logs in your storage account. For information about accessing the $logs container, see About Storage Analytics Logging.
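As a rough sketch (not part of the lab steps), the log blobs written to the $logs container can be enumerated with the storage client library:

````C#
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class StorageLogsSample
{
    // Sketch: list every blob written to the $logs container.
    public static void ListLogBlobs(CloudStorageAccount account)
    {
        CloudBlobContainer logs = account.CreateCloudBlobClient().GetContainerReference("$logs");
        foreach (IListBlobItem item in logs.ListBlobs(null, true))
        {
            Console.WriteLine(item.Uri);
        }
    }
}
````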
#### Task 4 - Managing Access Keys ####

When you create a storage account, Windows Azure generates two 512-bit storage access keys, which are used for authentication when the storage account is accessed. By providing two storage access keys, Windows Azure enables you to regenerate the keys with no interruption to your storage service.
- In the Storage Account Dashboard, select the Manage Access Keys option in the bottom menu.

  Managing Access Keys

- You can use Manage Keys to copy a storage access key to use in a connection string. The connection string requires the storage account name and a key to use in authentication. Take note of the Primary access key and the storage account name, as they will be used in the following exercise.

  Copying Access Keys

- Clicking the Regenerate button creates a new access key. You should change the access keys to your storage account periodically to help keep your storage connections more secure. Two access keys are assigned so that you can maintain connections to the storage account using one access key while you regenerate the other.
Note: Regenerating your access keys affects virtual machines, media services, and any applications that are dependent on the storage account.
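For reference, the connection string you will build in the next exercise combines the account name and one of these access keys. A sketch follows; the account name and key values are placeholders.

````C#
using Microsoft.WindowsAzure.Storage;

public static class ConnectionStringSample
{
    // Sketch: parse a connection string built from the account name and primary access key.
    public static CloudStorageAccount GetAccount()
    {
        string connectionString =
            "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<primary-access-key>";
        return CloudStorageAccount.Parse(connectionString);
    }
}
````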
In the next exercise, you will consume Windows Azure Storage services from an MVC application.
### Exercise 3: Understanding the Windows Azure Storage Abstractions ###
The sample application consists of five Views: one for each CRUD operation (Create, Read, Update, Delete) and one to list all the entities from Table Storage. In this exercise, you will update the MVC application actions to perform operations against each storage service (Table, Blob, and Queue) using the Windows Azure SDK 2.0.
#### Task 1 - Configuring Storage Account in the Cloud Project ####

In this task you will configure the storage connection string of the application with the storage account you previously created, using the Access Key from the previous exercise.
- Open Visual Studio Express 2012 for Web as Administrator.

- Browse to the *Source\Ex3-UnderstandingStorageAbstractions\Begin* folder of this lab and open the Begin.sln solution. Make sure the PhotoUploader cloud project is set as the startup project.

- Go to the PhotoUploader_WebRole located in the Roles folder of the PhotoUploader solution. Right-click it and select Properties.

  Web Role Properties

- Go to the Settings tab and locate the StorageConnectionString setting. Click the ellipsis next to the UseDevelopmentStorage=true value.

  Settings Tab

- Select Manually Entered Credentials and set the Account name and Account key values from the previous exercise.

  Create Storage Connection String Dialog Box

- Click OK to update the connection string.

- Repeat the previous steps to configure the StorageConnectionString for the QueueProcessor_WorkerRole.
#### Task 2 - Working with Table Storage ####

In this task you will update the MVC application actions to perform operations against Table Storage. You are going to use Table Storage to save information about the uploaded photo, such as its Title and Description.
-
Open PhotoEntity.cs in the Models folder and add the following directives.
using Microsoft.WindowsAzure.Storage.Table;
-
Update the class to inherit from TableEntity. The TableEntity has a PartitionKey and RowKey property that need to be set when adding a new row to the Table Storage. To do so, add the following Constructor and inherit the class from TableEntity.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-InheritingTableEntity)
public class PhotoEntity : TableEntity { public PhotoEntity() { PartitionKey = "Photo"; RowKey = Guid.NewGuid().ToString(); } ... }
-
Now you will add a new class to implement a TableServiceContext to interact with Table Storage. Right-click the Models folder and select Add | Class.
Adding a new class
-
In the Add New Item window, set the name of the class to PhotoDataServiceContext.cs and click Add.
PhotoDataServiceContext class
-
Add the following directives to the PhotoDataServiceContext class.
using Microsoft.WindowsAzure.Storage.Table; using Microsoft.WindowsAzure.Storage.Table.DataServices;
-
Replace the class content with the following code:
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-PhotoDataServiceContext)
public class PhotoDataServiceContext : TableServiceContext { public PhotoDataServiceContext(CloudTableClient client): base(client) { } public IEnumerable<PhotoEntity> GetPhotos() { return this.CreateQuery<PhotoEntity>("Photos"); } }
Note: You need to make the class inherit from TableServiceContext to interact with Table Storage.
-
Now, you will add an operation to retrieve a single entity from the table. Add the following code to the PhotoDataServiceContext class:
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-DataContextGetById)
public PhotoEntity GetById(string partitionKey, string rowKey) { CloudTable table = this.ServiceClient.GetTableReference("Photos"); TableOperation retrieveOperation = TableOperation.Retrieve<PhotoEntity>(partitionKey, rowKey); TableResult retrievedResult = table.Execute(retrieveOperation); if (retrievedResult.Result != null) return (PhotoEntity)retrievedResult.Result; else return null; }
Note: The previous code uses a TableOperation to retrieve the photo with the specific RowKey. This method returns just one entity, rather than a collection, and the returned value in TableResult.Result is a PhotoEntity.
-
In order to add a new entity, you can use the Insert table operation. Add the following code to implement it:
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-DataContextAddPhoto)
public void AddPhoto(PhotoEntity photo) { TableOperation operation = TableOperation.Insert(photo); CloudTable table = this.ServiceClient.GetTableReference("Photos"); table.Execute(operation); }
Note: To prepare the insert operation, a TableOperation is created to insert the photo entity into the table. The operation is then executed by calling CloudTable.Execute.
-
Update operations are similar to insert, but first we need to retrieve the entity and then use a Replace table operation. Add the following code:
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-DataContextUpdatePhoto)
public void UpdatePhoto(PhotoEntity photo) { CloudTable table = this.ServiceClient.GetTableReference("Photos"); TableOperation retrieveOperation = TableOperation.Retrieve<PhotoEntity>(photo.PartitionKey, photo.RowKey); TableResult retrievedResult = table.Execute(retrieveOperation); PhotoEntity updateEntity = (PhotoEntity)retrievedResult.Result; if (updateEntity != null) { updateEntity.Description = photo.Description; updateEntity.Title = photo.Title; TableOperation replaceOperation = TableOperation.Replace(updateEntity); table.Execute(replaceOperation); } }
-
To delete an entity, we need to first retrieve it from the table and then execute a Delete table operation. Add the following code to implement it:
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-DataContextDeletePhoto)
public void DeletePhoto(PhotoEntity photo) { CloudTable table = this.ServiceClient.GetTableReference("Photos"); TableOperation retrieveOperation = TableOperation.Retrieve<PhotoEntity>(photo.PartitionKey, photo.RowKey); TableResult retrievedResult = table.Execute(retrieveOperation); PhotoEntity deleteEntity = (PhotoEntity)retrievedResult.Result; if (deleteEntity != null) { TableOperation deleteOperation = TableOperation.Delete(deleteEntity); table.Execute(deleteOperation); } }
-
Open HomeController.cs in the Controllers folder. We'll update the controller's actions to execute the table operations from the DataContext you created in the previous steps. Add the following using directives.
using Microsoft.WindowsAzure; using Microsoft.WindowsAzure.Storage; using Microsoft.WindowsAzure.Storage.Table;
-
Add a private field to create a StorageAccount object. This object will be used to perform operations for each storage service.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-StorageAccountVariable)
public class HomeController : Controller { private CloudStorageAccount StorageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString")); ... }
-
In order to display the entities in the View, you will convert them to a ViewModel class. You are going to add two helper methods to convert from ViewModel to a Model and from a Model to a ViewModel. Add the following methods at the end of the class declaration.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-ViewModelHelpers)
private PhotoViewModel ToViewModel(PhotoEntity photo) { return new PhotoViewModel { PartitionKey = photo.PartitionKey, RowKey = photo.RowKey, Title = photo.Title, Description = photo.Description }; } private PhotoEntity FromViewModel(PhotoViewModel photoViewModel) { var photo = new PhotoEntity { Title = photoViewModel.Title, Description = photoViewModel.Description }; photo.PartitionKey = photoViewModel.PartitionKey ?? photo.PartitionKey; photo.RowKey = photoViewModel.RowKey ?? photo.RowKey; return photo; }
-
The Home page will display a list of entities from Table Storage. For it to do this, replace the Index action to retrieve the entire list of entities from the Table Storage using the PhotoDataServiceContext with the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-TableStorageIndex)
public ActionResult Index() { CloudTableClient cloudTableClient = this.StorageAccount.CreateCloudTableClient(); var photoContext = new PhotoDataServiceContext(cloudTableClient); return this.View(photoContext.GetPhotos().Select(x => this.ToViewModel(x)).ToList()); }
-
The Details view will show specific information of a particular photo. Replace the Details action with the following code to display the information of a single entity using the PhotoDataServiceContext.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-TableStorageDetails)
public ActionResult Details(string partitionKey, string rowKey) { CloudTableClient cloudTableClient = this.StorageAccount.CreateCloudTableClient(); var photoContext = new PhotoDataServiceContext(cloudTableClient); PhotoEntity photo = photoContext.GetById(partitionKey, rowKey); if (photo == null) { return HttpNotFound(); } var viewModel = this.ToViewModel(photo); return this.View(viewModel); }
-
Replace the Create POST action with the following code to insert a new entity in the Table Storage.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-TableStorageCreate)
[HttpPost] public ActionResult Create(PhotoViewModel photoViewModel, HttpPostedFileBase file, FormCollection collection) { if (this.ModelState.IsValid) { var photo = this.FromViewModel(photoViewModel); photo.PartitionKey = this.User.Identity.IsAuthenticated ? this.User.Identity.Name : "Public"; // Save information to Table Storage CloudTableClient cloudTableClient = this.StorageAccount.CreateCloudTableClient(); var photoContext = new PhotoDataServiceContext(cloudTableClient); photoContext.AddPhoto(photo); return this.RedirectToAction("Index"); } return this.View(); }
-
Replace the Edit GET Action with the following code to retrieve existing entity information from Table Storage.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-TableStorageEdit)
public ActionResult Edit(string partitionKey, string rowKey) { CloudTableClient cloudTableClient = this.StorageAccount.CreateCloudTableClient(); var photoContext = new PhotoDataServiceContext(cloudTableClient); PhotoEntity photo = photoContext.GetById(partitionKey, rowKey); if (photo == null) { return this.HttpNotFound(); } var viewModel = this.ToViewModel(photo); return this.View(viewModel); }
-
Replace the Edit POST action with the following code to update an existing entity in the table storage.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-TableStoragePostEdit)
[HttpPost] [ValidateAntiForgeryToken] public ActionResult Edit(PhotoViewModel photoViewModel, FormCollection collection) { if (ModelState.IsValid) { var photo = this.FromViewModel(photoViewModel); //Update information in Table Storage CloudTableClient cloudTableClient = this.StorageAccount.CreateCloudTableClient(); var photoContext = new PhotoDataServiceContext(cloudTableClient); photoContext.UpdatePhoto(photo); return this.RedirectToAction("Index"); } return this.View(); }
-
Replace the Delete GET Action with the following code to retrieve existing entity data from Table Storage.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-TableStorageDelete)
public ActionResult Delete(string partitionKey, string rowKey) { CloudTableClient cloudTableClient = this.StorageAccount.CreateCloudTableClient(); var photoContext = new PhotoDataServiceContext(cloudTableClient); PhotoEntity photo = photoContext.GetById(partitionKey, rowKey); if (photo == null) { return this.HttpNotFound(); } var viewModel = this.ToViewModel(photo); return this.View(viewModel); }
-
Replace the DeleteConfirmed action with the following code to delete an existing entity from the table.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-TableStoragePostDelete)
[HttpPost, ActionName("Delete")] [ValidateAntiForgeryToken] public ActionResult DeleteConfirmed(string partitionKey, string rowKey) { CloudTableClient cloudTableClient = this.StorageAccount.CreateCloudTableClient(); var photoContext = new PhotoDataServiceContext(cloudTableClient); PhotoEntity photo = photoContext.GetById(partitionKey, rowKey); photoContext.DeletePhoto(photo); return this.RedirectToAction("Index"); }
- In order to be able to work with Table Storage, we first need to have the table created. Data tables should only be created once; typically, you would do this during a provisioning step and rarely in application code. The Application_Start method in the Global.asax class is a recommended place for this initialization logic. To complete the step, open Global.asax.cs and add the following using directives.
using Microsoft.WindowsAzure; using Microsoft.WindowsAzure.Storage; using Microsoft.WindowsAzure.Storage.Table;
-
Add the following code at the end of the Application_Start method.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-TableStorageAppStart)
protected void Application_Start() { AreaRegistration.RegisterAllAreas(); WebApiConfig.Register(GlobalConfiguration.Configuration); FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters); RouteConfig.RegisterRoutes(RouteTable.Routes); BundleConfig.RegisterBundles(BundleTable.Bundles); CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString")); CloudTableClient cloudTableClient = storageAccount.CreateCloudTableClient(); CloudTable table = cloudTableClient.GetTableReference("Photos"); table.CreateIfNotExists(); }
-
Press F5 and run the application.
Index Home Page
-
Create a new entity. To do so, click the Create link.
- Complete the Title and Description fields and submit the form.
Create Image Form
Note: You can ignore the Upload file input in this exercise.
-
Close the browser to stop the application.
#### Task 3 - Working with Blob Storage ####

In this task you will configure the MVC application to upload images to Blob Storage.
-
Open HomeController.cs and add the following directives to work with Blobs.
using Microsoft.WindowsAzure.Storage.Blob;
-
Add the following helper method at the end of the class that allows you to retrieve the blob container from the storage account that will be used to store the images.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-BlobHelper)
private CloudBlobContainer GetBlobContainer() { var client = this.StorageAccount.CreateCloudBlobClient(); var container = client.GetContainerReference(CloudConfigurationManager.GetSetting("ContainerName")); if (container.CreateIfNotExists()) { container.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob }); } return container; }
-
Now, you will update the Create action of the HomeController to upload an image to a blob. You will save the blob reference name in the table to reference it in the future. To do this, add the following code in the Create POST action method.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-BlobCreate)
[HttpPost] public ActionResult Create(PhotoViewModel photoViewModel, HttpPostedFileBase file, FormCollection collection) { if (this.ModelState.IsValid) { photoViewModel.PartitionKey = this.User.Identity.IsAuthenticated ? this.User.Identity.Name : "Public"; var photo = this.FromViewModel(photoViewModel); if (file != null) { // Save file stream to Blob Storage var blob = this.GetBlobContainer().GetBlockBlobReference(file.FileName); blob.Properties.ContentType = file.ContentType; blob.UploadFromStream(file.InputStream); photo.BlobReference = file.FileName; } else { this.ModelState.AddModelError("File",new ArgumentNullException("file")); return this.View(photoViewModel); } //Save information to Table Storage ... } return this.View(); }
-
In the Details action, you will need to display the image that was stored in the blob container. To do this, you need to retrieve the URL using the Blob Reference name that was saved when creating a new entity. Add the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-BlobDetails)
public ActionResult Details(string partitionKey, string rowKey) { ... var viewModel = this.ToViewModel(photo); if (!string.IsNullOrEmpty(photo.BlobReference)) { viewModel.Uri = this.GetBlobContainer().GetBlockBlobReference(photo.BlobReference).Uri.ToString(); } return this.View(viewModel); }
-
Add the same line of code for the Edit GET Action to get the image when editing.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-BlobEdit)
public ActionResult Edit(string partitionKey, string rowKey) { ... var viewModel = this.ToViewModel(photo); if (!string.IsNullOrEmpty(photo.BlobReference)) { viewModel.Uri = this.GetBlobContainer().GetBlockBlobReference(photo.BlobReference).Uri.ToString(); } return this.View(viewModel); }
-
Add the same code line for the Delete GET Action to get the image when deleting.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-BlobDelete)
public ActionResult Delete(string partitionKey, string rowKey) { ... var viewModel = this.ToViewModel(photo); if (!string.IsNullOrEmpty(photo.BlobReference)) { viewModel.Uri = this.GetBlobContainer().GetBlockBlobReference(photo.BlobReference).Uri.ToString(); } return this.View(viewModel); }
-
To delete the blob from the container, you will use the blob reference name to retrieve the container and perform a delete operation. To do this, add the following code to the DeleteConfirmed action.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-BlobPostDelete)
[HttpPost, ActionName("Delete")] [ValidateAntiForgeryToken] public ActionResult DeleteConfirmed(string partitionKey, string rowKey) { //Delete information From Table Storage ... //Deletes the Image from Blob Storage if (!string.IsNullOrEmpty(photo.BlobReference)) { var blob = this.GetBlobContainer().GetBlockBlobReference(photo.BlobReference); blob.DeleteIfExists(); } return this.RedirectToAction("Index"); }
-
Press F5 to run the application.
-
Browse for an image, insert a title and a description for it and then click Create to perform the upload.
Upload image
Note: You can use one of the images that are included in this lab in the Assets folder.
-
Go to the Details page to check that the image uploaded successfully and then close the browser.
#### Task 4 - Working with Queue Storage ####

In this task, you will use queues to simulate a notification service, where a message is sent to a worker role for processing.
-
Open HomeController.cs and add the following directive.
using Microsoft.WindowsAzure.Storage.Queue;
-
You will add the following helper method at the end of the class to retrieve the Cloud Queue object.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-QueueHelper)
private CloudQueue GetCloudQueue() { var queueClient = this.StorageAccount.CreateCloudQueueClient(); var queue = queueClient.GetQueueReference("messagequeue"); queue.CreateIfNotExists(); return queue; }
- To notify that a new photo has been uploaded, you must insert a message into the Queue with the specific text to be displayed. Add the following highlighted code in the Create POST action method.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-QueueSendMessageCreate)
[HttpPost] public ActionResult Create(PhotoViewModel photoViewModel, HttpPostedFileBase file, FormCollection collection) { if (this.ModelState.IsValid) { ... photoContext.AddPhoto(photo); //Send create notification try { var msg = new CloudQueueMessage("Photo Uploaded"); this.GetCloudQueue().AddMessage(msg); } catch (Exception e) { System.Diagnostics.Trace.TraceInformation("Error", "Couldn't send notification"); } return this.RedirectToAction("Index"); } return this.View(); }
-
To notify that a photo was deleted, you must insert a message to the Queue with the specific text to be displayed. Add the following highlighted code to the DeleteConfirmed action method.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-QueueSendMessageDelete)
[HttpPost, ActionName("Delete")] [ValidateAntiForgeryToken] public ActionResult DeleteConfirmed(string id) { ... try { //Send delete notification var msg = new CloudQueueMessage("Photo Deleted"); this.GetCloudQueue().AddMessage(msg); } catch (Exception e) { System.Diagnostics.Trace.TraceInformation("Error", "Couldn't send notification"); } return this.RedirectToAction("Index"); }
-
Open the WorkerRole.cs file located in the QueueProcessor_WorkerRole project.
-
The worker role will read the Queue for notification messages. First, you need to get a queue reference. To do this, add the following highlighted code in the Run method.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-QueueWorkerAccount)
public override void Run() { // This is a sample worker implementation. Replace with your logic. Trace.TraceInformation("QueueProcessor_WorkerRole entry point called", "Information"); // Initialize the account information var storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString")); // retrieve a reference to the messages queue var queueClient = storageAccount.CreateCloudQueueClient(); var queue = queueClient.GetQueueReference("messagequeue"); while (true) { Thread.Sleep(10000); Trace.TraceInformation("Working", "Information"); } }
-
Now, add the following code inside the while block to read messages from the queue.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex3-QueueReadingMessages)
public override void Run() { ... while (true) { Thread.Sleep(10000); Trace.TraceInformation("Working", "Information"); if (queue.Exists()) { var msg = queue.GetMessage(); if (msg != null) { Trace.TraceInformation(string.Format("Message '{0}' processed.", msg.AsString)); queue.DeleteMessage(msg); } } } }
Note: The worker process will try to get a message from the queue every 10 seconds using the GetMessage method. If there are messages in the queue, they will be shown in the Compute Emulator log.
-
Press F5 to run the application. Once the browser has opened, upload a new image.
- Open the Compute Emulator. To do so, right-click the Windows Azure tray icon and select Show Compute Emulator UI.
Windows Azure Tray Icon
-
Select the worker role instance. Wait until the process reads the message from the queue.
Worker role processing the queue
#### Task 5 - Inspecting the Storage Account with Visual Studio ####

In this task, you will use Visual Studio to inspect the Windows Azure Storage Account.
-
If not already opened, open Visual Studio Express 2012 for Web.
-
Go to the View menu, and open Database Explorer.
-
In the Database Explorer pane, right-click Windows Azure Storage and select Add New Storage Account.
Database Explorer
-
Select Manually entered credentials and complete the Account name and Account key fields with the keys of the storage account you created in Exercise 1. Click OK.
Add New Storage Account
-
Expand the storage account you configured in the Database Explorer. Notice that there is an entry for Tables, Blobs and Queues.
-
Expand the Tables container. You will see the Photos table under it.
Photos Table in Database Explorer
-
Right-click the Photos table and select View Table.
Photos Table
Note: You can see the data you created in the previous task. Notice the blob reference column; it holds the name of the blob stored in Blob Storage.
- Expand the Blobs container. Right-click the gallery blob container and select View Blob Container.

  Gallery Blob Container
### Exercise 4: Introducing SAS (Shared Access Signature) ###

Shared Access Signatures allow granular access to tables, queues, blob containers, and blobs. A SAS token can be configured to provide specific access rights, such as read, write, update, or delete, to a specific table, key range within a table, queue, blob, or blob container, for a specified time period or without any limit. The SAS token appears as part of the resource's URI as a series of query parameters.
In this exercise you will learn how to use Shared Access Signatures with the three storage abstractions: Tables, Blobs, and Queues.
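As a preview of how a SAS token is consumed, a client that holds only a SAS, and no account key, can still create a storage client. The endpoint and token below are placeholders; the lab builds the real values in the tasks that follow.

````C#
using System;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Table;

public static class SasClientSample
{
    // Sketch: build a table client from a SAS token instead of an account key.
    public static CloudTable GetPhotosTable(string sasToken)
    {
        var credentials = new StorageCredentials(sasToken);
        var tableClient = new CloudTableClient(
            new Uri("https://<accountname>.table.core.windows.net"), // placeholder endpoint
            credentials);
        return tableClient.GetTableReference("Photos");
    }
}
````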
#### Task 1 - Adding SAS at table level ####

Note: For simplicity's sake, this sample application does not follow all best practices for using Shared Access Signatures. In a production environment you will typically have a service that generates the SAS for your application.
In this task you will learn how to create a SAS for Azure tables. A table SAS allows the account owner to grant access while restricting the allowed operations in several ways.
You can grant access to an entire table, to a table range (for example, to all the rows under a particular partition key), or some specific rows. Additionally, you can grant access rights to the specified table or table range such as Query, Add, Update, Delete or a combination of them. Finally, you can specify the SAS token access time.
-
Continue working with the end solution of the previous exercise or open the solution located at Source/Ex04-IntroducingSAS/Begin.
-
Open the PhotoEntity class, located in the Models folder.
-
Modify the default constructor to use "Public" as the partition key by default, and add an overloaded constructor that receives a partition key as a parameter.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-UpdatePhotoEntityConstructors)
public class PhotoEntity : TableEntity { public PhotoEntity() { PartitionKey = "Public"; RowKey = Guid.NewGuid().ToString(); } public PhotoEntity(string partitionKey) { PartitionKey = partitionKey; RowKey = Guid.NewGuid().ToString(); } ... }
-
Open the PhotoDataServiceContext.cs file and create a new method called GetSas.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-NewImplementationGetSasMethod)
public string GetSas(string partition, SharedAccessTablePermissions permissions) { SharedAccessTablePolicy policy = new SharedAccessTablePolicy() { SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15), Permissions = permissions }; string sasToken = this.ServiceClient.GetTableReference("Photos").GetSharedAccessSignature( policy /* access policy */, null /* access policy identifier */, partition /* start partition key */, null /* start row key */, partition /* end partition key */, null /* end row key */); return sasToken; }
Note: This method takes the partition and the permissions passed as parameters and creates a SAS for the Photos table. This SAS will grant the specified permissions only to the rows that correspond to that partition. Finally, it returns the SAS in string format.
- Go to the Controllers folder and create a new base controller. To do so, right-click the Controllers folder, point to Add, and select Controller....
-
Name the file BaseController and click Add.
BaseController creation
- Remove the Index action method created by default.
-
Add the following using statements to the BaseController class.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-BaseControllerUsingStatements)
using Microsoft.WindowsAzure; using Microsoft.WindowsAzure.Storage; using Microsoft.WindowsAzure.Storage.Table; using PhotoUploader_WebRole.Models;
-
Add the following public properties to the BaseController.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-BaseControllerProperties)
public CloudStorageAccount StorageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString")); public Uri UriTable = new Uri("http://127.0.0.1:10002/devstoreaccount1"); public string AuthenticatedTableSas { get; set; } public string PublicTableSas { get; set; }
Note: Replace http://127.0.0.1:10002/devstoreaccount1 with your storage account table endpoint (for example, https://&lt;accountname&gt;.table.core.windows.net) in order to work against Windows Azure.
-
Override the OnActionExecuting method in the BaseController class.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-BaseControllerOnActionExecutingMethod)
protected override void OnActionExecuting(ActionExecutingContext filterContext)
{ CloudTableClient cloudTableClientAdmin = this.StorageAccount.CreateCloudTableClient(); var photoContextAdmin = new PhotoDataServiceContext(cloudTableClientAdmin);
if (this.User.Identity.IsAuthenticated)
{
this.AuthenticatedTableSas = photoContextAdmin.GetSas(this.User.Identity.Name, SharedAccessTablePermissions.Add | SharedAccessTablePermissions.Delete | SharedAccessTablePermissions.Query | SharedAccessTablePermissions.Update);
this.PublicTableSas = photoContextAdmin.GetSas("Public", SharedAccessTablePermissions.Add | SharedAccessTablePermissions.Delete | SharedAccessTablePermissions.Query | SharedAccessTablePermissions.Update);
}
else
{
this.PublicTableSas = photoContextAdmin.GetSas("Public", SharedAccessTablePermissions.Add | SharedAccessTablePermissions.Update | SharedAccessTablePermissions.Query);
this.AuthenticatedTableSas = null;
}
}
Note: The OnActionExecuting method is called every time an action from the derived controller is called. Therefore, this is the place where we will generate the SAS for the table.
-
Open the HomeController class, located in the Controllers folder.
- Update the HomeController class to inherit from BaseController, and remove the StorageAccount field.
public class HomeController : BaseController { // // GET: / public ActionResult Index() { ... } ... }
- Add the following using directives to the HomeController.
using Microsoft.WindowsAzure.Storage.Auth; using System.Collections.Generic;
-
Replace the Index action with the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-HomeControllerIndexAction)
public ActionResult Index() { CloudTableClient cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(this.PublicTableSas)); var photoContext = new PhotoDataServiceContext(cloudTableClient); var photoList = new List<PhotoViewModel>(); var photos = photoContext.GetPhotos(); if (photos.Count() > 0) { photoList = photos.Select(x => this.ToViewModel(x)).ToList(); } var privatePhotos = new List<PhotoViewModel>(); if (this.User.Identity.IsAuthenticated) { cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(this.AuthenticatedTableSas)); photoContext = new PhotoDataServiceContext(cloudTableClient); photos = photoContext.GetPhotos(); if (photos.Count() > 0) { photoList.AddRange(photos.Select(x => this.ToViewModel(x)).ToList()); } } return this.View(photoList); }
-
Scroll down to the Details action and update the CloudTableClient creation method with the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-NewCloudTableClientCall)
var token = partitionKey == "Public" ? this.PublicTableSas : this.AuthenticatedTableSas; CloudTableClient cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(token));
- Repeat the previous step in the Edit and Delete GET actions and in the Delete POST action.
- Locate the Create POST action and add a new bool parameter called Public.
[HttpPost] public ActionResult Create(PhotoViewModel photoViewModel, HttpPostedFileBase file, bool Public, FormCollection collection) { ... }
- In the Create POST action, update the photoViewModel PartitionKey assignment with the following line.
photoViewModel.PartitionKey = Public ? "Public" : this.User.Identity.Name;
-
Locate the CloudTableClient creation in the Create Post action and replace it with the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-CreatePostActionUpdate)
var token = Public ? this.PublicTableSas : this.AuthenticatedTableSas; if (!this.User.Identity.IsAuthenticated) { token = this.PublicTableSas; photo.PartitionKey = "Public"; } CloudTableClient cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(token));
-
Scroll down to the Edit Post action and update the CloudTableClient creation with the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-NewCloudTableClientCall-EditPost)
var token = photoViewModel.PartitionKey == "Public" ? this.PublicTableSas : this.AuthenticatedTableSas; CloudTableClient cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(token));
-
Create a new action called ToPublic and add the following code in its body. This method will delete a private blob (one created with a username as the partition key) and it will re-create it with "Public" as the partition key.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-ToPublicAction)
[HttpGet] public ActionResult ToPublic(string partitionKey, string rowKey) { CloudTableClient cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(this.AuthenticatedTableSas)); var photoContext = new PhotoDataServiceContext(cloudTableClient); var photo = photoContext.GetById(partitionKey, rowKey); if (photo == null) { return this.HttpNotFound(); } photoContext.DeletePhoto(photo); cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(this.PublicTableSas)); photoContext = new PhotoDataServiceContext(cloudTableClient); photo.PartitionKey = "Public"; photoContext.AddPhoto(photo); return RedirectToAction("Index"); }
- In the same way, create a new action called ToPrivate, and add the following code in the method's body. In contrast to the ToPublic method, this one removes the photo's row from the Public partition and re-adds it under the logged-in user's partition. Therefore, this method requires a logged-in user to work.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-ToPrivateAction)
[HttpGet] public ActionResult ToPrivate(string partitionKey, string rowKey) { CloudTableClient cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(this.PublicTableSas)); var photoContext = new PhotoDataServiceContext(cloudTableClient); var photo = photoContext.GetById(partitionKey, rowKey); if (photo == null) { return this.HttpNotFound(); } photoContext.DeletePhoto(photo); cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(this.AuthenticatedTableSas)); photoContext = new PhotoDataServiceContext(cloudTableClient); photo.PartitionKey = this.User.Identity.Name; photoContext.AddPhoto(photo); return RedirectToAction("Index"); }
-
Open the Index.cshtml file, located in the Views/Home folder.
- Locate the foreach statement, and add the following code at the end of the tr element. This code adds the To Public and To Private links next to each photo, as appropriate.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-IndexViewUpdate)
@foreach (var photo in this.Model) { <tr> <td>@photo.PartitionKey</td> <td>@photo.Title</td> <td>@photo.Description</td> <td>@Html.ActionLink("Edit", "Edit", new { partitionKey = photo.PartitionKey, rowKey = photo.RowKey })</td> <td>@Html.ActionLink("Details", "Details", new { partitionKey = photo.PartitionKey, rowKey = photo.RowKey })</td> <td>@Html.ActionLink("Delete", "Delete", new { partitionKey = photo.PartitionKey, rowKey = photo.RowKey })</td> @if (photo.PartitionKey == "Public") { if (this.User.Identity.IsAuthenticated) { <td>@Html.ActionLink("To Private", "ToPrivate", new { partitionKey = photo.PartitionKey, rowKey = photo.RowKey })</td> } } else { <td>@Html.ActionLink("To Public", "ToPublic", new { partitionKey = photo.PartitionKey, rowKey = photo.RowKey })</td> } </tr> }
- Open the Create.cshtml file, located in the Views/Home folder.
- Add the following if statement before the input element of type file. This code adds a checkbox to upload the new photo as public when the user is authenticated. If the user is not authenticated, the photo will be uploaded as public by default.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-CreateViewUpdate)
<div> <label>Image</label> <input type="file" name="file" id="file" /> </div> if (this.User.Identity.IsAuthenticated){ <div> @Html.CheckBox("Public", false) Public </div> } else { <div> @Html.CheckBox("Public", true, new { disabled = "disabled" }) Public </div> } <input type="submit" value="Create" />
-
Run the solution by pressing F5.
-
If you previously uploaded a photo, you will be able to see it listed; otherwise upload a new photo by clicking the Create link. The listed photos are public and can be seen by all application users, even if they are not authenticated.
Listing all the public photos
-
Log in to the application if you have already created a user, or else register a new one. After registration you will be automatically logged in.
-
Upload a new photo after being logged in. Notice that you are able to see the public photos and the private ones.
Listing public and private photos
-
Click the Log off button to log off. The page will be refreshed and you will not be able to see the photos uploaded as private anymore.
Private photos are not available when not authenticated
This is because when you log in, a SAS is created that allows you to read and write to that user's partition key. When you are not logged in, you have a SAS that only grants permission over the "Public" partition key, allowing you to read and write rows in that partition.
#### Task 2 - Adding SAS at blob level ####

In this task you will learn how to create SAS for Azure Blobs. SAS can be created for blobs and for blob containers. SAS tokens can be used on blobs to read, update, and delete the specified blob. For blob containers, SAS tokens can be used to list the content of the container, and to create, read, update, and delete blobs in it.
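The steps below create a SAS for an individual blob. A container-level SAS, also mentioned above, could be generated roughly like this; the permissions and expiry are illustrative, and this sketch is not one of the lab's code snippets.

````C#
using System;
using Microsoft.WindowsAzure.Storage.Blob;

public static class ContainerSasSample
{
    // Sketch: a SAS that grants read and list access over an entire container for 15 minutes.
    public static string GetSasForContainer(CloudBlobContainer container)
    {
        var policy = new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.List,
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
        };

        // The token applies to every blob in the container.
        return container.GetSharedAccessSignature(policy);
    }
}
````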
- Open the PhotoDataServiceContext.cs file and add the following using directives.
using Microsoft.WindowsAzure.Storage.Blob; using System.Globalization;
-
Create a new method called GetSasForBlob. Paste the following code in the method's body.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-GetSasForBlobMethod)
public string GetSasForBlob(CloudBlockBlob blob, SharedAccessBlobPermissions permission) { var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy() { Permissions = permission, SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5), SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(2), }); return string.Format(CultureInfo.InvariantCulture, "{0}{1}", blob.Uri, sas); }
Note: This method takes a block blob reference and creates a Blob SAS for it, with the permissions passed as parameters. Finally, it returns the SAS in string format.
- Open the Index.cshtml file located in the Views\Home folder, and add the following code, which adds a Share link for private photos, at the end of the else statement.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-IndexViewUpdateWithShareLink)
@if (photo.PartitionKey == "Public") { if (this.User.Identity.IsAuthenticated) { <td>@Html.ActionLink("To Private", "ToPrivate", new { partitionKey = photo.PartitionKey, rowKey = photo.RowKey })</td> } } else { <td>@Html.ActionLink("To Public", "ToPublic", new { partitionKey = photo.PartitionKey, rowKey = photo.RowKey })</td> <td>@Html.ActionLink("Share", "Share", new { partitionKey = photo.PartitionKey, rowKey = photo.RowKey })</td> }
Notice that the ActionLink is calling the Share action passing the partition and row keys as parameters.
-
Open the HomeController.cs file, located in the Controllers folder.
-
Create a new Share action in the HomeController class.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-ShareAction)
[HttpGet] public ActionResult Share(string partitionKey, string rowKey) { var cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(this.AuthenticatedTableSas)); var photoContext = new PhotoDataServiceContext(cloudTableClient); PhotoEntity photo = photoContext.GetById(partitionKey, rowKey); if (photo == null) { return this.HttpNotFound(); } string sas = string.Empty; if (!string.IsNullOrEmpty(photo.BlobReference)) { CloudBlockBlob blobBlockReference = this.GetBlobContainer().GetBlockBlobReference(photo.BlobReference); sas = photoContext.GetSasForBlob(blobBlockReference, SharedAccessBlobPermissions.Read); } if (!string.IsNullOrEmpty(sas)) { return View("Share", null, sas); } return RedirectToAction("Index"); }
The preceding code gets the blob reference by using the partition and row keys, and calls the GetSasForBlob method passing the reference and the permissions as parameters. In this case, the SAS is created with Read permissions.
- You will now add the corresponding view for the previously created action. To do so, right-click the Home folder under Views, point to Add, and select Existing Item....
-
Browse to the Assets/Ex4-IntroducingSAS folder, select the Share.cshtml view and click Add.
-
Run the solution by pressing F5.
- Log in to the application. If you do not have a user, register to create one.
- If you previously uploaded images with the account you used to log in, you can use them; otherwise, upload an image using the logged-in account.
-
Click the Share link, next to one of the uploaded photos. You will navigate to the Share page.
Generating a link to share a blob
-
Copy the provided link, and open it in your browser. You will be able to see the image from your browser.
Opening a shared blob
- Wait two minutes (the time it takes for this SAS token to expire) and press Ctrl+F5. Since the token is no longer valid, you will not be able to see the image and an error will be displayed.
Opening an expired share link
#### Task 3 - Adding SAS at queue level ####

In this task you will use SAS at the queue level to restrict access to the storage queues. SAS can enable Read, Add, Process, and Update permissions on the queue.
-
In the QueueProcessor_WorkerRole project, open the WorkerRole class.
-
At the top of the class add the following using statements.
using Microsoft.WindowsAzure.Storage.Queue; using Microsoft.WindowsAzure.Storage.Auth;
- Add the following class variables at the start of the class. They hold a reference to the queue URI and the expiration time of the queue SAS token. Keep in mind that you can replace the local storage URI with your Azure Queue storage URL.
private DateTime serviceQueueSasExpiryTime; private Uri uri = new Uri("http://127.0.0.1:10001/devstoreaccount1");
- Create a new private method called GetQueueSas, and add the following code in its body.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-GetQueueSasUpdate)
private string GetQueueSas() { var storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString")); var client = storageAccount.CreateCloudQueueClient(); var queue = client.GetQueueReference("messagequeue"); queue.CreateIfNotExists(); var token = queue.GetSharedAccessSignature( new SharedAccessQueuePolicy() { Permissions = SharedAccessQueuePermissions.ProcessMessages | SharedAccessQueuePermissions.Read | SharedAccessQueuePermissions.Add | SharedAccessQueuePermissions.Update, SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15) }, null); this.serviceQueueSasExpiryTime = DateTime.UtcNow.AddMinutes(15); return token; }
This method gets a reference to the application's queue and generates a SAS token that has permissions to process, read, add, and update messages.
-
Browse to the Run method and replace its body with the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-RunMethodUpdate)
public override void Run() { Trace.TraceInformation("QueueProcessor_WorkerRole entry point called", "Information"); var queueClient = new CloudQueueClient(this.uri, new StorageCredentials(this.GetQueueSas())); var queue = queueClient.GetQueueReference("messagequeue"); while (true) { Thread.Sleep(10000); Trace.TraceInformation("Working", "Information"); if (DateTime.UtcNow.AddMinutes(1) >= this.serviceQueueSasExpiryTime) { queueClient = new CloudQueueClient(this.uri, new StorageCredentials(this.GetQueueSas())); queue = queueClient.GetQueueReference("messagequeue"); } var msg = queue.GetMessage(); if (msg != null) { Trace.TraceInformation(string.Format("Message '{0}' processed.", msg.AsString)); queue.DeleteMessage(msg); } } }
-
Open the BaseController class located in the Controllers folder of the PhotoUploader_WebRole project and add the following directives.
using Microsoft.WindowsAzure.Storage.Queue;
-
Add the following public properties to the BaseController class.
public Uri UriQueue = new Uri("http://127.0.0.1:10001/devstoreaccount1"); public string QueueSas { get; set; }
Note: In order to work against Windows Azure, replace the queue URI with your storage account's queue endpoint (for example, https://&lt;accountname&gt;.queue.core.windows.net).
-
Replace the if structure in the OnActionExecuting method with the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-QueueSharedAccessSignature)
protected override void OnActionExecuting(ActionExecutingContext filterContext) { CloudTableClient cloudTableClientAdmin = this.StorageAccount.CreateCloudTableClient(); var photoContextAdmin = new PhotoDataServiceContext(cloudTableClientAdmin); if (this.User.Identity.IsAuthenticated) { this.AuthenticatedTableSas = photoContextAdmin.GetSas(this.User.Identity.Name, SharedAccessTablePermissions.Add | SharedAccessTablePermissions.Delete | SharedAccessTablePermissions.Query | SharedAccessTablePermissions.Update); this.PublicTableSas = photoContextAdmin.GetSas("Public", SharedAccessTablePermissions.Add | SharedAccessTablePermissions.Delete | SharedAccessTablePermissions.Query | SharedAccessTablePermissions.Update); this.QueueSas = this.StorageAccount.CreateCloudQueueClient().GetQueueReference("messagequeue").GetSharedAccessSignature( new SharedAccessQueuePolicy() { Permissions = SharedAccessQueuePermissions.Add | SharedAccessQueuePermissions.Read, SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15) }, null ); } else { this.PublicTableSas = photoContextAdmin.GetSas("Public", SharedAccessTablePermissions.Add | SharedAccessTablePermissions.Update | SharedAccessTablePermissions.Query); this.AuthenticatedTableSas = null; this.QueueSas = null; } }
-
Open the HomeController, locate the GetCloudQueue method and replace the code with the following snippet.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex4-GetCloudQueueMethod)
private CloudQueue GetCloudQueue() { var queueClient = new CloudQueueClient(this.UriQueue, new StorageCredentials(this.QueueSas)); var queue = queueClient.GetQueueReference("messagequeue"); queue.CreateIfNotExists(); return queue; }
This code creates an instance of the CloudQueueClient class for the specified queue, using the created SAS, and then returns that instance.
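As a rough usage sketch (not part of the exercise code), the controller can then enqueue a notification through that client, which only succeeds while the SAS grants the Add permission (the message content below is hypothetical):
```
using Microsoft.WindowsAzure.Storage.Queue;

// Sketch: enqueue a message through the SAS-authenticated queue client.
CloudQueue queue = this.GetCloudQueue();
queue.AddMessage(new CloudQueueMessage("Photo Uploaded,sample.jpg"));
```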
-
Press F5 to run the application. Once the browser is opened, upload a new image.
-
Open the Compute Emulator. To do so, right-click the Windows Azure icon in the system tray and select Show Compute Emulator UI.
Windows Azure Tray Icon
-
Select the worker role instance and wait for the process to poll the queue. You should not see any messages, because as an anonymous user you do not have permission to add messages to the queue.
The queue receives no messages due to insufficient permissions
-
Log in to the application and upload a new photo. Wait until the process reads the message from the queue and shows the "Photo uploaded" message. As a logged-in user, you have a SAS with permission to add messages to the queue.
As a logged-in user, messages are added to the queue
Note: The Create method always tries to add the message to the queue. However, when the user is not authenticated, the SAS issued to that user does not have sufficient permissions to add messages to the message queue.
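If you prefer to surface this failure rather than silently ignore it, one option (a sketch, not part of the lab solution) is to catch the StorageException thrown by the queue client and inspect the HTTP status code returned by the service:
```
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

try
{
    // AddMessage fails with 403 (Forbidden) when the SAS does not include the Add permission.
    this.GetCloudQueue().AddMessage(new CloudQueueMessage("Photo Uploaded,sample.jpg"));
}
catch (StorageException ex)
{
    if (ex.RequestInformation != null && ex.RequestInformation.HttpStatusCode == 403)
    {
        System.Diagnostics.Trace.TraceInformation("The current SAS does not allow adding messages to the queue.");
    }
}
```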
A stored access policy provides an additional level of control over Shared Access Signatures on the server side. Establishing a stored access policy serves to group Shared Access Signatures and to provide additional restrictions for signatures that are bound by the policy. You can use a stored access policy to change the start time, expiry time, or permissions for a signature, or to revoke it after it has been issued.
A stored access policy gives you greater control over Shared Access Signatures you have released. Instead of specifying the signature's lifetime and permissions on the URL, you can specify these parameters within the stored access policy stored on the blob, container, queue, or table that is being shared. To change these parameters for one or more signatures, you can modify the stored access policy, rather than reissuing the signatures. You can also quickly revoke the signature by modifying the stored access policy.
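As a minimal sketch of the revocation scenario (assuming a stored access policy named "process" has already been registered on the messagequeue queue, as is done later in this exercise), removing the policy on the server side invalidates every SAS that references it:
```
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

var storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
var queue = storageAccount.CreateCloudQueueClient().GetQueueReference("messagequeue");

// Sketch: removing the "process" policy revokes all SAS tokens issued against it.
QueuePermissions permissions = queue.GetPermissions();
if (permissions.SharedAccessPolicies.Remove("process"))
{
    queue.SetPermissions(permissions);
}
```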
#### Task 1 - Updating table security to use a stored access policy ####In this task you will update table security to use a stored access policy.
-
Open the Begin solution as administrator from \Source\Ex5-UpdatingSecurityStoredAccessSignature.
Note: If you have completed exercise 4, you can continue working with that solution.
-
Update Global.asax.cs to set the stored access policies for table storage.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-TableStorageStoredAccessPoliciesTables)
protected void Application_Start()
{
    ...

    CloudTable table = cloudTableClient.GetTableReference("Photos");
    table.CreateIfNotExists();

    TablePermissions tp = new TablePermissions();
    tp.SharedAccessPolicies.Add("readonly", new SharedAccessTablePolicy { Permissions = SharedAccessTablePermissions.Query, SharedAccessExpiryTime = System.DateTime.UtcNow.AddMinutes(15) });
    tp.SharedAccessPolicies.Add("edit", new SharedAccessTablePolicy { Permissions = SharedAccessTablePermissions.Query | SharedAccessTablePermissions.Add | SharedAccessTablePermissions.Update, SharedAccessExpiryTime = System.DateTime.UtcNow.AddMinutes(15) });
    tp.SharedAccessPolicies.Add("admin", new SharedAccessTablePolicy { Permissions = SharedAccessTablePermissions.Query | SharedAccessTablePermissions.Add | SharedAccessTablePermissions.Update | SharedAccessTablePermissions.Delete, SharedAccessExpiryTime = System.DateTime.UtcNow.AddMinutes(15) });
    tp.SharedAccessPolicies.Add("none", new SharedAccessTablePolicy { Permissions = SharedAccessTablePermissions.None, SharedAccessExpiryTime = System.DateTime.UtcNow.AddMinutes(15) });
    table.SetPermissions(tp);
}
-
Open the PhotoDataServiceContext.cs class and replace the GetSas method with the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-GetSasImplementation)
public class PhotoDataServiceContext : TableServiceContext { ... public string GetSas(string partition, string policyName) { string sasToken = this.ServiceClient.GetTableReference("Photos").GetSharedAccessSignature( new SharedAccessTablePolicy() /* access policy */, policyName /* access policy identifier */, partition /* start partition key */, null /* start row key */, partition /* end partition key */, null /* end row key */); return sasToken; } }
-
Open the BaseController.cs class and update the OnActionExecuting method with the new GetSas method implementation. To do so, replace the if structure code with the following snippet.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-GetTableSasOnActionExecuting)
protected override void OnActionExecuting(ActionExecutingContext filterContext) { ... if (this.User.Identity.IsAuthenticated) { this.AuthenticatedTableSas = photoContextAdmin.GetSas(this.User.Identity.Name, "admin"); this.PublicTableSas = photoContextAdmin.GetSas("Public", "admin"); this.QueueSas = this.StorageAccount.CreateCloudQueueClient().GetQueueReference("messagequeue").GetSharedAccessSignature( new SharedAccessQueuePolicy() { Permissions = SharedAccessQueuePermissions.Add | SharedAccessQueuePermissions.Read, SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15) }, null ); } else { this.PublicTableSas = photoContextAdmin.GetSas("Public", "edit"); this.AuthenticatedTableSas = null; this.QueueSas = null; } ... }
Note: If you have not already done so, replace http://127.0.0.1:10002/devstoreaccount1 with your storage account's table URI in order to work against Windows Azure.
-
Open the Global.asax.cs class and add the following using statement.
using Microsoft.WindowsAzure.Storage.Blob;
-
Scroll down to the Application_Start method and set the stored access policies for blob storage. To do so, add the following code at the end of the method.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-BlobStorageStoredAccessPolicy)
protected void Application_Start()
{
    ...

    CloudBlobContainer blob = storageAccount.CreateCloudBlobClient().GetContainerReference(CloudConfigurationManager.GetSetting("ContainerName"));

    BlobContainerPermissions bp = new BlobContainerPermissions();
    bp.SharedAccessPolicies.Add("read", new SharedAccessBlobPolicy { Permissions = SharedAccessBlobPermissions.Read, SharedAccessExpiryTime = System.DateTime.UtcNow.AddMinutes(60) });
    blob.SetPermissions(bp);
}
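Conversely, if you later need to extend (or shorten) the lifetime of every signature bound to the "read" policy, you can update the policy in place instead of reissuing SAS tokens; a rough sketch, reusing the blob container reference (blob) from the code above:
```
// Sketch: extend the expiry of the "read" stored access policy by another hour.
BlobContainerPermissions permissions = blob.GetPermissions();
SharedAccessBlobPolicy readPolicy;
if (permissions.SharedAccessPolicies.TryGetValue("read", out readPolicy))
{
    readPolicy.SharedAccessExpiryTime = System.DateTime.UtcNow.AddMinutes(60);
    blob.SetPermissions(permissions);
}
```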
Note: If you have not already done so, replace http://127.0.0.1:10002/devstoreaccount1 with your storage account's table URI in order to work against Windows Azure.
-
Open the PhotoDataServiceContext.cs class and replace the GetSasForBlob method with the following implementation.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-GetSasForBlobWithStoredAccessPolicy)
public string GetSaSForBlob(CloudBlockBlob blob, string policyId) { var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy(), policyId); return string.Format(CultureInfo.InvariantCulture, "{0}{1}", blob.Uri, sas); }
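Because GetSaSForBlob returns a full URL (the blob URI plus the SAS query string), anyone holding that URL can read the blob over plain HTTP while the "read" policy is in effect. A minimal usage sketch (variable names follow the Share action shown next; the local file name is hypothetical):
```
using System.Net;

// Sketch: the SAS URL can be consumed by any HTTP client, no storage credentials required.
string sasUrl = photoContext.GetSaSForBlob(blobBlockReference, "read");
using (var client = new WebClient())
{
    client.DownloadFile(sasUrl, "shared-photo.jpg");
}
```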
-
Open the HomeController.cs class and scroll down to the Share method. Replace the GetSasForBlob method call with the new implementation.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-ShareActionWithStoredAccessPolicy)
[HttpGet] public ActionResult Share(string partitionKey, string rowKey) { ... string sas = string.Empty; if (!string.IsNullOrEmpty(photo.BlobReference)) { CloudBlockBlob blobBlockReference = this.GetBlobContainer().GetBlockBlobReference(photo.BlobReference); sas = photoContext.GetSaSForBlob(blobBlockReference, "read"); } if (!string.IsNullOrEmpty(sas)) { return View("Share", null, sas); } return RedirectToAction("Index"); }
-
Update the HomeController's Create method to call the new GetSasForBlob implementation. You will also add some properties and metadata to the blob file in order to check them in the worker role later.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-CreateActionWithPropertiesAndMetadata)
[HttpPost] public ActionResult Create(PhotoViewModel photoViewModel, HttpPostedFileBase file, bool Public, FormCollection collection) { if (this.ModelState.IsValid) { photoViewModel.PartitionKey = Public ? "Public" : this.User.Identity.Name; var photo = this.FromViewModel(photoViewModel); if (file != null) { //Save file stream to Blob Storage var blob = this.GetBlobContainer().GetBlockBlobReference(file.FileName); blob.Properties.ContentType = file.ContentType; var image = new System.Drawing.Bitmap(file.InputStream); if (image != null) { blob.Metadata.Add("Width", image.Width.ToString()); blob.Metadata.Add("Height", image.Height.ToString()); } blob.UploadFromStream(file.InputStream); photo.BlobReference = file.FileName; } else { this.ModelState.AddModelError("File", new ArgumentNullException("file")); return this.View(photoViewModel); } //Save information to Table Storage var token = Public ? this.PublicTableSas : this.AuthenticatedTableSas; if (!this.User.Identity.IsAuthenticated) { token = this.PublicTableSas; photo.PartitionKey = "Public"; } CloudTableClient cloudTableClient = new CloudTableClient(this.UriTable, new StorageCredentials(token)); var photoContext = new PhotoDataServiceContext(cloudTableClient); photoContext.AddPhoto(photo); try { //Send create notification var msg = new CloudQueueMessage(string.Format("Photo Uploaded,{0}", photo.BlobReference)); this.GetCloudQueue().AddMessage(msg); } catch (Exception e) { System.Diagnostics.Trace.TraceInformation("Error", "Couldn't send notification"); } return this.RedirectToAction("Index"); } return this.View(); }
-
Open the Global.asax.cs file and add the following using statements.
using Microsoft.WindowsAzure.Storage.Queue; using Microsoft.WindowsAzure.Storage.Queue.Protocol;
-
Update the Application_Start method to set the stored access policies for queue storage. You will also add a new metadata entry named Resize and set it to true.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-QueueStorageWithStoredAccessPolicy)
protected void Application_Start() { ... CloudQueue queue = storageAccount.CreateCloudQueueClient().GetQueueReference("messagequeue"); queue.CreateIfNotExists(); QueuePermissions qp = new QueuePermissions(); qp.SharedAccessPolicies.Add("add", new SharedAccessQueuePolicy { Permissions = SharedAccessQueuePermissions.Add | SharedAccessQueuePermissions.Read, SharedAccessExpiryTime = System.DateTime.UtcNow.AddMinutes(15)}); qp.SharedAccessPolicies.Add("process", new SharedAccessQueuePolicy { Permissions = SharedAccessQueuePermissions.ProcessMessages | SharedAccessQueuePermissions.Read, SharedAccessExpiryTime = System.DateTime.UtcNow.AddMinutes(15) }); queue.SetPermissions(qp); queue.Metadata.Add("Resize", "true"); queue.SetMetadata(); }
Note: If you have not already done so, replace http://127.0.0.1:10002/devstoreaccount1 with your storage account's table URI in order to work against Windows Azure.
-
Open the BaseController.cs class and locate the OnActionExecuting method. Replace the GetSharedAccessSignature method for queues with the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-GetQueueSasWithStoredAccessPolicy)
protected override void OnActionExecuting(ActionExecutingContext filterContext) { ... this.QueueSas = this.StorageAccount.CreateCloudQueueClient().GetQueueReference("messagequeue").GetSharedAccessSignature( new SharedAccessQueuePolicy() { }, "add"); }
-
On the QueueProcessor_WorkerRole project, open the WorkerRole.cs class.
-
Add the following using statements.
using Microsoft.WindowsAzure.Storage.Blob; using Microsoft.WindowsAzure.Storage.Queue.Protocol;
-
Add the following member to the WorkerRole class to store the CloudBlobContainer reference.
private CloudBlobContainer container;
-
Create a new method called CreateCloudBlobClient that creates the blob client and sets the container variable. To do so, insert the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-CreateCloudBlobClientImplementation)
private void CreateCloudBlobClient() { var storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString")); CloudBlobClient blobStorage = storageAccount.CreateCloudBlobClient(); this.container = blobStorage.GetContainerReference(CloudConfigurationManager.GetSetting("ContainerName")); }
-
In the OnStart method, call the CreateCloudBlobClient method you just created.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-CreateCloudBlobClientCall)
public override bool OnStart() { ... this.CreateCloudBlobClient(); return base.OnStart(); }
-
Scroll down to the GetQueueSas method and replace the GetSharedAccessSignature call with the following code.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-QueueSharedAccessSignatureWithStoredAccessPolicyInWorkerRole)
private string GetQueueSas() { var storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString")); var client = storageAccount.CreateCloudQueueClient(); var queue = client.GetQueueReference("messagequeue"); queue.CreateIfNotExists(); QueuePermissions qp = new QueuePermissions(); qp.SharedAccessPolicies.Add("process", new SharedAccessQueuePolicy { Permissions = SharedAccessQueuePermissions.ProcessMessages | SharedAccessQueuePermissions.Read, SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15) }); queue.SetPermissions(qp); var token = queue.GetSharedAccessSignature( new SharedAccessQueuePolicy(), "process"); this.serviceQueueSasExpiryTime = DateTime.UtcNow.AddMinutes(15); return token; }
-
Add the following code to the Run method in the WorkerRole class in order to display the properties and metadata saved by the WebRole. Place it inside the if block, at the beginning.
(Code Snippet - GettingStartedWindowsAzureStorage - Ex5-RunMethodUpdate)
public override void Run() { ... while (true) { ... if (msg != null) { queue.FetchAttributes(); var messageParts = msg.AsString.Split(new char[] { ',' }); var message = messageParts[0]; var blobReference = messageParts[1]; if (queue.Metadata.ContainsKey("Resize") && string.Equals(message, "Photo Uploaded")) { var maxSize = queue.Metadata["Resize"]; Trace.TraceInformation("Resize is configured"); CloudBlockBlob outputBlob = this.container.GetBlockBlobReference(blobReference); outputBlob.FetchAttributes(); Trace.TraceInformation(string.Format("Image ContentType: {0}", outputBlob.Properties.ContentType)); Trace.TraceInformation(string.Format("Image width: {0}", outputBlob.Metadata["Width"])); Trace.TraceInformation(string.Format("Image height: {0}", outputBlob.Metadata["Height"])); } Trace.TraceInformation(string.Format("Message '{0}' processed.", message)); queue.DeleteMessage(msg); } } }
-
Go to the Cloud project, right-click the QueueProcessor_WorkerRole role located under the Roles folder, and select Properties.
WorkerRole Properties
-
Click the Settings tab and add a new setting named ContainerName of type String with the value gallery.
Settings tab
-
Press Ctrl + S to save the settings.
-
Press F5 to start debugging the solution.
Note: The Windows Azure Emulator should start.
-
Log in to the application with the user you created in Exercise 3.
-
Click the Share link on one of the private photos you uploaded earlier.
Sharing a photo with Stored Access Policy
Note: Notice that there is a new parameter in the query string named si with the value read, which is the signed identifier (the name of the stored access policy).
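For reference, a URL produced with a stored access policy looks roughly like the following; unlike the ad-hoc SAS from the previous exercise, the permissions and expiry are no longer carried in the query string, only the si identifier (all values here are illustrative):
```
https://<youraccount>.blob.core.windows.net/gallery/photo1.jpg?sv=2012-02-12&sr=b&si=read&sig=<signature>
```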
-
Go back to the Index view and click on Create.
-
Upload a new image of your choice.
-
Open the compute emulator and check how the Properties and Metadata are logged by the Worker Role.
Compute Emulator logs in worker role
## Summary ##
By completing this hands-on lab you have learned how to:
- Create a Storage Account.
- Enable Geo-Replication.
- Configure Monitoring metrics for your account.
- Configure Logging for each service.
- Consume Storage Services from a Web Application.