From A to Azure: Article 2 – Azure Storage

Azure Storage

In this article we will dive into the many storage options that Azure provides and how to utilize them. Please reference the previous From A to Azure article (1) for all the tools you will need for this lesson.

Github link to scripts used in this article

Types of Azure Storage

Azure Storage Account

An Azure Storage Account is the top-level storage resource in Azure, housing the various Azure storage options within.

These options are:

  • Blob Storage (Containers)
    • Stores files directly in a cloud-hosted file repository. Historically this storage was non-hierarchical and could not be mounted to an OS, but there are now options to allow hierarchical data and even direct mounting to the OS.
    • Most cost-effective data storage
    • <storageaccountname>.blob.core.windows.net
  • File Storage (File Shares)
    • Stores files directly in a cloud-hosted network drive, like a Network Attached Storage (NAS) device. By default it can be mounted directly to an OS and used like a network storage device.
    • <storageaccountname>.file.core.windows.net
  • Table Storage (Tables)
    • Azure Table storage is a service that stores non-relational structured data (also known as structured NoSQL data) in the cloud, providing a key/attribute store with a schemaless design.
    • You can use Table storage to store flexible datasets like user data for web applications, address books, device information, or other types of metadata your service requires. You can store any number of entities in a table, and a storage account may contain any number of tables, up to the capacity limit of the storage account.
    • <storageaccountname>.table.core.windows.net
  • Queue Storage (Queues)
    • Stores data in an ordered list to act as a messaging service between applications. Supports millions of messages, each up to 64 KB in size. Queues are commonly used to create a backlog of work to process asynchronously.
    • <storageaccountname>.queue.core.windows.net
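
You can confirm these endpoints for any existing account with a quick PowerShell check; 'RG_Name' and 'storageaccountname' below are placeholders for your own values:

## List the service endpoints of an existing Storage Account
$sa = Get-AzStorageAccount -ResourceGroupName 'RG_Name' -Name 'storageaccountname'
$sa.PrimaryEndpoints   # Shows the Blob, File, Table, and Queue URLs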

Creating a Public Access Blob Storage Account, Container, & Upload a file (Portal)

  • Navigate to the Resource Group you wish to create the Storage Account in.
    • Click the Create button on the top bar

    • Search the Marketplace for Storage Account and press Enter, then select Storage Account

    • Select Create

    • Select your currently existing Resource Group
    • Name your Storage Account
      • The field can contain only lowercase letters and numbers.
      • Name must be between 3 and 24 characters.
    • Select a region
    • Select a performance tier
      • Typically you will go with Standard unless you need high performance, in which case you would select Premium; do note that the cost is significantly higher.
    • Select a Redundancy; for this we will select Locally-redundant Storage (LRS) as it is the most cost-effective.

    • Select Next: Advanced

    • Leave Require secure transfer for REST API operations checked
      • This will require HTTPS connections, which is more secure.
    • Leave Enable infrastructure encryption unchecked; double encryption is not required unless you are complying with specific regulations or security standards.
    • Check Enable blob public access, as we are creating a public repository with this storage account.
    • Check Default to Azure Active Directory authorization in the Azure Portal to allow Azure Role-Based Access Control (RBAC).
    • Check Enable hierarchical namespace to enable blob storage containers to utilize folders.
    • Leave Enable network file system v3 unchecked; this would require users to connect via particular virtual networks, and since this account is meant to be publicly accessible, that would not work.
    • Leave Access Tier set to Hot; Cool is primarily reserved for infrequently accessed data such as backups.
    • Leave Enable large file shares unchecked, as this storage account should not be storing over 100 TB of data.
    • Select Next: Networking

    • Select Public endpoint (All Networks)
    • Select Next: Data Protection

    • Uncheck both Enable soft delete for containers & file shares
      • Typically we would want these checked for production environments as they would prevent data loss even in the instance of deleting the storage container or file share, retaining them in Azure for 7 days post deletion, allowing for restoration.
      • Since this is a testing/demo storage account, we want the ability to delete it immediately
    • Select Review + Create

    • Once Validation is complete, select Create
  • Navigate to your new Storage Account
    • Select Containers blade on the left side menu
    • Select + Container on the top menu
    • Name the container
      • This name may only contain lowercase letters, numbers, and hyphens, and must begin with a letter or a number. Each hyphen must be preceded and followed by a non-hyphen character. The name must also be between 3 and 63 characters long.
    • Set the Public Access Level to Container to allow the public full read access to the container and its blobs
    • Select Create

    • Select the newly created blob Container

    • You will notice an error stating that you do not have permission to list the data using your user account with Azure AD.
      • Select Access Control (IAM)

        • If you select View my access, you will notice your account is listed as an Owner; this is not enough for container data permissions.
        • Select + Add on the top bar
        • Select Add Role Assignment
        • Select Storage Blob Data Owner for Role
        • Repeat steps to also add Storage Blob Data Contributor
          • Data Read/Write access must be explicitly allowed for Data Containers when using Azure AD authentication.
        • Select your own user account
        • Select Save
        • Select the Overview blade on the left side, you should no longer see the authentication error
      • Select Add Directory
        • Name your directory
        • Select Save

        • Navigate to your new directory by clicking on it in the container Overview
        • Select Upload on the top menu bar
          • Select a sample file and click Upload

  • Select the newly uploaded file
  • You will see a URL as well as properties and the ability to download the file directly from the portal
  • Copy the URL and paste it into a web browser; the file will either display (if text/PDF) or download.
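
Because the container allows anonymous access, anyone can fetch the file without credentials. As a quick sanity check from any machine (the URL below is a placeholder for the one you copied):

## Download the public blob anonymously; no Azure login required
Invoke-WebRequest -Uri 'https://<storageaccountname>.blob.core.windows.net/<container>/<file>' -OutFile '.\downloaded-file'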

Creating a Public Access Blob Storage Account, Container, & Upload a file (PowerShell)

For the sake of simplicity, we will not be enabling the Azure AD authentication on the PowerShell method of creating a Storage Account as it requires looking up many different IDs. If you would like deeper documentation on creating a Storage Account with all options please refer to the Microsoft Documentation: https://docs.microsoft.com/en-us/powershell/module/az.storage/new-azstorageaccount
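
If you do want to wire up the Azure AD data-plane roles from PowerShell anyway, the portal steps above boil down to a role assignment. A minimal sketch, assuming you substitute your own user object ID and the storage account's resource ID:

## Grant yourself blob data access (placeholder IDs; look them up with Get-AzADUser / Get-AzStorageAccount)
New-AzRoleAssignment -ObjectId '<your-user-object-id>' `
    -RoleDefinitionName 'Storage Blob Data Owner' `
    -Scope '/subscriptions/<sub-id>/resourceGroups/RG_Name/providers/Microsoft.Storage/storageAccounts/storageaccountname'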

  • Create the Storage Account (Replace values with your information)

## Create a PowerShell object with all settings necessary for the Storage Account
$saParams = @{
    ResourceGroupName           = 'RG_Name'
    Name                        = 'storageaccountname'
    Location                    = 'location'
    SkuName                     = 'Standard_LRS'
    Kind                        = 'StorageV2'
    AccessTier                  = 'Hot'
    AllowBlobPublicAccess       = $true
    EnableHttpsTrafficOnly      = $true
    EnableHierarchicalNamespace = $true
    MinimumTlsVersion           = 'TLS1_2'
}

## Create the Storage Account
$newSA1 = New-AzStorageAccount @saParams

## Github link to Storage Account creation script
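
A quick verification that the account exists before moving on (same placeholder names as above):

## Confirm the new Storage Account was created
Get-AzStorageAccount -ResourceGroupName 'RG_Name' -Name 'storageaccountname'
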
  • Create the Blob Storage Container (Replace values with your information)
## Set parameters to get the existing Storage Account
$saParams = @{
    ResourceGroupName = 'RG_Name'
    Name              = 'storageaccountname'
}

## Get the context of the existing Storage Account
$StorageAccount = Get-AzStorageAccount @saParams
$StorageAccountContext = $StorageAccount.Context

## Set parameters for the new Storage Container
$saContainerParams = @{
    Name       = 'containername'
    Permission = 'Container'
    Context    = $StorageAccountContext
}

## Create a new storage container within the Storage Account context
New-AzStorageContainer @saContainerParams

## Github link to Blob Storage container creation script
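
To confirm the container exists, you can list all containers in the account; this assumes the $StorageAccountContext variable from the snippet above:

## List containers in the Storage Account
Get-AzStorageContainer -Context $StorageAccountContext
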
  • Upload a file to the container (Replace values with your information)
## Set parameters to get the existing Storage Account
$saParams = @{
    ResourceGroupName = 'RG_Name'
    Name              = 'storageaccountname'
}

## Get the context of the existing Storage Account
$StorageAccount = Get-AzStorageAccount @saParams
$StorageAccountContext = $StorageAccount.Context

## Set parameters for the upload
## Blob represents the path/name of the file once uploaded
$ContainerUpload = @{
    Container = 'riviapscontainer'
    File      = 'C:\Rivia\Sample Document.txt'
    Blob      = 'Documents/Sample.txt'
    Context   = $StorageAccountContext
}

## Upload the file to the container
Set-AzStorageBlobContent @ContainerUpload

## Github link to upload file script
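
To verify the upload, list the blobs in the container; this assumes the same context and the sample container name used above:

## List blobs in the container
Get-AzStorageBlob -Container 'riviapscontainer' -Context $StorageAccountContext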

Congratulations, you have just created a public repository that anonymous users can access/download files from!

Create a File Storage Container & Mount It (Portal)

  • Navigate to your existing Storage Account
  • Select File Shares blade on the left side menu
  • Select + File Share
  • Set a name
    • File share names can contain only lowercase letters, numbers, and hyphens, and must begin and end with a letter or a number. The name cannot contain two consecutive hyphens.
  • Set the tier to Hot for this test
  • Select Create

  • Select your new File Share
    • Select Connect on the top menu bar
    • Select the appropriate OS and Drive Letter
    • Leave the Authentication Method set to Storage Account Key
      • This will auto generate a key within the script below
    • Copy the PowerShell Script in the section below
    • Run the PowerShell Script on the machine you wish to mount this file share to
      • Note – You will need to have port 445 open on the machine for this to work; some ISPs may block this port.
      • For this example, I have mounted the File Share on an Azure VM
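
For reference, a simplified sketch of what the portal-generated script does; the names and '<storage-key>' below are placeholders, and the real portal script embeds your actual account key:

## Mount the file share as drive Z: (placeholder values)
$key  = ConvertTo-SecureString -String '<storage-key>' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential('AZURE\storageaccountname', $key)
New-PSDrive -Name Z -PSProvider FileSystem -Root '\\storageaccountname.file.core.windows.net\sharename' -Credential $cred -Persist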

Create a File Storage Container (PowerShell)

  • Create File Share (Replace values with your information)
## Set parameters to get the existing Storage Account
$saParams = @{
    ResourceGroupName = 'Rivia'
    Name              = 'sariviaps'
}

## Get the context of the existing Storage Account
$StorageAccount = Get-AzStorageAccount @saParams
$StorageAccountContext = $StorageAccount.Context

## Set parameters of the File Share
## Currently lacking some parameter options that were available in the previous AzureRM module, such as QuotaGiB and AccessTier
$FileShareParams = @{
    Name    = 'riviafsps'
    Context = $StorageAccountContext
}

## Create the File Share
New-AzStorageShare @FileShareParams

## Github link to File Share creation script
  • Utilize the portal as described above to generate the connection script.

Create a Storage Queue (Portal)

  • Navigate to your existing Storage Account
    • Select Queues blade from the left side menu
    • Select + Queue from the top menu bar
    • Name your Queue and select OK

To edit the queue we will need to do so programmatically or use the Azure Storage Explorer, which we will cover further down this article.

Add/View Storage Queue Message (Portal)

  • Navigate into your newly created Storage Queue
    • Select + Add Message on the top menu bar
    • Input message text and select OK

You can now see and access the message in the queue

Create a Storage Queue (PowerShell)

  • Create Storage Queue (Replace values with your information)
## Set parameters to get the existing Storage Account
$saParams = @{
    ResourceGroupName = 'Rivia'
    Name              = 'sariviaps'
}

## Get the context of the existing Storage Account
$StorageAccount = Get-AzStorageAccount @saParams
$StorageAccountContext = $StorageAccount.Context

## Set parameters of the Storage Queue
$QueueParams = @{
    Name    = 'riviaqueueps'
    Context = $StorageAccountContext
}

## Create the Storage Queue
New-AzStorageQueue @QueueParams

## Github link to Storage Queue creation script

Add Storage Queue Message (PowerShell)

  • Create new Storage Queue Message (Replace values with your information)
## Set parameters to get the existing Storage Account
$saParams = @{
    ResourceGroupName = 'Rivia'
    Name              = 'sariviaps'
}

## Get the context of the existing Storage Account
$StorageAccount = Get-AzStorageAccount @saParams
$StorageAccountContext = $StorageAccount.Context

## Set Queue parameters
$QueueParams = @{
    Name    = 'riviaqueueps'
    Context = $StorageAccountContext
}

## Load the Queue into context
$queue = Get-AzStorageQueue @QueueParams

## Create a new message object
$queueMessage = [Microsoft.Azure.Storage.Queue.CloudQueueMessage]::new("Hello world! - PowerShell")

## Submit the message object to the queue
$queue.CloudQueue.AddMessageAsync($queueMessage)

## Github link to add Storage Queue message script
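
To sanity-check that the message landed, you can read the queue's approximate message count; this assumes the $queue object from the snippet above:

## Refresh queue metadata and show the rough message count
$queue.CloudQueue.FetchAttributesAsync().Wait()
$queue.CloudQueue.ApproximateMessageCount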

View/Delete Storage Queue Message (PowerShell)

Messages are read in best-effort first-in-first-out order, but this is not guaranteed. When you read a message from the queue, it becomes invisible to all other processes looking at the queue. This ensures that if your code fails to process the message due to hardware or software failure, another instance of your code can get the same message and try again.

This invisibility timeout defines how long the message remains invisible before it is available again for processing. The default is 30 seconds.

Your code reads a message from the queue in two steps. When you call the Microsoft.Azure.Storage.Queue.CloudQueue.GetMessage method, you get the next message in the queue. A message returned from GetMessage becomes invisible to any other code reading messages from this queue. To finish removing the message from the queue, you call the Microsoft.Azure.Storage.Queue.CloudQueue.DeleteMessage method.

In the following example, you read the two queue messages, then wait 10 seconds (the invisibility timeout). Then you read the two messages again, deleting each one after reading it by calling DeleteMessage. If you try to read the queue after the messages are deleted, $queueMessage.Result will be returned as $null.

  • Read the first two messages with a timeout value of 10 seconds
    • The messages will become unavailable for processing for 10 seconds before becoming available again; this prevents applications from overwriting or processing the same message twice.

## Set parameters to get the existing Storage Account
$saParams = @{
    ResourceGroupName = 'Rivia'
    Name              = 'sariviaps'
}

## Get the context of the existing Storage Account
$StorageAccount = Get-AzStorageAccount @saParams
$StorageAccountContext = $StorageAccount.Context

## Set Queue parameters
$QueueParams = @{
    Name    = 'riviaqueueps'
    Context = $StorageAccountContext
}

## Load the Queue into context
$queue = Get-AzStorageQueue @QueueParams

# Set the amount of time you want the entry to be invisible after it is read from the queue
# If it is not deleted by the end of this time, it will show up in the queue again
$invisibleTimeout = [System.TimeSpan]::FromSeconds(10)

# Read a message from the queue, then show its contents. Repeat for the second message.
$queueMessage = $queue.CloudQueue.GetMessageAsync($invisibleTimeout,$null,$null)
$queueMessage.Result
$queueMessage = $queue.CloudQueue.GetMessageAsync($invisibleTimeout,$null,$null)
$queueMessage.Result

# After 10 seconds, these messages reappear on the queue.

# After 10 seconds, these messages reappear on the queue.

## Github link to read the first two Storage Queue messages script

  • Read the first two messages with a timeout value of 10 seconds then delete them

## Set parameters to get the existing Storage Account
$saParams = @{
    ResourceGroupName = 'Rivia'
    Name              = 'sariviaps'
}

## Get the context of the existing Storage Account
$StorageAccount = Get-AzStorageAccount @saParams
$StorageAccountContext = $StorageAccount.Context

## Set Queue parameters
$QueueParams = @{
    Name    = 'riviaqueueps'
    Context = $StorageAccountContext
}

## Load the Queue into context
$queue = Get-AzStorageQueue @QueueParams

# Set the invisibility timeout (defined in the previous snippet; repeated here so this block stands alone)
$invisibleTimeout = [System.TimeSpan]::FromSeconds(10)

# Read each message, then delete it using its ID and pop receipt.
$queueMessage = $queue.CloudQueue.GetMessageAsync($invisibleTimeout,$null,$null)
$queueMessage.Result
$queue.CloudQueue.DeleteMessageAsync($queueMessage.Result.Id,$queueMessage.Result.PopReceipt)

$queueMessage = $queue.CloudQueue.GetMessageAsync($invisibleTimeout,$null,$null)
$queueMessage.Result
$queue.CloudQueue.DeleteMessageAsync($queueMessage.Result.Id,$queueMessage.Result.PopReceipt)

## Github link to read and delete first two Storage Queue messages

Ideal Usage of Queues

  • Functions that check queues for new messages, process a command based on the message contents, then delete the queue message (a minimal sketch of this pattern follows below)
    • Names, Dates, IDs, etc…
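
A minimal polling sketch of that pattern, assuming the $queue object loaded earlier in this article:

## Process one message if present, then delete it
$timeout = [System.TimeSpan]::FromSeconds(30)
$msg = $queue.CloudQueue.GetMessageAsync($timeout, $null, $null).Result
if ($null -ne $msg) {
    Write-Output "Processing: $($msg.AsString)"   # AsString holds the message text
    $queue.CloudQueue.DeleteMessageAsync($msg.Id, $msg.PopReceipt) | Out-Null
}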

Create a Storage Table (Portal)

  • Navigate to your existing Storage Account
    • Select Tables blade from the left side menu
    • Select + Table from the top menu bar
    • Name your Table and select OK

To edit the table we will need to do so programmatically or use the Azure Storage Explorer, which we will cover further down this article.

Create a Storage Table (PowerShell)

  • Create Storage Table (Replace values with your information)

## Set parameters to get the existing Storage Account
$saParams = @{
    ResourceGroupName = 'Rivia'
    Name              = 'sariviaps'
}

## Get the context of the existing Storage Account
$StorageAccount = Get-AzStorageAccount @saParams
$StorageAccountContext = $StorageAccount.Context

## Set parameters of the Storage Table
$TableParams = @{
    Name    = 'riviatableps'
    Context = $StorageAccountContext
}

## Create the Storage Table
New-AzStorageTable @TableParams

## Github link to create Storage Table script

Edit a Storage Table (PowerShell)

Install the AzTable Module

We will need an additional PowerShell module to work with Tables

Install AzTable

Install-Module AzTable

## Github link to install AzTable script

To perform operations on a table using AzTable, you need a reference to the CloudTable property of a specific table.

  • Load Table Context

$saParams = @{
    ResourceGroupName = 'Rivia'
    Name              = 'sariviaps'
}

## Get the context of the existing Storage Account
$StorageAccount = Get-AzStorageAccount @saParams
$StorageAccountContext = $StorageAccount.Context

$TableParams = @{
    Name    = 'riviatableps'
    Context = $StorageAccountContext
}

$cloudTable = (Get-AzStorageTable @TableParams).CloudTable

## Github link to load Table context script

  • Add Entities to the Table
    • The combination of BOTH PartitionKey and RowKey must be unique
      • You may repeat a PartitionKey or a RowKey across entities, but not the same combination of both
$partitionKey1 = "partition1"
$partitionKey2 = "partition2"

# Add four rows
Add-AzTableRow `
    -table $cloudTable `
    -partitionKey $partitionKey1 `
    -rowKey ("CA") -property @{"username"="Chris";"userid"=1}

Add-AzTableRow `
    -table $cloudTable `
    -partitionKey $partitionKey2 `
    -rowKey ("NM") -property @{"username"="Jessie";"userid"=2}

Add-AzTableRow `
    -table $cloudTable `
    -partitionKey $partitionKey1 `
    -rowKey ("WA") -property @{"username"="Christine";"userid"=3}

Add-AzTableRow `
    -table $cloudTable `
    -partitionKey $partitionKey2 `
    -rowKey ("TX") -property @{"username"="Steven";"userid"=4}

## Github link to create Table entities script
  • Query all the Table’s values
Get-AzTableRow -table $cloudTable | ft

userid username  PartitionKey RowKey TableTimestamp              Etag
------ --------  ------------ ------ --------------              ----
     1 Chris     partition1   CA     8/22/2021 5:01:33 PM -04:00 W/"datetime'2021-08-22T21%3A01%3A33.4025412Z'"
     3 Christine partition1   WA     8/22/2021 5:01:33 PM -04:00 W/"datetime'2021-08-22T21%3A01%3A33.7968228Z'"
     2 Jessie    partition2   NM     8/22/2021 5:01:33 PM -04:00 W/"datetime'2021-08-22T21%3A01%3A33.6066868Z'"
     4 Steven    partition2   TX     8/22/2021 5:01:34 PM -04:00 W/"datetime'2021-08-22T21%3A01%3A34.0229836Z'"

## Github link to query Table script
  • Query a Table’s specific partition values
Get-AzTableRow -table $cloudTable -partitionKey $partitionKey1 | ft

userid username  PartitionKey RowKey TableTimestamp              Etag
------ --------  ------------ ------ --------------              ----
     1 Chris     partition1   CA     8/22/2021 5:01:33 PM -04:00 W/"datetime'2021-08-22T21%3A01%3A33.4025412Z'"
     3 Christine partition1   WA     8/22/2021 5:01:33 PM -04:00 W/"datetime'2021-08-22T21%3A01%3A33.7968228Z'"

## Github link to query Table script
  • Query a Table’s specific value within a specific column
Get-AzTableRow -table $cloudTable `
    -columnName "username" `
    -value "Chris" `
    -operator Equal


userid : 1
username : Chris
PartitionKey : partition1
RowKey : CA
TableTimestamp : 8/22/2021 5:01:33 PM -04:00
Etag : W/"datetime'2021-08-22T21%3A01%3A33.4025412Z'"

## Github link to query Table script
  • Query a Table’s entities with a custom filter
Get-AzTableRow `
    -table $cloudTable `
    -customFilter "(userid eq 1)"

userid : 1
username : Chris
PartitionKey : partition1
RowKey : CA
TableTimestamp : 8/22/2021 5:01:33 PM -04:00
Etag : W/"datetime'2021-08-22T21%3A01%3A33.4025412Z'"

## Github link to query Table script
  • Updating a Table’s specific entity
    • There are three steps for updating entities. First, retrieve the entity to change. Second, make the change. Third, commit the change using Update-AzTableRow.

Update the entity with username = ‘Jessie’ to have username = ‘Jessie2’. This example also shows another way to create a custom filter using .NET types.

# Create a filter and get the entity to be updated.
[string]$filter = `
    [Microsoft.Azure.Cosmos.Table.TableQuery]::GenerateFilterCondition("username", `
    [Microsoft.Azure.Cosmos.Table.QueryComparisons]::Equal, "Jessie")
$user = Get-AzTableRow `
    -table $cloudTable `
    -customFilter $filter

# Change the entity.
$user.username = "Jessie2"

# To commit the change, pipe the updated record into the update cmdlet.
$user | Update-AzTableRow -table $cloudTable


Result : Microsoft.Azure.Cosmos.Table.DynamicTableEntity
HttpStatusCode : 204
Etag : W/"datetime'2021-08-22T21%3A09%3A52.6255931Z'"
SessionToken :
RequestCharge :
ActivityId :


# To see the new record, query the table.
Get-AzTableRow -table $cloudTable `
    -customFilter "(username eq 'Jessie2')"


userid : 2
username : Jessie2
PartitionKey : partition2
RowKey : NM
TableTimestamp : 8/22/2021 5:09:52 PM -04:00
Etag : W/"datetime'2021-08-22T21%3A09%3A52.6255931Z'"

## Github link to update Table entity
  • Deleting a Table’s specific entity
# Set filter.
[string]$filter = `
    [Microsoft.Azure.Cosmos.Table.TableQuery]::GenerateFilterCondition("username", `
    [Microsoft.Azure.Cosmos.Table.QueryComparisons]::Equal, "Jessie2")

# Retrieve the entity to be deleted, then pipe it into the remove cmdlet.
$userToDelete = Get-AzTableRow `
    -table $cloudTable `
    -customFilter $filter
$userToDelete | Remove-AzTableRow -table $cloudTable


Result : Microsoft.Azure.Cosmos.Table.TableEntity
HttpStatusCode : 204
Etag :
SessionToken :
RequestCharge :
ActivityId :


# Retrieve entities from table and see that Jessie2 has been deleted.
Get-AzTableRow -table $cloudTable | ft

userid username  PartitionKey RowKey TableTimestamp              Etag
------ --------  ------------ ------ --------------              ----
     1 Chris     partition1   CA     8/22/2021 5:01:33 PM -04:00 W/"datetime'2021-08-22T21%3A01%3A33.4025412Z'"
     3 Christine partition1   WA     8/22/2021 5:01:33 PM -04:00 W/"datetime'2021-08-22T21%3A01%3A33.7968228Z'"
     4 Steven    partition2   TX     8/22/2021 5:01:34 PM -04:00 W/"datetime'2021-08-22T21%3A01%3A34.0229836Z'"

## Github link to delete Table entity script
  • Delete all Entities in a Table
# Get all rows and pipe the result into the remove cmdlet.

Get-AzTableRow -table $cloudTable | Remove-AzTableRow -table $cloudTable

# One result block is returned per deleted entity (three in total):
Result         : Microsoft.Azure.Cosmos.Table.TableEntity
HttpStatusCode : 204
Etag           :
SessionToken   :
RequestCharge  :
ActivityId     :

# List entities in the table (there won't be any).

Get-AzTableRow -table $cloudTable | ft

## Github link to delete Table entity script

Delete a Table

  • Delete a Table (Replace values with your information)

Remove-AzStorageTable @TableParams

# Retrieve the list of tables to verify the table has been removed.
Get-AzStorageTable -Context $StorageAccountContext | select Name

## Github link to delete Table script
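
Once you are done experimenting, you can remove the entire demo Storage Account and everything in it; this assumes the sample names used throughout this article:

## Delete the demo Storage Account (irreversible; removes all contained data)
Remove-AzStorageAccount -ResourceGroupName 'Rivia' -Name 'sariviaps'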

Azure Storage Explorer

The Azure Storage Explorer is Azure’s all-in-one solution for managing storage accounts and the associated resources within. You may directly edit blobs, files, tables, and queues.

  • Open the Azure Storage Explorer
  • Select the User icon on the left side menu
    (You may already have your accounts showing here, we will assume they are not and add them)

    • Select Add an Account
    • Select Subscription to make things simple and connect to all resources

    • Select the proper Azure environment to connect to, in this instance standard Azure

    • Your browser will open and populate a login page, log in with your user account with access to the correct subscription

    • You will now see a connected account

Select the Resources icon on the top left of the menu bar

  • Drop down the Subscription, Storage Accounts, then the newly created Storage Account
  • Blob Containers and File Shares allow you to create and edit folders and files

  • Queues allow you to queue and dequeue messages

  • Tables allow you to add/edit/delete Columns (Properties) and entries as well as execute queries
    • The combination of BOTH PartitionKey and RowKey must be unique
      • You may repeat a PartitionKey or a RowKey across entities, but not the same combination of both
    • You may import from a CSV or export your table to CSV

Summary

We reviewed the many different options that Azure has for storage, such as Blob storage, File Shares, Queues, and Tables. With this knowledge you should be able to spin up a storage solution for everything from anonymous web download repositories, to OS-mounted file shares, and even cloud-native programmatic storage with Queues and Tables.
