Say goodbye to key management – manage access to Azure Storage data using Azure AD

In the past few months I’ve worked with many enterprise customers of Microsoft Azure to design their access control model for IaaS/PaaS management. One question that often comes up is – hey Dushyant, it’s great that we can now manage access to our cloud infrastructure using Active Directory and RBAC, but there are still these different storage keys that I need to manage separately. I’m worried that they will end up in an Excel sheet and fall into the wrong hands. Also, our key roll-over process is ad hoc and ex-employees could still have access to our production data. Can I manage access to Azure Storage data using AD too?

Every Azure subscription is associated with an Azure Active Directory. Only users and services from this directory can manage the subscription using the Azure Management Portal and command-line management tools. One of the advantages of central access management is quick access revocation – a user’s or service’s access can be revoked by simply disabling the account in the directory. However, the “data plane” of some Azure resource providers is not secured using Azure AD; instead, they use shared keys (long passwords) to authorize data-level access.
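To see why a shared key is so sensitive: Azure Storage’s Shared Key scheme authorizes each request by signing a canonicalized request string with an HMAC-SHA256 keyed with the account key – anyone who holds the key can mint valid signatures for any operation. Here’s a minimal sketch (the account name, key, and string-to-sign are fabricated and simplified for illustration; the real canonicalized string has more fields):

```powershell
# Sketch only: how a Shared Key Authorization header is derived.
# The account name and key below are fabricated for illustration.
$accountName = "myaccount"
$accountKey  = [Convert]::ToBase64String([byte[]](1..32))

# A (simplified) canonicalized string-to-sign for a GET request.
$stringToSign = "GET`n`n`nx-ms-date:Fri, 01 May 2015 00:00:00 GMT`n/$accountName/mycontainer"

# Anyone holding the key can compute a valid signature - the key is the whole secret.
$hmac = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key = [Convert]::FromBase64String($accountKey)
$signature = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign)))
$authHeader = "SharedKey ${accountName}:$signature"
$authHeader
```

This is exactly why revocation is hard: the signature proves possession of the key, not the identity of the caller.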

Use of API keys/passwords makes central access revocation difficult. Let’s look at how you can instead manage access to Azure Storage data using Azure AD: we will employ a new Azure service called Key Vault.

Azure Key Vault is a secure store for your cryptographic private keys, and for secrets like shared keys and passwords. You can grant Azure Active Directory users and services access to the Key Vault to perform encrypt/decrypt operations with private keys or to read/write secrets. Here’s how we’ll use a Key Vault to secure Azure Storage keys:

Secure Azure Storage Using Azure Key Vault and Azure AD

We will keep Azure Storage keys in a Key Vault. Every hour, an Azure Automation job will regenerate the keys and write them back to the Key Vault. We will then grant Azure AD users and services read access to the Key Vault secret. When a user/service needs to access a Storage Account, it will retrieve the account’s key from the Key Vault, cache it locally, and access the Storage Account. Within an hour that key will be regenerated and the cached copy will stop working, and the user/service will read the new key from the Key Vault. When the user/service loses access to the Key Vault secret (because their account is disabled or their access is revoked), they will lose access to the Storage Account data within an hour too. Voila!

Ok, let’s set this up

1. Setup Azure Key Vault

Refer to the Key Vault getting started topic to create a new Key Vault in the Azure subscription.

Note that access (to users and services) can only be granted on the entire Key Vault, not on individual secrets: if you store keys of two different Storage Accounts in the same Key Vault, and grant a user/service access to read the Key Vault’s secrets, they will be able to access both Storage Accounts. So, if you wish to manage access to some Storage Accounts separately from others, store their keys in different Key Vaults.

Create Key Vault

I created two key vaults: Infra-StorageKeys to hold keys of my infrastructure-related Storage Accounts, and PROD-Service-StorageKeys to hold keys of my production-service-related Storage Accounts.

2. Azure Automation Scheduled Job to Roll Keys

I used Azure Automation service to host my scheduled job: a PowerShell workflow that regenerates keys of the specified Storage Accounts in the subscription hourly, and writes them to the specified Key Vaults.

2.1 Create an identity for the Scheduled Job in Azure Active Directory

For the scheduled job, we must first create an application identity in the Azure Active Directory to which the Azure Subscription is associated. I registered a new application using the Azure Management portal.

1) Navigate to the Azure AD node on the left. 2) Select the directory to which the subscription is associated. 3) Select the Applications tab. 4) Click the Add action in the bottom bar. 5) Specify the name of the application (I used ‘AzureKeysRollerDaemon’) and go with the type that’s selected by default (Web App/Web API). 6) On the next screen specify the Sign-On URL and App ID URI: as our scheduled job isn’t a Web App, these don’t matter (I used http://AzureKeysRollerDaemon for both).

create new Azure AD application for daemon service 1 create new Azure AD application for daemon service 2

Once the application is created, navigate to its Configure tab and create a key credential (I picked a 2-year validity). Note the Client ID and the key credential – these will be used to configure the Azure Automation scheduled job.

create new Azure AD application for daemon service 3

2.2 Grant access to the identity of the Scheduled Job on Storage Accounts and Key Vaults

The scheduled job needs permissions to manage the keys of all Storage Accounts in the subscription and to write secrets to the Key Vaults.

Use Azure Resource Manager RBAC PowerShell to assign the Contributor role on the subscription to the identity of the scheduled job. This will allow it to regenerate keys of all the Storage Accounts in the subscription.

grant access to Scheduled Job identity on Azure Subscription

Then, grant secret management rights on the Key Vaults to the identity of the scheduled job. This will allow it to create and update secrets in the Key Vaults.

grant access to Scheduled Job identity on Key Vault

Ok, now the identity of the Scheduled Job has the permission to regenerate keys of all Storage Accounts in the subscription, and the permission to create and update secrets in the Key Vaults.

2.3 Setup Scheduled Job in Azure Automation

We now have everything we need to create a scheduled job that will regenerate the keys of the Storage Accounts in the subscription and write the new key values to the Key Vaults.

If you do not already have an Azure Automation account, create one. I chose the name PROD-Automation.

Setup Automation Account 1

My Azure Automation scheduled job is made of 3 components: 1) the runbook, which holds the PowerShell workflow code of the scheduled job, 2) assets of type variable, which hold the parameters the runbook needs to execute, and 3) a schedule, also an asset, configured to execute the runbook every hour.

We shall start by creating the variable assets – I used these to externalize my deployment specific settings from the runbook. Navigate to the Assets tab

Automation Variables 1

and create the following variables – make sure your variable names exactly match these:

DirectoryDomainName (string, not encrypted): The domain name of your directory, e.g. aaddemo.com

SubscriptionId (string, not encrypted): The id of your subscription, e.g. c283fc76-9cd4-44c9-99a7-4ed71546436e

AzureKeysRollerDaemon-ClientId (string, not encrypted): The client id of the Azure AD application that you created for the scheduled job earlier, e.g. 5d82388c-3dfa-450b-91c5-20cb50f3e4bd

AzureKeysRollerDaemon-ClientSecret (string, encrypted): The credential of the Azure AD application that you created for the scheduled job earlier, e.g. wdrQ8r6trbHREr7+7N8eGpptwHhtsY90ml8Jqx8HJJc=

AzureKeysRollerDaemon-StorageAccountKeyVaultMapping (string, not encrypted): Holds the information about which Storage Accounts’ keys should be regenerated and which Key Vaults they should be stored in. The format of the string is:
<storage_account_name>,<storage_account_name>:<key_vault_name>;<storage_account_name>,<storage_account_name>:<key_vault_name>
e.g. prodservice,prodservicefiles,prodservicelogs:PROD-Service-StorageKeys;prodinfrastorage:Infra-StorageKeys
This tells the job to regenerate keys for the prodservice, prodservicefiles, and prodservicelogs Storage Accounts and store them in the PROD-Service-StorageKeys Key Vault, and to regenerate keys for the prodinfrastorage Storage Account and store them in the Infra-StorageKeys Key Vault.
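To make the mapping format concrete, here is a sketch of how such a string can be parsed into the storage-account-to-Key-Vault lookup that the runbook uses ($storageAccountKeyVaultAssoc appears in the runbook code later); the parsing code below is illustrative and may differ from the runbook’s actual implementation:

```powershell
# Parse "<sa>,<sa>:<vault>;<sa>:<vault>" into a hashtable of
# storage account name -> key vault name.
$mapping = "prodservice,prodservicefiles,prodservicelogs:PROD-Service-StorageKeys;prodinfrastorage:Infra-StorageKeys"

$storageAccountKeyVaultAssoc = @{}
$mapping.Split(";") | ForEach-Object {
    # Each group is "<account>[,<account>...]:<vault>"
    $accounts, $vault = $_.Split(":")
    $accounts.Split(",") | ForEach-Object { $storageAccountKeyVaultAssoc[$_] = $vault }
}

$storageAccountKeyVaultAssoc["prodservicelogs"]   # PROD-Service-StorageKeys
$storageAccountKeyVaultAssoc["prodinfrastorage"]  # Infra-StorageKeys
```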

Automation Variables 2 Automation Variables 3

Automation Variables 4 Automation Variables 5 Automation Variables 6

Now, let’s create the runbook. My runbook should work for you out of the box – download it from here: https://raw.githubusercontent.com/dushyantgill/AzureResourceManagerPowerShell/master/Workflows/AzureKeysRollerDaemon.ps1. Navigate to the runbooks tab and import the downloaded PS1 file.

Setup Automation Runbook 1Setup Automation Runbook 2

Select edit runbook action in the success notification. Review the code and make sure I haven’t done anything stupid.

Setup Automation Runbook 3

Before you test the runbook, know that this will regenerate the keys of the Storage Accounts that you’ve specified in the mapping. Make sure you don’t have a business-critical application using any of these Storage Accounts. Hit Test once you’ve confirmed this and are good to go.

Setup Automation Runbook 4

If you’ve setup everything right, the test run should succeed. Now let’s publish the runbook and create a schedule for it.

Setup Automation Schedule 1

I am running the job on an hourly basis. This means that once I revoke a user’s access to the Key Vault, the user will lose access to the corresponding Storage Accounts within an hour. And when I disable the user’s account in the directory, the user will lose access to these Storage Accounts within a couple of hours (Azure AD access tokens are valid for an hour and the key regeneration job runs every hour).

Setup Automation Schedule 2

All set! Your runbook will now execute hourly and regenerate keys of the Storage Accounts in the subscription, and place the regenerated keys in the Key Vaults.

Setup Automation Done 1 Setup Automation Done 2

2.4 Azure Keys Roller Daemon Code

You must be curious about what the scheduled job code does. At a high level, it fetches the specified Storage Accounts using the Azure Resource Manager (ARM) resources API. Then, using another ARM API, it regenerates the primary and secondary keys (or key1 and key2 in the case of v2 Storage Accounts). It then writes the regenerated primary key (or key1) to the Key Vault as a secret, using a Key Vault API.

Disclaimer: this was my first attempt at writing a PowerShell workflow, and I chose the easiest (read inefficient) way to create one – dumped my entire PowerShell script in an InlineScript activity. The PowerShell (workflow) experts amongst you will want to modify the workflow to add more parallelization.

2.4.1 Acquire Tokens for Azure Resource Manager and Azure Key Vault

The workflow needs to call Azure Resource Manager and Azure Key Vault APIs. So it starts by getting access tokens for those resources. It uses the Azure AD OAuth client credential grant flow to acquire tokens in the context of the application identity.

[code lang="powershell"]
$armAccessToken = $null
$uri = "https://login.windows.net/" + $using:directory + "/oauth2/token"
$body = "grant_type=client_credentials"
$body += "&client_id=" + $using:clientId
$body += "&client_secret=" + [Uri]::EscapeDataString($using:clientSecret)
$body += "&resource=" + [Uri]::EscapeDataString("https://management.core.windows.net/")
$headers = @{"Accept"="application/json"}
$enc = New-Object "System.Text.ASCIIEncoding"
$byteArray = $enc.GetBytes($body)
$contentLength = $byteArray.Length
$headers.Add("Content-Type","application/x-www-form-urlencoded")
$headers.Add("Content-Length",$contentLength)
$result = try { Invoke-RestMethod -Method POST -Uri $uri -Headers $headers -Body $body } catch { $_.Exception.Response }
$armAccessToken = $result.access_token
[/code]

2.4.2 Fetch Storage Accounts

The workflow then parses the AzureKeysRollerDaemon-StorageAccountKeyVaultMapping variable and fetches the specified storage accounts using the GET /Subscriptions/{sub_id}/resources API of Azure Resource Manager. It employs a filter to fetch both v1 and v2 Storage Accounts.

[code lang="powershell"]
$storageAccountObjects = $null
$storageAccountsSearchSubFilter = ""
$count = 0
$storageAccounts | % {
$storageAccountsSearchSubFilter += [string]::Format("substringof('{0}', name)", $_)
if($count -lt $storageAccounts.Length - 1) {$storageAccountsSearchSubFilter += " or "}
$count++
}
$uri = [string]::Format("https://management.azure.com/subscriptions/{0}/resources?api-version=2015-01-01&`$filter=(resourceType eq 'Microsoft.ClassicStorage/storageAccounts' or resourceType eq 'Microsoft.Storage/storageAccounts') and ({1})", $using:subscriptionId, $storageAccountsSearchSubFilter)
$header = "Bearer " + $armAccessToken
$headers = @{"Authorization"=$header;"Content-Type"="application/json"}
$result = try { Invoke-RestMethod -Method GET -Uri $uri -Headers $headers } catch { $_.Exception.Response }
$storageAccountObjects = $result.value
[/code]

2.4.3 Regenerate Storage Account Keys

Next, for each Storage Account, the workflow regenerates the primary key and then the secondary key. The ARM API used to regenerate keys is POST /Subscriptions/{sub_id}/resourceGroups/{rg_id}/Providers/Microsoft.ClassicStorage/storageAccounts/{sa_id}/regenerateKey or POST /Subscriptions/{sub_id}/resourceGroups/{rg_id}/Providers/Microsoft.Storage/storageAccounts/{sa_id}/regenerateKey, depending on whether the Storage Account is of version 1 or version 2. The POST data is also slightly different depending on the version: for v1 it is {"keyType": "Primary|Secondary"}, and for v2 it is {"keyName": "key1|key2"}.

[code lang="powershell"]
if($_.type.Equals("microsoft.storage/storageAccounts",[StringComparison]::InvariantCultureIgnoreCase)){
$apiVer = "2015-05-01-preview"
$postData1 = "" | select KeyName
$postData1.KeyName = "key1"
$postData2 = "" | select KeyName
$postData2.KeyName = "key2"
}
else{
$apiVer = "2014-06-01"
$postData1 = "" | select KeyType
$postData1.KeyType = "Primary"
$postData2 = "" | select KeyType
$postData2.KeyType = "Secondary"
}

$uri = [string]::Format("https://management.azure.com{0}/regenerateKey?api-version={1}", $_.id, $apiVer)
$header = "Bearer " + $armAccessToken
$headers = @{"Authorization"=$header;"Content-Type"="application/json"}
$enc = New-Object "System.Text.ASCIIEncoding"
$body = ConvertTo-Json $postData1
$byteArray = $enc.GetBytes($body)
$contentLength = $byteArray.Length
$headers.Add("Content-Length",$contentLength)
$result = try { Invoke-RestMethod -Method POST -Uri $uri -Headers $headers -Body $body } catch { $_.Exception.Response }
[/code]

2.4.4 Write Storage Account Key to Key Vault

Finally, it writes the regenerated primary key (or key1) of the Storage Account to the Key Vault as a secret. It uses the Storage Account name as the Key Vault secret name. The Key Vault API is PUT https://{key_vault_name}.vault.azure.net/secrets/{secret_name}. The PUT either creates the secret or updates it.

[code lang="powershell"]
$uri = [string]::Format("https://{0}.vault.azure.net/secrets/{1}?api-version=2014-12-08-preview", $storageAccountKeyVaultAssoc[$_.name], $_.name)
$postData = "" | select value
$postData.value = $newKey
$header = "Bearer " + $kvAccessToken
$headers = @{"Authorization"=$header;"Content-Type"="application/json"}
$enc = New-Object "System.Text.ASCIIEncoding"
$body = ConvertTo-Json $postData
$byteArray = $enc.GetBytes($body)
$contentLength = $byteArray.Length
$headers.Add("Content-Length",$contentLength)
$result = try { Invoke-RestMethod -Method PUT -Uri $uri -Headers $headers -Body $body } catch { $_.Exception.Response }
[/code]

3. Accessing these “auto key rolled” Storage Accounts

Now that the keys of these Storage Accounts have been made ephemeral, some of the methods of accessing the data in these Storage Accounts will no longer work. Let’s examine which methods will keep working and which need to be tweaked.

3.1 Manage these Storage Accounts using Azure PowerShell as Admin User

Azure PowerShell cmdlets that manage Storage Account data require the AZURE_STORAGE_CONNECTION_STRING environment variable to be set as: “DefaultEndpointsProtocol=https;AccountName=<MyAccountName>;AccountKey=<MyAccountKey>”. The environment variable will need to be reset every time the Storage Account key gets regenerated.

Classic Service Admins and Co-Admins can use the Get-AzureStorageKey cmdlet to retrieve the Storage Account key and set the environment variable.

[code lang="powershell"]
$env:AZURE_STORAGE_CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=prodservice;AccountKey=" + (Get-AzureStorageKey -StorageAccountName prodservice).Primary
[/code]

Access Storage Account Data as Admin using PowerShell 1

When the key gets regenerated, they will receive a 403 Forbidden, and they can reset the environment variable using the same command.

Access Storage Account Data as Admin using PowerShell 2

3.2 Manage these Storage Accounts using Azure PowerShell as Non-Admin User

Users that aren’t Classic Service Admins or Co-Admins can also manage Storage Account data using Azure PowerShell as long as they know the Storage Account key. But now that the keys keep regenerating, non-admin users will need to be granted access to the Key Vault secret that holds the Storage Account key. So, they will use the Get-AzureKeyVaultSecret cmdlet to retrieve the Storage Account key (instead of Get-AzureStorageKey).

3.2.1 Grant access to user to read the Storage Account key from the Key Vault

First, the user must be granted access to read secrets from the Key Vault that holds the Storage Account key, using the Set-AzureKeyVaultAccessPolicy cmdlet. Here I’m allowing John Yokum to read secrets from the PROD-Service-StorageKeys Key Vault.

[code lang="powershell"]
Switch-AzureMode AzureResourceManager
Import-Module Z:\Scratch\KeyVaultManager
Set-AzureKeyVaultAccessPolicy -VaultName PROD-Service-StorageKeys -UserPrincipalName johny@aaddemo.com -PermissionsToSecrets list,get
[/code]

Grant access to user to read secrets from Key Vault

3.2.2 Access Storage Account as non-admin user by reading key from Key Vault

John Yokum does not have any direct access to the Storage Account, but he has been granted read access to the Storage Account key stored in the Key Vault. Here he uses the Get-AzureKeyVaultSecret cmdlet to set the AZURE_STORAGE_CONNECTION_STRING environment variable and manages the Storage Account data using Azure PowerShell.

[code lang="powershell"]
Switch-AzureMode AzureResourceManager
$env:AZURE_STORAGE_CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=prodservice;AccountKey=" + (Get-AzureKeyVaultSecret -VaultName PROD-Service-StorageKeys -Name prodservice).SecretValueText
Switch-AzureMode AzureServiceManagement
Get-AzureStorageContainer
[/code]

Access Storage Account Data as Non-Admin User using PowerShell 1

And when the key gets regenerated, John receives an access denied error from Azure Storage, and he resets the connection string by retrieving the new key from the Key Vault.

Access Storage Account Data as Non-Admin User using PowerShell 2

3.2.3 Revoke user’s access to read the Storage Account key from the Key Vault

If John’s access to the Storage Account needs to be revoked, just revoke his access to the Key Vault that holds the Storage Account key, and within an hour John will lose access to the Storage Account data. Use the Remove-AzureKeyVaultAccessPolicy cmdlet to revoke access to the Key Vault.

[code lang="powershell"]
Switch-AzureMode AzureResourceManager
Import-Module Z:\Scratch\KeyVaultManager
Remove-AzureKeyVaultAccessPolicy -VaultName PROD-Service-StorageKeys -UserPrincipalName johny@aaddemo.com
[/code]

Revoke users access to read secrets from Key Vault

Access Storage Account Data as Non-Admin User using PowerShell 3

3.3 Manage these Storage Accounts using Azure Management Portal(s) and Visual Studio

There will be no change in how these Storage Accounts are managed via the management portals and Visual Studio. This is because these clients first retrieve the Storage Account key on behalf of the user and then use the key to access the data in the Storage Account. When the key is re-generated, these clients retrieve the new key.

Access Storage Account Data as Admin using Azure Management Portal 1 Access Storage Account Data as Admin using Azure Management Portal 2 Access Storage Account Data as Admin using Azure Management Portal 3 Access Storage Account Data as Admin using Visual Studio

Users in the classic subscription Service Administrator and Co-Administrators roles can keep managing these Storage Accounts using the classic portal, the new portal and Visual Studio server explorer. Users in the Owner, Contributor and Storage Account Contributor roles will continue to manage the Storage Accounts using the new portal.

3.4 Access these Storage Account from applications

Applications (cloud services, worker roles) use Azure Storage Accounts to store files and data. It is a common pattern to store the Storage Account key in the application’s config and load it at application start-up. This pattern must change for Storage Accounts whose keys are regenerated on a regular basis. For such accounts, applications must retrieve the Storage Account key from the Key Vault and cache it. When the key gets regenerated, the application will receive an HTTP 403 (Forbidden/AccessDenied) from Azure Storage – then, the application must retrieve the regenerated key from the Key Vault.
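The retrieve-cache-refresh flow can be sketched as a small helper. The function and script blocks below are illustrative stand-ins (in a real application $FetchKey would call Get-AzureKeyVaultSecret or the Key Vault REST API, and $UseKey would be the actual storage operation); this shows the control flow only:

```powershell
# Sketch: cache the key; on failure (e.g. HTTP 403 after a key roll),
# re-fetch the key from Key Vault once and retry the operation.
function Invoke-WithKeyRefresh {
    param(
        [scriptblock]$FetchKey,  # stands in for reading the secret from Key Vault
        [scriptblock]$UseKey     # stands in for the storage operation; throws on 403
    )
    if (-not $script:cachedKey) { $script:cachedKey = & $FetchKey }
    try {
        & $UseKey $script:cachedKey
    }
    catch {
        # The key was probably regenerated: refresh the cache and retry once.
        $script:cachedKey = & $FetchKey
        & $UseKey $script:cachedKey
    }
}

# Example with fakes: the first cached key is stale and gets rejected.
$script:keyVersion = 0
$fetch = { $script:keyVersion++; "key$($script:keyVersion)" }
$use   = { param($k) if ($k -eq "key1") { throw "403 Forbidden" } else { "ok with $k" } }
Invoke-WithKeyRefresh -FetchKey $fetch -UseKey $use   # returns "ok with key2"
```

The one-retry design works because the Automation job rolls the key at most once per hour: a single refresh is enough to recover, and repeated failures after a refresh mean access was genuinely revoked.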

The Azure Key Vault code samples will get you started on how to access Key Vault secrets from a Web Application.

4. Adding new Storage Accounts to the Key Roller Scheduled Job

As you add more Storage Accounts to the Azure subscription, just modify the AzureKeysRollerDaemon-StorageAccountKeyVaultMapping variable of the Azure Automation job and enable auto key roll for them. You won’t need to modify the code of the runbook.

enjoy!

 

4 thoughts on “Say goodbye to key management – manage access to Azure Storage data using Azure AD”

  1. Marcel van den Dungen

    Hi Dushyant,

    Interesting article. It left me wondering if instead of rotating keys every hour, it wouldn’t be better to create a service handing out SAS tokens.

    Those tokens have a limited lifetime. You can revoke users’ access to the issuing service, preventing them from getting new tokens, but you can also limit the scope of what they can do to the storage account. And you would not need the scheduled job to rotate the storage keys.

    Any thoughts?

    1. dushyantgill Post author

      Marcel, yes – if you have a SAS token service that’s great for app access. That said, it is still a best practice to rotate your keys.

  2. Frank Bosma

    Interesting article, but I have (at least :-)) 1 question: when the key(s) is (are) regenerated, does a VM need a restart to keep access? Or, how does this work?
