Most of the examples and tutorials you'll see online access Azure Blob Storage programmatically using the master storage account key(s), or by generating SAS keys and using those instead. While this certainly works, it does have some drawbacks:
- The master storage key gives far more access than is needed (in most cases)
- If a master storage key is compromised and you regenerate it, every SAS key that was created from that master key becomes invalid and must be recreated
It turns out there's a better way to do it! Azure Blob Storage now supports using RBAC to control access. You can do this with a regular Azure AD user as well, but for the purposes of this post we'll create a Service Principal and show how to use that. The benefit of going this route is that you never have to give out any of your storage account keys, and revoking access is as simple as removing the roles/permissions assigned to that particular service principal!
Here's a high level list of the steps to be performed:
(Note: You'll have to be a Global Administrator in your Azure account to do some of this!)
- Create a service principal
- Create a resource group
- Create a storage account with a few containers in it
- Create two custom RBAC roles - one that allows only READ access to containers, and one that allows only WRITE access
- Assign the roles with the appropriate permission scopes to your service principal record
- Show how to access those resources via the az CLI
- Show how to access those resources via C#
Create the service principal via az CLI:
(Replace "YOUR_SERVICE_PRINCIPAL_NAME" with the name you want to use)
az ad sp create-for-rbac -n "YOUR_SERVICE_PRINCIPAL_NAME" --skip-assignment
This command will output some values that are important to note - make sure you save off the "password" and "appId" values from the output!
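For reference, the output should look roughly like this (placeholder GUIDs shown; the exact fields can vary slightly by CLI version). The "appId" value is the application ID, and "password" is the client secret:
{
  "appId": "00000000-0000-0000-0000-000000000000",
  "displayName": "YOUR_SERVICE_PRINCIPAL_NAME",
  "name": "http://YOUR_SERVICE_PRINCIPAL_NAME",
  "password": "00000000-0000-0000-0000-000000000000",
  "tenant": "00000000-0000-0000-0000-000000000000"
}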
Create the resource group via az CLI:
(Replace "YOUR_RESOURCE_GROUP_NAME" with the name you want to use)
az group create -l eastus -n YOUR_RESOURCE_GROUP_NAME
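If you want to double-check that the group was created, az group show will return its details:
az group show -n YOUR_RESOURCE_GROUP_NAME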
Create the storage account and some containers via az CLI:
(Replace "YOUR_RESOURCE_GROUP_NAME" with the name of your created resource group, and "YOUR_STORAGE_ACCOUNT_NAME" with the name you want to use)
az storage account create -n YOUR_STORAGE_ACCOUNT_NAME -g YOUR_RESOURCE_GROUP_NAME -l eastus --sku Standard_LRS
az storage container create -n readonly --account-name YOUR_STORAGE_ACCOUNT_NAME
az storage container create -n writeonly --account-name YOUR_STORAGE_ACCOUNT_NAME
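To confirm the containers were created, you can list them (note this runs under your own admin login, which falls back to the account keys rather than the SP):
az storage container list --account-name YOUR_STORAGE_ACCOUNT_NAME --output table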
Create the custom RBAC roles via az CLI:
You can technically skip this part and just use some of the built-in Azure RBAC roles for the assignments if you want. In my case, however, I wanted one container to be read-only (i.e., you can only view and download the blobs, not put anything there) and one to be write-only (you can only write new blobs there, not read or download anything). I didn't find a built-in role for the "write only" option, so I decided to make my own custom RBAC roles for both.
First you'll want to create two files, "storage-reader-role-definition.json" and "storage-writer-role-definition.json" with the following contents:
storage-reader-role-definition.json
(Replace "YOUR_SUBSCRIPTION_ID" with the id of your Azure subscription)
{
    "Name": "custom-blob-storage-reader",
    "IsCustom": true,
    "Description": "Ability to list and download blobs from a given container",
    "Actions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/read"
    ],
    "NotActions": [],
    "DataActions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read"
    ],
    "NotDataActions": [],
    "AssignableScopes": [
        "/subscriptions/YOUR_SUBSCRIPTION_ID"
    ]
}
storage-writer-role-definition.json
(Replace "YOUR_SUBSCRIPTION_ID" with the id of your Azure subscription)
{
    "Name": "custom-blob-storage-writer",
    "IsCustom": true,
    "Description": "Ability to write blobs to a given container",
    "Actions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/read"
    ],
    "NotActions": [],
    "DataActions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action"
    ],
    "NotDataActions": [],
    "AssignableScopes": [
        "/subscriptions/YOUR_SUBSCRIPTION_ID"
    ]
}
Now you can use the az CLI to create those custom roles in your Azure AD tenant:
az role definition create --role-definition "storage-reader-role-definition.json"
az role definition create --role-definition "storage-writer-role-definition.json"
Run a quick check to ensure your roles were successfully created:
az role definition list --custom-role-only true
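If you'd rather see just the role names instead of the full JSON, a JMESPath query trims the output down:
az role definition list --custom-role-only true --query "[].roleName" -o tsv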
Assign the roles with the appropriate permission scopes to the service principal record:
Now we can assign these roles, with the appropriate permission scopes, to our service principal account. We want to assign the "storage reader" role scoped to the "readonly" container we created, and the "storage writer" role scoped to the "writeonly" container.
az role assignment create --role "custom-blob-storage-reader" --assignee "YOUR_SERVICE_PRINCIPAL_APPLICATION_ID" --scope "/subscriptions/YOUR_SUBSCRIPTION_ID/resourceGroups/YOUR_RESOURCE_GROUP_NAME/providers/Microsoft.Storage/storageAccounts/YOUR_STORAGE_ACCOUNT_NAME/blobServices/default/containers/readonly"
az role assignment create --role "custom-blob-storage-writer" --assignee "YOUR_SERVICE_PRINCIPAL_APPLICATION_ID" --scope "/subscriptions/YOUR_SUBSCRIPTION_ID/resourceGroups/YOUR_RESOURCE_GROUP_NAME/providers/Microsoft.Storage/storageAccounts/YOUR_STORAGE_ACCOUNT_NAME/blobServices/default/containers/writeonly"
Login and access storage resources via the az CLI:
We're now going to log in with our service principal credentials and make sure we can access the resources as we expect.
Before starting the rest of the test, manually upload an empty file called "testreadonly.txt" to the "readonly" container in your storage account.
First we need to log in. When using a service principal (instead of a regular Azure AD user record), there is no interactive browser login. You can only log in by passing the credentials to the az login command, so let's do that:
Replace the"YOUR_SERVICE_PRINCIPAL_CLIENT_ID" value with the "APPLICATION_ID" you obtained from the output of the create-for-rbac command. Replace the "YOUR_SERVICE_PRINCIPAL_CLIENT_SECRET" value with the "PASSWORD" value you obtained from the create-for-rbac command. And lastly replace "YOUR_TENANT_ID" with your appropriate Azure AD tenant ID as well.
az login --service-principal --username YOUR_SERVICE_PRINCIPAL_CLIENT_ID --password YOUR_SERVICE_PRINCIPAL_CLIENT_SECRET --tenant YOUR_TENANT_ID
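If the login succeeds, you can confirm the subscription context the SP landed in:
az account show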
We should now be logged in as our SP. Let's test out our access and ensure our custom roles and permission scopes are working as expected:
NOTE: For all scenarios below, replace "YOUR_STORAGE_ACCOUNT_NAME" with your storage account name created above.
1) Try to list the files in the "readonly" container, and try to download one as well. Both of these operations should work. Note the "--auth-mode login" parameter in these (and all subsequent) commands; it tells the CLI to use the context of our currently logged-in user (the SP) instead of the storage account key.
az storage blob list --account-name YOUR_STORAGE_ACCOUNT_NAME --container-name readonly --auth-mode login
az storage blob download --account-name YOUR_STORAGE_ACCOUNT_NAME --container-name readonly --name "testreadonly.txt" --file "C:\path\to\file\testreadonly-downloaded.txt" --auth-mode login
2) Try to upload a file to the "writeonly" container. This operation should work.
az storage blob upload --account-name YOUR_STORAGE_ACCOUNT_NAME --container-name writeonly --name "testwriteonly.txt" --file "C:\path\to\file\testwriteonly.txt" --auth-mode login
3) Try to upload a file to the "readonly" container. This operation should fail.
az storage blob upload --account-name YOUR_STORAGE_ACCOUNT_NAME --container-name readonly --name "testwriteonly.txt" --file "C:\path\to\file\testwriteonly.txt" --auth-mode login
4) Try to download a file from the "writeonly" container. This operation should fail.
az storage blob download --account-name YOUR_STORAGE_ACCOUNT_NAME --container-name writeonly --name "testwriteonly.txt" --file "C:\path\to\file\testwriteonly-downloaded.txt" --auth-mode login
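In the two failure cases you should get an authorization error back from the storage service. The exact text varies by CLI version, but it will be something along the lines of:
This request is not authorized to perform this operation using this permission. (ErrorCode: AuthorizationPermissionMismatch)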
Access storage resources with a service principal via C#:
The CLI access method is fine if you just want to use this as a manual process, or perhaps as a scheduled task. But you may also want a background service to authenticate against Azure Storage using the SP. This is also very easy to do utilizing some of the Microsoft NuGet libraries.
I'm doing this with the .NET Framework, but the libraries to do so should also be available for .NET Core/.NET Standard.
Install the following NuGet libraries into your solution:
- WindowsAzure.Storage@9.3.3
- Microsoft.IdentityModel.Clients.ActiveDirectory@5.1.1
Note that below you will want to fill in your own values in the "Fields" section with the values appropriate to your use case.
And then here's the code!
namespace AzureStorageTest
{
    using System;
    using System.Threading.Tasks;
    using Microsoft.IdentityModel.Clients.ActiveDirectory;
    using Microsoft.WindowsAzure.Storage.Auth;
    using Microsoft.WindowsAzure.Storage.Blob;

    public static class Program
    {
        #region Fields

        private const string TenantID = "YOUR_AZURE_TENANT_ID";
        private const string StorageAccountName = "YOUR_STORAGE_ACCOUNT_NAME";
        private const string ClientID = "YOUR_SERVICE_PRINCIPAL_CLIENT_ID";
        private const string ClientSecret = "YOUR_SERVICE_PRINCIPAL_CLIENT_SECRET";

        #endregion

        #region Methods

        public static void Main(string[] args)
        {
            // Container to iterate over - our SP only has read access to "readonly"
            var containerName = "readonly";

            Task.Run(async () =>
            {
                // Exchange the service principal credentials for an OAuth access token
                var token = await Program.GetAccessToken();

                // Wrap the token in the credential types the storage SDK expects
                TokenCredential tokenCredential = new TokenCredential(token);
                StorageCredentials storageCredentials = new StorageCredentials(tokenCredential);

                CloudBlobClient client = new CloudBlobClient(new Uri($"https://{Program.StorageAccountName}.blob.core.windows.net"), storageCredentials);
                CloudBlobContainer container = client.GetContainerReference(containerName);

                // List the blobs in the container and print each blob's URI
                foreach (var blob in container.ListBlobs())
                {
                    Console.WriteLine(blob.StorageUri.PrimaryUri.ToString());
                }
            }).Wait();

            Console.WriteLine("Program Completed");
            Console.ReadKey();
        }

        private static async Task<string> GetAccessToken()
        {
            var authContext = new AuthenticationContext($"https://login.windows.net/{Program.TenantID}");
            var credential = new ClientCredential(Program.ClientID, Program.ClientSecret);

            // "https://storage.azure.com/" is the resource URI for Azure Storage
            var result = await authContext.AcquireTokenAsync("https://storage.azure.com/", credential);
            if (result == null)
            {
                throw new Exception("Failed to authenticate via ADAL");
            }

            return result.AccessToken;
        }

        #endregion
    }
}
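And if you want to exercise the write-only side from C# as well, here's a minimal sketch of an upload. It assumes it's running inside the same async block as above (reusing the "client" variable we built from the SP credentials), and the blob name is just an example:
// Upload a small text blob to the "writeonly" container using the same
// CloudBlobClient that was authenticated with the service principal token.
CloudBlobContainer writeContainer = client.GetContainerReference("writeonly");
CloudBlockBlob newBlob = writeContainer.GetBlockBlobReference("testwriteonly-from-csharp.txt");
await newBlob.UploadTextAsync("Hello from the service principal!");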
Hopefully this helps get you started on securing your Azure Blob Storage with RBAC instead of hard-coded keys or SAS keys!
For more resources on RBAC access in Azure - check out the following links:
- Manage access to Azure resources using RBAC and the Azure portal
- Built-in roles for Azure resources
- Grant access to Azure blob and queue data with RBAC in the Azure portal
- Authenticate with Azure Active Directory from an application for access to blobs and queues
Thanks,
Justin