
AdamWhite

Posted on • Originally published at codewithadam.com

Generating SAS Tokens using Azure Managed Identity (User Delegation)

With Azure Blob Storage it's possible to generate a Shared Access Signature (SAS) that grants a third party limited access to a blob. That access can be limited in time and restricted to specific actions, such as Read or Write, on a specific file held within blob storage. You can also provide access to the entire blob container if you wish.

There is a new way to generate these tokens in the new Blob Storage SDK, and the big thing here is the ability to generate SAS tokens without a storage account key.

User Delegation SAS

The typical way to generate a SAS token in code requires the storage account key, and there are scenarios where you just won't have access to that. This is the situation I found myself in. You'll find many answers online, but none of them lead you down the right path if you are as unlucky as I was.
If you use Managed Identity to control access to your storage accounts in code, which I highly recommend wherever possible as it's a security best practice, you won't have a storage account key, so you'll need to find another way to generate the shared access signatures.

To do that, we need an approach called a "user delegation" SAS. With a user delegation SAS, we sign the signature with Azure AD credentials instead of the storage account key.

I'll show you below exactly what code you need to generate the user delegation SAS URI with the .NET storage SDK. I'll also cover a few gotchas that caught me out, and how to test this locally within Visual Studio.

Generating a User Delegation SAS

Connecting to Azure Storage using Azure Active Directory Credentials is made incredibly easy thanks to the DefaultAzureCredential. This helper class tries a variety of different techniques to source the credentials required to access a storage account.

Firstly, it checks for environment variables. If these aren't present, it attempts to use a managed identity (this is what you want in production!). Should that fail, it has a range of fallback options that are great for local development: it can use the credentials you signed in to Visual Studio, Visual Studio (VS) Code, or the Azure CLI with, so it will work across a range of development environments. I'll show you below how to set the identity you want Visual Studio to use, which can be the one you've logged in with or something else entirely.

Here's how you would use the DefaultAzureCredential to create a BlobServiceClient:

using System.Text;
using Azure.Identity;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;
var storageAccountName = "ducksandbadgersstore";
var storageAccountUriString = $"https://{storageAccountName}.blob.core.windows.net";
var credential = new DefaultAzureCredential();
var blobServiceClient = new BlobServiceClient(new Uri(storageAccountUriString), credential);

With that, you've successfully created a blob service client using managed identity. You can test this by uploading a file into the storage account.

// Upload a small test blob if it doesn't already exist
var blobContainerClient = blobServiceClient.GetBlobContainerClient("duckcontainer");
var blobClient = blobContainerClient.GetBlobClient("duck.txt");
if (!await blobClient.ExistsAsync())
{
    using var ms = new MemoryStream(Encoding.UTF8.GetBytes("This is my secret blob"));
    await blobClient.UploadAsync(ms);
}

All being well, you should now have a file in your blob storage container, which proves that managed identity is working.
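
If you'd rather verify access without uploading anything, a quick alternative (just a sketch, reusing the blobContainerClient from above) is to list what's in the container; if this enumerates without throwing, the credential resolved by DefaultAzureCredential has access:

// List the blobs in the container as a read-only check of the managed identity
await foreach (var blobItem in blobContainerClient.GetBlobsAsync())
{
    Console.WriteLine(blobItem.Name);
}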

Now let's generate a shared access signature. The first step is to create a user delegation key.

The user delegation key (and therefore any SAS token signed with it) can only be valid for a maximum of 7 days; you'll get an error if you request a longer duration. You'll also get an error if you mess up the dates. I once passed in the same start and end date of "Now" due to a config issue.
When that happens, you'll see an HTTP status code of 400 (Bad Request) with the error "The value for one of the XML nodes is not in the correct format", which isn't the most helpful message. If you get this, check the values and make sure they make sense; asking for a SAS token that immediately expires isn't sensible!

Getting the user delegation key is this simple:

var userDelegationKey = await blobServiceClient
    .GetUserDelegationKeyAsync(DateTimeOffset.UtcNow, 
                               DateTimeOffset.UtcNow.AddDays(7));
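If you do hit the 400 described above, it surfaces from the SDK as a RequestFailedException. Here's a minimal sketch of catching it to produce a clearer message; the variable name and the decision to log and rethrow are just illustrative:

try
{
    var delegationKeyResponse = await blobServiceClient
        .GetUserDelegationKeyAsync(DateTimeOffset.UtcNow,
                                   DateTimeOffset.UtcNow.AddDays(7));
}
catch (Azure.RequestFailedException ex) when (ex.Status == 400)
{
    // The "XML nodes" error described above usually means the start/expiry
    // values don't make sense (for example, an identical start and end date).
    Console.WriteLine($"User delegation key request rejected: {ex.Message}");
    throw;
}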

We can use the user delegation key with the BlobSasBuilder and BlobUriBuilder helpers to generate the SAS token URI. You can grant read access to a file, or provide a token that allows someone write access to upload into your container (I'll show a container-scoped variation after the download test below).
In the example below, I'm asking for a SAS token that's valid for 7 days for a specific file.
Do note that the SAS token doesn't need to have the same lifetime as the user delegation key, but it cannot be longer. If you attempt to create a SAS token URI with a lifespan longer than that of the user delegation key, you will get a 403 error response.

var sasBuilder = new BlobSasBuilder()
{
    BlobContainerName = blobClient.BlobContainerName, // duckcontainer from above
    BlobName = blobClient.Name, // duck.txt from above
    Resource = "b", // b for blob, c for container
    StartsOn = DateTimeOffset.UtcNow,
    ExpiresOn = DateTimeOffset.UtcNow.AddDays(7),
};

sasBuilder.SetPermissions(BlobSasPermissions.Read |
                        BlobSasPermissions.Write); // read and write permissions

var blobUriBuilder = new BlobUriBuilder(blobClient.Uri)
{
    Sas = sasBuilder.ToSasQueryParameters(userDelegationKey,
                                        blobServiceClient.AccountName)
};

var sasUri = blobUriBuilder.ToUri();

The sasUri can be used to download the file until either the SAS token or the user delegation key expires; whichever happens first will invalidate the SAS token.

To test the SAS token URI, here's a simple bit of code that will download the file's contents.

var httpClient = new HttpClient();
try
{
    var blobContentsString = await httpClient.GetStringAsync(sasUri).ConfigureAwait(false);
    Console.WriteLine(blobContentsString);
}
catch (HttpRequestException e)
{
    Console.WriteLine("Sas token failed - Unable to download: " + e.Message);
}

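As mentioned earlier, the same builder can produce a container-scoped token, for example to let someone upload into the container without ever seeing an account key. Here's a sketch of that variation; the only differences from the blob example are the Resource value of "c", the missing BlobName, and the use of BlobContainerSasPermissions:

var containerSasBuilder = new BlobSasBuilder()
{
    BlobContainerName = blobContainerClient.Name, // duckcontainer from above
    Resource = "c", // c = container-level token, so no BlobName is set
    StartsOn = DateTimeOffset.UtcNow,
    ExpiresOn = DateTimeOffset.UtcNow.AddDays(1),
};

// Container-level permissions use BlobContainerSasPermissions rather than BlobSasPermissions
containerSasBuilder.SetPermissions(BlobContainerSasPermissions.Write |
                                   BlobContainerSasPermissions.Create);

var containerUriBuilder = new BlobUriBuilder(blobContainerClient.Uri)
{
    Sas = containerSasBuilder.ToSasQueryParameters(userDelegationKey,
                                                   blobServiceClient.AccountName)
};

var containerSasUri = containerUriBuilder.ToUri();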

Testing Locally

If you attempt to test this locally, or from a service in Azure, you may find that it doesn't work. There's a reason for that: you need to give the identity accessing the storage account the right RBAC roles.

  • Storage Account Contributor
  • Storage Blob Data Contributor

This may surprise you, as being the owner of the storage account isn't sufficient.

You can test this with the Azure CLI, and even see if you can tighten the permissions, by attempting the following.

$ACCOUNT_NAME = "ducksandbadgersstore"
$CONTAINER_NAME = "duckcontainer"

# use this to test if you have the correct permissions
az storage blob exists --account-name $ACCOUNT_NAME `
                        --container-name $CONTAINER_NAME `
                        --name duck.txt --auth-mode login

If you haven't assigned the right RBAC permissions within the storage account, the above will fail. Head into Azure, go to your storage account, then Access Control (IAM), then Role assignments to set the permissions. Take a look at this article to see the various ways to grant the relevant RBAC permissions.

I personally do this within the CLI, and it's this simple.
First, look up your Azure AD object ID using your email address:

$EMAIL_ADDRESS = 'adam@redturtlesoftware.com'
$OBJECT_ID = az ad user list --query "[?mail=='$EMAIL_ADDRESS'].objectId" -o tsv

Now we need the id of the storage account to set the RBAC permissions on:

$STORAGE_ID = az storage account show -n $ACCOUNT_NAME --query id -o tsv

This will return a string that contains the subscription ID, resource group, and storage account name. For example:
/subscriptions/770476dd-69ac-465d-96cc-gh12bc676chk/resourceGroups/badger-rg/providers/Microsoft.Storage/storageAccounts/ducksandbadgersstore

With this information, we can assign the Storage Blob Data Contributor role scoped to the storage account container:

az role assignment create `
    --role "Storage Blob Data Contributor" `
    --assignee $OBJECT_ID `
    --scope "$STORAGE_ID/blobServices/default/containers/$CONTAINER_NAME"

If this still doesn't work for you, there is a gotcha to get around when working within Visual Studio: the DefaultAzureCredential may not select the correct Azure AD tenant ID. There are two ways to get around this. First, you can set it in code like this:

var azureCredentialOptions = new DefaultAzureCredentialOptions();
azureCredentialOptions.VisualStudioTenantId = "5546dcdg-6581-66f0-a200-da76560045433s";
var credential = new DefaultAzureCredential(azureCredentialOptions);

Setting Visual Studio's Managed Identity

Secondly, you can go into Visual Studio via Tools -> Options, then Azure Service Authentication. Here you can select an account, or even add another one. I mentioned above that the account you use doesn't have to be the one you're logged into Visual Studio with; this is the settings section where you'd choose that alternative identity.
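
Alternatively, if you'd rather not rely on the Visual Studio settings at all, DefaultAzureCredentialOptions lets you exclude the credential sources you don't want considered. This is just a sketch of narrowing things down to the Azure CLI login (plus managed identity) during local development; the Exclude* properties come from Azure.Identity:

var options = new DefaultAzureCredentialOptions
{
    // Skip the sources we don't want so there's no ambiguity about which
    // local account ends up being used.
    ExcludeEnvironmentCredential = true,
    ExcludeVisualStudioCredential = true,
    ExcludeVisualStudioCodeCredential = true,
    ExcludeSharedTokenCacheCredential = true,
    ExcludeInteractiveBrowserCredential = true
    // ManagedIdentityCredential and AzureCliCredential remain in the chain.
};
var credential = new DefaultAzureCredential(options);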

Summary

Managed Identities can sound a bit scary to those who haven't used them, but they are actually incredibly simple to use and end up making code both simpler and more secure.
In the past, I've had to manage load balancing of storage queues and pass storage account keys around using Azure Key Vault. It's a complication you can simply get rid of.

On top of that, Managed Identities are a much more secure way to access your cloud resources, giving you fine-grained control over what can be done to them. An account key gives complete access; RBAC lets you apply the principle of least privilege.
Sure, when it comes to generating SAS tokens, there is an additional hoop to jump through, but it's still less hassle than setting up Key Vault and passing the secret into the code.

Hopefully, the steps above have shown you how to assign the correct RBAC roles to your local user or managed identity. The C# example code is enough to help you generate that user delegation key, which is the key to SAS token generation with managed identity.

You are limited to a lifetime of 7 days with this approach, but that's a good thing. The longer something is open, the more likely it is to be attacked. It's best practice to keep your SAS tokens alive only for the shortest amount of time necessary.
