Here’s how to add a message to a Storage queue, including the security mechanisms appropriate to test, development and the Valet Key pattern.
In my previous post, I covered how to configure and secure a queue in a storage account. In this post, I’m going to cover how to add messages to the queue from a server-side or client-side frontend.
For my server-side frontend, I’m assuming that you’ll authorize access to your queue using a Managed Identity. For the client-side frontend, you’ll authorize using an Application Registration.
At the end of this post, I’ll also look at the code to authorize your application using either a connection string or a shared access signature (SAS) generated as part of the Valet Key pattern.
For my server-side example using a Managed Identity, I’ll cover both the authorization code and the code for adding a message to a Storage queue (I covered configuring a queue to use a Managed Identity in my previous post on configuring the Storage queue).
For my client-side example using an App Registration, I’m going to skip over the code for claiming an Application Registration (I covered that in part 9a of this series). In this post, for the client-side frontend, I’ll just cover the code for adding a message to a Storage queue from client-side code.
To add a message to a Storage queue, in either a client-side or server-side frontend, you’ll need the URL for the queue. You can get that by surfing to your storage account and, from the menu down the left side, selecting Data Storage | Queues. The list of queues on the right displays the URL for each of your queues—you just need to copy the URL for the queue you want your frontend to add messages to.
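If you’d rather construct the URL than copy it, queue URLs in the public Azure cloud follow a predictable pattern: https://&lt;storage account name&gt;.queue.core.windows.net/&lt;queue name&gt;. Here’s a minimal TypeScript sketch of a helper (my own, not SDK code) using the account and queue names from my case study:

```typescript
// Builds the URL for a queue in the public Azure cloud from the
// storage account name and the queue name
function buildQueueUrl(accountName: string, queueName: string): string {
    return `https://${accountName}.queue.core.windows.net/${queueName}`;
}

// The queue from my case study:
const qUrl: string = buildQueueUrl("warehousemgmtproducts", "updateproductinventory");
console.log(qUrl);
// https://warehousemgmtproducts.queue.core.windows.net/updateproductinventory
```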
The message that I’m going to add to my queue includes an array of products required for the transaction being processed, a correlation id associated with the business transaction and a string field (to support basically anything else I need to add to the message). To support adding all of that in one message, I created a Data Transfer Object (DTO) to hold the data to be added to the queue.
In C#, for my server-side code, that DTO looks like this:
public class QueueDTO
{
    public IList<Product> Products { get; set; }
    public string CorrelationId { get; set; } = string.Empty;
    public string Msg { get; set; } = string.Empty;
}
In TypeScript, for my client-side code, my DTO looks like this:
type QueueDTO =
{
    Products: Product[],
    CorrelationId: string,
    Msg: string
};
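One thing to watch: the server-side and client-side DTOs both end up as JSON text in the same queue, so the property names, including their casing, need to line up with whatever your backend deserializes. A quick sketch (with a hypothetical, minimal Product type) of the JSON a client-side DTO produces:

```typescript
// Hypothetical minimal Product type, for illustration only
type Product = { Id: number, Quantity: number };

type QueueDTO =
{
    Products: Product[],
    CorrelationId: string,
    Msg: string
};

const qDTO: QueueDTO = {
    Products: [{ Id: 1, Quantity: 3 }],
    CorrelationId: "abc-123",
    Msg: ""
};

// The JSON written to the queue uses exactly these names, so the
// backend reading the message must deserialize with matching casing
const json: string = JSON.stringify(qDTO);
const roundTrip = JSON.parse(json) as QueueDTO;
console.log(roundTrip.CorrelationId); // abc-123
```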
In a server-side ASP.NET Core frontend, you first need to add the Azure.Storage.Queues NuGet package to your project (if you’re using Visual Studio’s Manage NuGet Packages page, search for the package using “azure storage queues”).
My next step was to use the following code to create my QueueDTO object and set its properties. Since only strings can be written to a Storage queue, I then converted my DTO into a string using the .NET JSON serializer:
QueueDTO qdto = new()
{
    Products = productList,
    CorrelationId = correlationId
};
string qMsg = JsonSerializer.Serialize<QueueDTO>(qdto);
With my message created, the next step is to create a QueueClient object to access the queue. The first step in that process is to create a Uri object from the URL for the queue. Here’s the code for the queue in my case study:
Uri qURL = new(
"https://warehousemgmtproducts.queue.core.windows.net/updateproductinventory"
);
The next step is to create the credentials that will authorize adding messages to your queue. You could use (as I have in previous posts) the DefaultAzureCredential object, which will try to authorize the request to add a message to your queue using “whatever means necessary.” However, because I want my application to always use the Managed Identity I’ve assigned to the App Service, I can be more specific and use the ManagedIdentityCredential object instead.
The ManagedIdentityCredential object needs to be passed the client id of the Managed Identity that is a) assigned to the App Service and b) has been assigned a role that supports adding messages to the Storage queue. To get that identity’s client id, surf to your Managed Identity in the Azure Portal and copy the Client ID value from its Overview page.
You can then paste that client id into code like this to create a ManagedIdentityCredential at run time:
ManagedIdentityCredential qMI = new("<Managed Identity Client Id>");
My next step isn’t necessary, but Microsoft’s documentation tells me it will improve interoperability for my message: I created a QueueClientOptions object and used it to specify that my message was to use Base64 encoding:
QueueClientOptions qOpts = new()
{
    MessageEncoding = QueueMessageEncoding.Base64
};
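One side effect of choosing Base64 encoding: any other producer writing to this queue has to produce compatible messages. As far as I can tell, the client-side @azure/storage-queue package has no equivalent option, so a Node-based client would encode the message itself before sending, along these lines (a sketch using Node’s Buffer; a browser client would use btoa/atob instead):

```typescript
// Sketch: Base64-encode the JSON message before sending so that a
// backend reading with QueueMessageEncoding.Base64 can decode it
const qMsg: string = JSON.stringify({ Products: [], CorrelationId: "abc-123", Msg: "" });

const encoded: string = Buffer.from(qMsg, "utf-8").toString("base64");
// ...pass `encoded` (rather than qMsg) to sendMessage...

// Decoding, as the backend would:
const decoded: string = Buffer.from(encoded, "base64").toString("utf-8");
console.log(decoded === qMsg); // true
```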
With all that in place, you can now create a QueueClient object that is tied to your Storage queue and has permission to add messages to the queue. Just instantiate a QueueClient object, passing the:
- Uri object that holds your queue’s URL
- ManagedIdentityCredential object that ties to the Managed Identity that has the necessary permissions
- QueueClientOptions object that specifies the message encoding

With the QueueClient object created, you can then use its SendMessageAsync method to add a message to the queue, passing your message as a JSON string. Here’s all that code:
QueueClient qc = new(qURL, qMI, qOpts);
await qc.SendMessageAsync(qMsg);
One warning before you start testing your code: If you’ve only recently assigned your Managed Identity to either your App Service or your Storage queue, don’t rush to publish your frontend to your App Service so that you can try it out. It can take a few minutes for the identity’s permissions to propagate. Go get a coffee.
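If you’d rather have your frontend wait out a short propagation delay instead of failing, one option is to wrap the send in a generic retry helper. Here’s a minimal sketch in TypeScript (the same pattern works in C#); the helper and its names are my own, not part of any Azure SDK:

```typescript
// Sketch of a generic retry helper: runs an async operation, retrying
// up to maxAttempts times with a fixed delay between attempts
async function withRetry<T>(
    operation: () => Promise<T>,
    maxAttempts: number,
    delayMs: number
): Promise<T> {
    let lastError: unknown;
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        try {
            return await operation();
        } catch (err) {
            lastError = err;
            if (attempt < maxAttempts) {
                await new Promise((resolve) => setTimeout(resolve, delayMs));
            }
        }
    }
    throw lastError;
}

// Usage against a queue client might look like:
// await withRetry(() => qc.sendMessage(qMsg), 5, 30000);
```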
In your client-side app, in addition to writing the code to add your message to your queue, you must configure your queue’s CORS settings to accept requests from your client-side app.
To configure your storage account’s CORS settings so that your queue will accept a request from your client-side TypeScript/JavaScript frontend’s domain, you’ll need the URL for your client-side frontend. The simplest (and most reliable) way to get that URL is to run your frontend and copy the URL for it out of your browser’s address bar.
Once you have that URL, surf to your storage account and:
- From the menu down the left side, select Settings | Resource sharing (CORS)
- Select the Queue service tab
- Paste your frontend’s URL into the Allowed origins column (omit any trailing slash)
- In the Allowed methods column, select the methods your frontend will use (POST is sufficient for adding messages)
- Set the Allowed headers and Exposed headers columns to *

Save your changes by clicking the Save icon in the menu bar at the top of the page.
With your storage account’s CORS settings configured, you’re almost ready to start writing code. First, add the @azure/storage-queue package to your frontend application with this command:
npm install @azure/storage-queue
My next step was to create my DTO object, set its properties to the values I wanted and convert the DTO into a JSON string:
const qDTO: QueueDTO =
{
    Products: productsList,
    CorrelationId: correlationid,
    Msg: ""
};
const qMsg: string = JSON.stringify(qDTO);
To get the permissions necessary to access your queue, you’ll need an InteractiveBrowserCredential tied to your App Registration. You must pass the InteractiveBrowserCredential the Application (client) ID and Tenant ID from your frontend’s App Registration. That code will look something like this:
const qCred: InteractiveBrowserCredential = new InteractiveBrowserCredential(
{
    clientId: "d11…-…-…-…-…040",
    tenantId: "e98…-…-…-…-…461"
}
);
You can now create a QueueClient object, passing the full URL for your queue (you can get that from the list of queues in your storage account) and your InteractiveBrowserCredential. Once you’ve created the QueueClient object, you can use its sendMessage method, passing your DTO as a JSON string, to add a message to your queue (the method is asynchronous, so you should use the await keyword with it):
const qc:QueueClient = new QueueClient(
"https://warehousemgmtproducts.queue.core.windows.net/updateproductinventory",
qCred
);
await qc.sendMessage(qMsg);
With that code written, you’re ready to deploy your code to your App Service and test it.
There is a simpler method to give your frontend (client-side or server-side) access to your queue than using an App Registration or Managed Identity: a connection string.
Using a connection string gives your application unfettered access to your queue. In production, you’ll probably prefer the more restricted access that you can create using Managed Identities or App Registrations (in conjunction with user permissions). However, for testing functionality or for proof-of-concept code, you may prefer to use a connection string. Be aware, though, that you can only use a connection string in TypeScript/JavaScript code running in the Node.js environment (i.e., on the server, in an App Service).
You can only use a connection string if you left “Enable storage account key access” turned on when you created your storage account.
If you have turned that option off, you can re-enable it in your storage account: Surf to your storage account and select the Settings | Configuration choice in the menu down the left side. Find the “Allow storage account key access” option and set it to Enabled.
The first step in using a connection string is to retrieve one of the automatically generated connection strings for your storage account: Surf to your storage account, select Security + networking | Access keys from the menu down the left side, click the Show button beside either key’s connection string and copy it.
To use your connection string in a server-side application in C#, instantiate the QueueClient object, passing that copied connection string and your queue name as strings. (Be sure to flag the connection string with the @ verbatim string prefix. Without it, C# will treat any backslashes in the string as the start of escape sequences.) Typical code will look like this:
QueueClient qc = new(@"<connection string>",
    "<queue name>");
You can also create a QueueClient by passing it a StorageSharedKeyCredential created with just the Access Key portion of your connection string (I’ll show that code in the section on creating a Valet Key).
Again: If you’re going to use a connection string in a production application (and you shouldn’t), you should keep the string in a Key Vault. Even with your connection string in the Key Vault, you should consider retrieving the values from Key Vault through an App Service’s environment variables.
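For context, a connection string is just a semicolon-delimited list of key=value settings (DefaultEndpointsProtocol, AccountName, AccountKey and EndpointSuffix, among others). This sketch pulls the individual settings, including the Access Key, out of a made-up connection string (my own helper, not SDK code):

```typescript
// Parses a storage connection string into its key/value settings.
// Only the first "=" separates key from value: the value itself
// (e.g., a Base64 AccountKey) may contain "=" characters.
function parseConnectionString(connectionString: string): Map<string, string> {
    const settings = new Map<string, string>();
    for (const pair of connectionString.split(";")) {
        const separator = pair.indexOf("=");
        if (separator > 0) {
            settings.set(pair.substring(0, separator), pair.substring(separator + 1));
        }
    }
    return settings;
}

// A made-up example (not a real account or key):
const cs = "DefaultEndpointsProtocol=https;AccountName=warehousemgmtproducts;" +
    "AccountKey=abc123==;EndpointSuffix=core.windows.net";
const settings = parseConnectionString(cs);
console.log(settings.get("AccountName")); // warehousemgmtproducts
console.log(settings.get("AccountKey")); // abc123==
```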
In TypeScript or JavaScript, you can pass your connection string to the QueueServiceClient class’s static fromConnectionString method. That method returns a QueueServiceClient object authorized to access your storage account; its getQueueClient method will then give you a QueueClient object for your queue. You can use that QueueClient object’s sendMessage method to add your message to the queue:
const qsc:QueueServiceClient =
QueueServiceClient.fromConnectionString("<connection string>");
const qc:QueueClient = qsc.getQueueClient("updateproductinventory");
await qc.sendMessage(qMsg);
For the Valet Key pattern, I’m going to assume that you’ll use a server-side resource to generate an SAS that you will then pass to a client (server-side or client-side). The client will then pass the SAS to a QueueClient object to access the queue.
You have three options for generating an SAS in server-side code: generating the SAS directly, generating it from an Access Policy created in memory in your code, or generating it from an Access Policy assigned to the queue at design time.
Your storage account will, in Settings | Configuration, need to have the “Allow storage account key access” option enabled. If you’re accessing the queue using a Managed Identity, that Managed Identity must have the Storage Queue Delegator role assigned to it.
To directly generate an SAS, first create a QueueSasBuilder object, passing it the permissions you want to grant and the time at which the SAS expires. This code creates a builder object that grants permission to add messages for the next two minutes and ties the SAS to the updateproductinventory queue:
QueueSasBuilder qSASBldr = new(QueueSasPermissions.Add,
    DateTimeOffset.UtcNow.AddMinutes(2))
{
    QueueName = "updateproductinventory"
};
You must next create a QueueClient object using a StorageSharedKeyCredential. You can create the credential by passing it the name of the storage account and one of the keys from your storage account’s Security + networking | Access keys page. Once you have the credential, you can pass it, along with the URL for your queue wrapped in a Uri object, to create your QueueClient:
StorageSharedKeyCredential sskc = new("warehousemgmtproducts",
    "<account access key>");
QueueClient qcc = new(new Uri("<url for the queue>"), sskc);
Once you have the QueueClient created, you can generate the SAS (wrapped in a Uri object) by calling the QueueClient’s GenerateSasUri method, passing the builder object:
Uri sasUri = qcc.GenerateSasUri(qSASBldr);
The client would then use the returned URL to create its own QueueClient object.
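The SAS the client receives is just the queue’s URL with a query string appended, carrying (among other things) the granted permissions (sp), the expiry time (se) and the signature (sig). The client can inspect those values with standard URL parsing; this TypeScript sketch uses a made-up SAS URL:

```typescript
// A made-up SAS URL for illustration (the sv and sig values are not real)
const sasUrl =
    "https://warehousemgmtproducts.queue.core.windows.net/updateproductinventory" +
    "?sv=2024-05-04&sp=a&se=2025-01-01T00%3A02%3A00Z&sig=FAKESIGNATURE";

const parsed = new URL(sasUrl);
const permissions = parsed.searchParams.get("sp"); // "a" = Add
const expiry = parsed.searchParams.get("se");      // when the SAS stops working
console.log(permissions, expiry);
```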
Once again: The application should be retrieving the Access Key from the Key Vault and, ideally, through one of the App Service’s environment variables.
Instead of setting your start/expiry times and permissions in code, you can use an Access Policy to specify time and allowed operations. You can generate an Access Policy in your server-side code and use that in-memory Access Policy to generate an SAS. However, that method requires the same inputs as generating an SAS without a policy (see the previous section) and requires more code.
Using Access Policies makes more sense if you’re leveraging an Access Policy assigned to the queue at design time. A URL created from an Access Policy already assigned to the queue contains no restrictions—the URL just includes the name of the specified Access Policy. This strategy enables you to change the permissions being granted by your application by updating the policy in the Azure Portal rather than rewriting and redeploying your application (including deleting the policy if you want to disable any clients from using it).
To generate an SAS from an Access Policy, just create a QueueSasBuilder object and set its Identifier property to the name of a policy assigned to the queue. After that, pass that builder object to a QueueClient object’s GenerateSasUri method, which will return an SAS/URL (wrapped inside a Uri object) that you can then pass to the client. Again, the QueueClient object will have to have been created using a StorageSharedKeyCredential object (see above).
This code creates an SAS that uses an Access Policy name:
QueueSasBuilder qSASBldr = new()
{
    QueueName = "sasqueue",
    Identifier = "<access policy name>"
};
Uri sasUri = qcc.GenerateSasUri(qSASBldr);
The client would then use the returned URL to create its own QueueClient object with the permissions specified in the Access Policy.
While I haven’t taken advantage of it here, when you add a message to a queue you can specify both when the message becomes visible to your backend processor and how long the message will remain available to your backend processor. (This allows you to handle requests that must be processed within some time period.) And, while I’ve added my message to the queue as a string, there’s also an overload of the SendMessageAsync method that accepts a BinaryData object.
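On the client side, my understanding is that those two settings are the visibilityTimeout and messageTimeToLive options (both in seconds) accepted by @azure/storage-queue’s sendMessage method. A sketch, treating the option names as an assumption to verify against the SDK docs:

```typescript
// Sketch: delay a message's visibility by one minute and discard it
// if it isn't processed within one hour. The option names below are
// my reading of @azure/storage-queue's QueueSendMessageOptions;
// both values are in seconds.
const sendOptions = {
    visibilityTimeout: 60,       // invisible to the backend for 60 seconds
    messageTimeToLive: 60 * 60   // deleted after 3,600 seconds if unread
};
// ...then: await qc.sendMessage(qMsg, sendOptions);
```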
In my next post, I’m going to look at the code for a backend server-side processor that reads the messages your frontend has added to the queue.
Peter Vogel is both the author of the Coding Azure series and the instructor for Coding Azure in the Classroom. Peter’s company provides full-stack development from UX design through object modeling to database design. Peter holds multiple certifications in Azure administration, architecture, development and security and is a Microsoft Certified Trainer.