I'm trying to recreate something I did in AWS using Cognito User Pool and Identity Pool. The user was able to login and receive temporary tokens that allowed direct access to an s3 bucket. See here for more info on that. I would like my B2C users to be able to login to my SPA and list containers and blobs and get blobs. I've successfully implemented logging in using MSAL (@azure/msal-browser) with auth flow, but I cannot figure out how to provide access tokens for the storage account (or ANY azure resource for that matter). I've run around in circles in the documentation for days, so if you link a docs page, I'd appreciate some elaboration because I'm obviously not understanding something.
Accessing Storage with a token obtained through a B2C user flow or custom policy is not supported. Since you cannot create a storage account in your Azure AD B2C tenant, you need to create the storage account in your regular Azure AD tenant and add the user from your B2C tenant to that tenant as a guest so they can access Blob Storage.
For example: the email of my B2C user is [email protected].
For data operations, the user also needs a data-plane role on the storage account, such as Storage Blob Data Contributor (or Storage Blob Data Reader for read-only access).
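Once the guest user has that role assignment, the Storage SDK can authenticate them with an Azure AD token instead of a key or SAS. A minimal C# sketch of that, assuming a placeholder account name mystorageacct and using InteractiveBrowserCredential from Azure.Identity (any TokenCredential flow that signs the user into the home tenant would do):

using System;
using Azure.Identity;
using Azure.Storage.Blobs;

// Signs the user in against the home tenant (not the B2C tenant) and
// attaches their Azure AD token to each Storage request.
var serviceClient = new BlobServiceClient(
    new Uri("https://mystorageacct.blob.core.windows.net"),
    new InteractiveBrowserCredential());

// Works once the guest has a Storage Blob Data role on the account.
await foreach (var container in serviceClient.GetBlobContainersAsync())
{
    Console.WriteLine(container.Name);
}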
For more details, refer to this SO thread.
I have the same issue. Not being able to solve it that way, I resolved it differently. My SPA first makes a call to the API, and the API, which has full permissions, creates a container and a SAS token for that container with create/write access for 5 minutes. The API then returns the full URI of the blob to be created, along with the SAS token, to the UI. The UI then uses that URI to make an authorized call to create a new blob in that container.
Here is the POC code I used to validate that this solution works.
1st: the code for creating the full blob URI containing the SAS token:
using System;
using System.Collections.Generic;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// One container per user; each blob gets a random GUID name.
Guid blobId = Guid.NewGuid();
string blobName = blobId.ToString();
string containerName = _requestStateService.CurrentUserId.ToString();

// The connection string (and account key) never leaves the API.
var blobServiceClient = new BlobServiceClient("DefaultEndpointsProtocol=https;AccountName=xxxxxxxx;AccountKey=yyyyyy;EndpointSuffix=core.windows.net");

IDictionary<string, string> containerMetadata = new Dictionary<string, string>
{
    ["test"] = "test meta"
};

// Create the user's container on first use.
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
if (!await containerClient.ExistsAsync())
{
    containerClient = await blobServiceClient.CreateBlobContainerAsync(
        containerName, metadata: containerMetadata);
}

// Container-scoped ("c") SAS, valid for 5 minutes, limited to create/write/tag.
BlobSasBuilder containerSasBuilder = new BlobSasBuilder()
{
    BlobContainerName = containerName,
    Resource = "c",
    ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(5),
};
containerSasBuilder.SetPermissions(
    BlobSasPermissions.Write | BlobSasPermissions.Create | BlobSasPermissions.Tag);

// GenerateSasUri works here because the client was created with shared-key credentials.
string containerSASToken = containerClient.GenerateSasUri(containerSasBuilder).Query.TrimStart('?');

// Full blob URI the UI can PUT to directly.
string blobSASURI = containerClient.Uri.AbsoluteUri + "/" + blobName + "?" + containerSASToken;
return blobSASURI;
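To round out the flow, this SAS-minting code sits behind an authorized API endpoint that the SPA calls first. A hypothetical ASP.NET Core minimal-API sketch, where the route and the IBlobUploadUriFactory wrapper around the code above are illustrative, not part of the original POC:

// Hypothetical endpoint: the API authorizes the signed-in user, then
// returns the short-lived blob upload URI built by the code above.
app.MapGet("/api/uploads/new", async (IBlobUploadUriFactory factory) =>
{
    string blobSasUri = await factory.CreateBlobUploadUriAsync();
    return Results.Ok(new { uploadUri = blobSasUri });
})
.RequireAuthorization(); // only authenticated users can mint upload URIs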
2nd: the code showing how another app, such as the UI, can use this URI to upload the blob:
// The UI side needs no Storage SDK: a plain HTTP PUT against the SAS URI.
var client = _clientFactory.CreateClient();
var content = new StringContent("Write new file");

// Headers required by the Blob REST API for a block blob upload.
content.Headers.Add("x-ms-version", "2020-04-08");
content.Headers.Add("x-ms-blob-type", "BlockBlob");

// Optional metadata and tags for the new blob.
content.Headers.Add("x-ms-meta-test1", "test meta 1");
content.Headers.Add("x-ms-meta-test2", "test meta 2");
content.Headers.Add("x-ms-tags", "tag1=aaa&tag2=bbb");

var response = await client.PutAsync(blobSASURI, content);
response.EnsureSuccessStatusCode();
For the containers, I chose to have one per user, holding all of that user's blobs.
To read the blobs, I can create SAS tokens every time they need to be fetched. I know this is not the simplest solution, but it meets my needs for authorization, since the API has to be called first and can check and authorize the user and decide which blobs they should be able to access.
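A read SAS for an individual blob can be minted the same way. A short sketch, assuming the same containerClient and blobName from the first snippet:

using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// Blob-scoped ("b") SAS, read-only, short-lived.
var readSasBuilder = new BlobSasBuilder()
{
    BlobContainerName = containerClient.Name,
    BlobName = blobName,
    Resource = "b",
    ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(5),
};
readSasBuilder.SetPermissions(BlobSasPermissions.Read);

// The UI can GET this URI directly to download the blob.
BlobClient blobClient = containerClient.GetBlobClient(blobName);
Uri readUri = blobClient.GenerateSasUri(readSasBuilder);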