
I'm testing an Azure Function that has a blob trigger and then writes a message to a message queue. I have "Allow storage account key access" disabled, as we want to use managed identities to access all resources. The problem is that when the code below is triggered, it gives an error:

2024-12-12T01:00:08Z   [Verbose]   Host instance '5xxxxxxxx' failed to acquire host lock lease: Azure.Storage.Blobs: Service request failed.
    Status: 403 (Key based authentication is not permitted on this storage account.)
    ErrorCode: KeyBasedAuthenticationNotPermitted

Permissions are given to the function's managed identity on the storage account: Storage Blob Data Contributor and Storage Queue Data Contributor. The code is as follows, adapted from a sample provided by Microsoft:

    import logging
    import json
    import base64
    import time
    import azure.functions as func
    from azure.identity import DefaultAzureCredential
    from azure.storage.queue import QueueServiceClient
    
    app = func.FunctionApp()
    
    @app.blob_trigger(arg_name="myblob", path="ccr1/{name}",
                                   connection="AzureWebJobsStorage") 
    def blob_trigger(myblob: func.InputStream):
        logging.info(f"Python blob trigger function processed blob"
                    f"Name: {myblob.name}"
                    f"Blob Size: {myblob.length} bytes")
        try:
            message_data = {
                "file_path": myblob.name
            }
            blob_name = myblob.name.split('/')[-1] 
            if blob_name.endswith('.pdf'):
                logging.info(f"Extracted PDF name: {blob_name}")
                ccr_e2e_report(blob_name)  # Send the PDF name to another function
            else:
                logging.warning("The blob is not a PDF file.")
        except Exception as e:
            logging.info(e)
    
    # Example function to send data to
    def ccr_e2e_report(pdf_name: str):
        logging.info(f"Generating CAR report for : {pdf_name}")
        time.sleep(20)
        logging.info(f"Report generation complete: {pdf_name}")
    
    
    @app.route(route="http_trigger", auth_level=func.AuthLevel.ANONYMOUS)
    def http_trigger(req: func.HttpRequest) -> func.HttpResponse:
        logging.info('Python HTTP trigger function processed a request.')

        cnt = int(req.params.get('count'))

        if cnt:
            for number in range(1, cnt + 1):
                message_data = {
                    "file_path": "None",
                    "file_uri": "None",
                    "file_metadata": f"{cnt}"
                }

                # Convert the Python dictionary to a JSON string
                message_json = json.dumps(message_data)

                send_message_to_queue("ccrqueue", message_json)

            return func.HttpResponse(f"Submitted: {cnt} Messages. This HTTP triggered function executed successfully.")
        else:
            return func.HttpResponse(
                "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.",
                status_code=200
            )
    

All my research tells me managed identity access to storage accounts from a function app is supported, but perhaps I'm missing something obvious here; any info would be great. Thanks

  • Update AzureWebJobsStorage to use Managed Identity, add "centralizedManagedIdentity": true in host.json, and ensure the Function App's MI has Blob Data Contributor role. Commented Dec 12, 2024 at 3:32
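Following the comment's suggestion, the 403 on the host lock lease comes from the Functions runtime itself still using `AzureWebJobsStorage` with key-based auth. Azure Functions supports identity-based connections for the host storage by replacing the connection string with `__`-suffixed service-URI settings. A minimal sketch of the deployed Function App's application settings, assuming a system-assigned identity (`<storageName>` is a placeholder; the docs also call for the Storage Blob Data Owner role on the host storage account):

```json
{
  "AzureWebJobsStorage__blobServiceUri": "https://<storageName>.blob.core.windows.net",
  "AzureWebJobsStorage__queueServiceUri": "https://<storageName>.queue.core.windows.net"
}
```

Equivalently, `AzureWebJobsStorage__accountName` can be set to just the account name and the service URIs are inferred.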

1 Answer


I have disabled the Allow storage account key access option in the Storage Account under Configuration.

We should create a Service Principal in Azure AD and add the Client ID, Client Secret, and Tenant ID to the system environment variables so that the function can authenticate locally with DefaultAzureCredential.

Add the Service Principal details to the System Environment Variables.

AZURE_CLIENT_ID = <clientID>
AZURE_CLIENT_SECRET = <clientSecret>
AZURE_TENANT_ID = <TenantID>


I have updated your code to send a message to the queue using DefaultAzureCredential and tested it locally by uploading a blob. I assigned the Storage Queue Data Contributor and Storage Blob Data Contributor roles to both the Service Principal and the user in Azure Storage.


function_app.py:

import logging
import os
import json
import time
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.storage.queue import QueueServiceClient

app = func.FunctionApp()

def send_message_to_queue(queue_name: str, message: str):
    credential = DefaultAzureCredential()
    queue_service_uri = os.getenv("BlobConnec__queueServiceUri")
    if not queue_service_uri:
        raise ValueError("Queue service URI not found.")

    queue_service = QueueServiceClient(account_url=queue_service_uri, credential=credential)
    queue_client = queue_service.get_queue_client(queue_name)
    queue_client.send_message(message)
    logging.info(f"Message sent to queue '{queue_name}': {message}")

@app.blob_trigger(arg_name="myblob", path="ccr1/{name}", connection="BlobConnec")
def blob_trigger(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob\n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
    try:
        blob_name = myblob.name.split('/')[-1]
        if blob_name.endswith('.pdf'):
            logging.info(f"Extracted PDF name: {blob_name}")
            ccr_e2e_report(blob_name)  
        else:
            logging.warning("The blob is not a PDF file.")
    except Exception as e:
        logging.error(f"Error processing blob: {e}")

def ccr_e2e_report(pdf_name: str):
    logging.info(f"Generating CAR report for: {pdf_name}")
    time.sleep(20) 
    logging.info(f"Report generation complete: {pdf_name}")

@app.route(route="http_trigger", auth_level=func.AuthLevel.ANONYMOUS)
def http_trigger(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    cnt = int(req.params.get('count', 0))
    if cnt > 0:
        for number in range(1, cnt + 1):
            message_data = {
                "file_path": "None",
                "file_uri": "None",
                "file_metadata": f"{number}"
            }
            message_json = json.dumps(message_data)
            send_message_to_queue("ccrqueue", message_json)
        return func.HttpResponse(f"Submitted: {cnt} Messages. This HTTP triggered function executed successfully.")
    else:
        return func.HttpResponse(
            "This HTTP triggered function executed successfully. Pass a count in the query string or in the request body.",
            status_code=200
        )

local.settings.json:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "BlobConnec__blobServiceUri": "https://<storageName>.blob.core.windows.net",
    "BlobConnec__queueServiceUri": "https://<storageName>.queue.core.windows.net"
  }
}
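One note on the unused `base64` import in the question: Azure Queue Storage triggers and the Functions host expect queue message bodies to be Base64-encoded by default, so a consumer may need to decode before parsing the JSON. A minimal sketch of the round-trip for the payload built in `http_trigger` (pure standard library, no Azure calls; the helper names here are illustrative, not part of the SDK):

```python
import base64
import json

def encode_queue_message(payload: dict) -> str:
    """Serialize a payload to JSON, then Base64-encode it the way the
    Functions host expects queue messages by default."""
    message_json = json.dumps(payload)
    return base64.b64encode(message_json.encode("utf-8")).decode("ascii")

def decode_queue_message(raw: str) -> dict:
    """Reverse the above: Base64-decode, then parse the JSON body."""
    return json.loads(base64.b64decode(raw).decode("utf-8"))

message = {"file_path": "None", "file_uri": "None", "file_metadata": "3"}
encoded = encode_queue_message(message)
assert decode_queue_message(encoded) == message
```

When sending with the SDK directly, `azure.storage.queue` also provides `TextBase64EncodePolicy`, which can be passed to the queue client to apply the same encoding automatically.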

Browser Output:

I ran the function and sent GET requests for the HTTP trigger function in the browser.


I sent a message to Azure Queue Storage.

I successfully received the message in the Azure Queue Storage using Microsoft Entra user account authentication.


I uploaded a PDF file to Blob Storage using Microsoft Entra user account authentication.


Terminal Output:

The HTTP trigger and Blob trigger functions ran successfully.


Comments

Thanks for the code, looks great. One question: did you have "Allow storage account key access" disabled on your storage account?
@mac I updated the answer. Please check it.
Unfortunately, when I deploy and set the "Allow storage account key access" to disabled on the storage account, the Azure function stops working. These are a few of the pages I followed: techcommunity.microsoft.com/blog/healthcareandlifesciencesblog/… devblogs.microsoft.com/premier-developer/…
