This template repository contains an Azure Functions reference sample that uses the Blob trigger with the Event Grid source type, written in PowerShell 7.4 and deployed to Azure using the Azure Developer CLI (azd). When deployed to Azure, the sample uses managed identity and a virtual network so that the deployment is secure by default. You can control whether the sample uses a VNet to secure storage by setting the `VNET_ENABLED` azd parameter to `true` or `false`.
This sample implements a simple function that uses Azure PowerShell cmdlets to copy PDF files from an `unprocessed-pdf` container to a `processed-pdf` container whenever new blobs are created. This straightforward example shows how to use the Event Grid blob trigger to respond to blob creation events in near real-time.
This sample uses the Event Grid source type for the Blob trigger, which provides significant advantages over the traditional scan-based approach:
- Near real-time processing: Event Grid delivers blob events within milliseconds of creation, eliminating the delays associated with container scanning.
- Improved scalability: Perfect for Flex Consumption plans where the traditional polling-based Blob trigger isn't available.
- Reduced costs: Eliminates storage transaction costs from polling, which can be substantial with large numbers of blobs.
- Enhanced reliability: Uses a robust pub/sub model that ensures blob events aren't missed, even during function downtime.
- Better performance: No performance degradation with large numbers of blobs in a container, unlike the scan-based approach.
The Event Grid approach is recommended for all new Blob trigger implementations, especially when using Flex Consumption plans where the traditional storage polling method isn't available.
To complete this quickstart, you need:
- Azure Developer CLI
- PowerShell 7.4+
- Azure Functions Core Tools
- To run and debug locally: Visual Studio Code
- Azurite, to emulate Azure Storage services when running locally
- Optional, for uploading blobs: Azure Storage Explorer or the VS Code Azure Storage extension
To initialize a project from this azd template, clone the GitHub template repository locally using the git clone command:
```bash
git clone https://github.com/Azure-Samples/functions-quickstart-powershell-azd-eventgrid-blob.git
cd functions-quickstart-powershell-azd-eventgrid-blob
```
You can also clone the repository from your own fork in GitHub.
Azure Functions uses Azurite to emulate Azure Storage services when running locally. If you haven't done so, install Azurite.
Create two containers called `processed-pdf` and `unprocessed-pdf` in the local storage emulator. Follow these steps:

1. Ensure Azurite is running. For more details, see Run Azurite.

2. Use Azure Storage Explorer or the VS Code Azure Storage extension to create the containers.

   Using Azure Storage Explorer:
   - Install Azure Storage Explorer.
   - Open Azure Storage Explorer.
   - Connect to the local emulator by selecting **Attach to a local emulator**.
   - Navigate to the **Blob Containers** section.
   - Right-click and select **Create Blob Container**.
   - Name the containers `processed-pdf` and `unprocessed-pdf`.

   Using the VS Code Azure Storage extension:
   - Install the VS Code Azure Storage extension.
   - Ensure Azurite is running.
   - Select the Azure extension icon in VS Code.
   - Under **Workspace**, expand **Local Emulator**.
   - Right-click **Blob Containers** and select **Create Blob Container**.
   - Name the containers `processed-pdf` and `unprocessed-pdf`.

3. Upload the PDF files from the `data` folder to the `unprocessed-pdf` container.

   Using Azure Storage Explorer:
   - Open Azure Storage Explorer.
   - Navigate to the `unprocessed-pdf` container.
   - Select **Upload**, then choose **Upload Folder** or **Upload Files**.
   - Choose the `data` folder or the specific PDF files to upload.
   - Confirm the upload to the `unprocessed-pdf` container.

   Using the VS Code Azure Storage extension:
   - Ensure Azurite is running.
   - Select the Azure extension icon in VS Code.
   - Under **Workspace**, expand **Local Emulator**, then expand **Blob Containers**.
   - Right-click `unprocessed-pdf` and select **Open in Explorer**.
   - Copy all the PDF files from the `data` folder into it.

If you prefer the command line, you can do both steps with PowerShell, as shown in the sketch below.
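The following is a minimal PowerShell sketch that creates the containers and uploads the sample PDFs against Azurite using the Az.Storage module. The `UseDevelopmentStorage=true` connection string is the standard Azurite shortcut; the relative `./data` path assumes you run the script from the repository root.

```powershell
# Requires Az.Storage: Install-Module Az.Storage -Scope CurrentUser
Import-Module Az.Storage

# Storage context that targets the local Azurite emulator
$ctx = New-AzStorageContext -ConnectionString 'UseDevelopmentStorage=true'

# Create the two containers the sample expects
foreach ($name in 'unprocessed-pdf', 'processed-pdf') {
    New-AzStorageContainer -Name $name -Context $ctx -ErrorAction SilentlyContinue
}

# Upload every PDF in the data folder to unprocessed-pdf
Get-ChildItem -Path ./data -Filter *.pdf | ForEach-Object {
    Set-AzStorageBlobContent -File $_.FullName -Container 'unprocessed-pdf' -Blob $_.Name -Context $ctx -Force
}
```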
 
PowerShell Azure Functions on Flex Consumption plans don't support managed dependencies through `requirements.psd1`. You need to include the Azure PowerShell modules directly with your function app.
Prerequisites:
- PowerShell Core 7.x must be installed on your system
Before deploying to Azure, run this command from the project root:
```bash
cd src
pwsh ./install-modules.ps1
```

This downloads the required Azure PowerShell modules (Az.Storage, Az.Accounts) to the `src/Modules` folder, which is included in your function app deployment package.
Note: This step is only required for Azure deployment, not for local development with Azurite.
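For reference, the heart of such a vendoring script is typically a few `Save-Module` calls. This is a hedged sketch of the idea, not the repository's actual `install-modules.ps1`:

```powershell
# Sketch of a module-vendoring script (the repo's install-modules.ps1 may differ).
# Save-Module downloads each module (and its dependencies) into a local folder
# that ships inside the deployment package.
$modulesPath = Join-Path $PSScriptRoot 'Modules'
New-Item -ItemType Directory -Path $modulesPath -Force | Out-Null

foreach ($module in 'Az.Accounts', 'Az.Storage') {
    Save-Module -Name $module -Path $modulesPath -Repository PSGallery -Force
}
```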
Using the terminal

From the `src` folder, run this command to start the Functions host locally:

```bash
cd src
func start
```

Using Visual Studio Code

1. Open the project folder in a new terminal.
2. Run the `code .` command to open the project in Visual Studio Code.
3. In the command palette (F1), type `Azurite: Start`, which enables debugging without warnings.
4. Press F5 to run in the debugger. Make a note of the `localhost` URL endpoints, including the port, which might not be `7071`.
Now that the storage emulator is running with files in the `unprocessed-pdf` container and the app is running, you can execute the `ProcessBlobUpload` function by simulating a new blob event.
- If you're using VS Code, Visual Studio, or other tooling that supports `.http` files, open the `test.http` project file, update the port on the `localhost` URL (if needed), and then select **Send Request** to call the locally running `ProcessBlobUpload` function. This triggers the function to process the `Benefit_Options.pdf` file. You can update the file name in the JSON to process other PDF files.
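If you'd rather not use an `.http` client, you can send the same kind of request with PowerShell. The sketch below follows the documented pattern for testing Event Grid-sourced blob triggers locally: a POST to the host's blobs webhook with an `aeg-event-type: Notification` header and a `Microsoft.Storage.BlobCreated` payload. The port and payload values are assumptions; compare them with the repository's `test.http` file.

```powershell
# Simulate a BlobCreated event against the locally running Functions host.
# Port 7071 and the payload values below are assumptions; see test.http.
$body = @{
    topic           = '/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rg/providers/Microsoft.Storage/storageAccounts/devstoreaccount1'
    subject         = '/blobServices/default/containers/unprocessed-pdf/blobs/Benefit_Options.pdf'
    eventType       = 'Microsoft.Storage.BlobCreated'
    id              = [guid]::NewGuid().ToString()
    eventTime       = (Get-Date).ToUniversalTime().ToString('o')
    dataVersion     = ''
    metadataVersion = '1'
    data            = @{
        api         = 'PutBlob'
        contentType = 'application/pdf'
        url         = 'http://127.0.0.1:10000/devstoreaccount1/unprocessed-pdf/Benefit_Options.pdf'
    }
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post `
    -Uri 'http://localhost:7071/runtime/webhooks/blobs?functionName=Host.Functions.ProcessBlobUpload' `
    -Headers @{ 'aeg-event-type' = 'Notification' } `
    -ContentType 'application/json' `
    -Body $body
```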
The function code for the `ProcessBlobUpload` endpoint is defined in `ProcessBlobUpload/run.ps1`. The function uses PowerShell 7.4 and is configured with a blob trigger that uses Event Grid as the source in `ProcessBlobUpload/function.json`:
```json
{
  "bindings": [
    {
      "name": "InputBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "unprocessed-pdf/{name}",
      "connection": "PDFProcessorSTORAGE",
      "source": "EventGrid"
    }
  ]
}
```
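For local runs, the `PDFProcessorSTORAGE` connection name must resolve to an app setting. A minimal `local.settings.json` that points it at Azurite might look like this (a sketch; the repository's actual settings file may differ):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "powershell",
    "PDFProcessorSTORAGE": "UseDevelopmentStorage=true"
  }
}
```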
The PowerShell function processes the blob by:

- Getting the blob name and file size from the trigger metadata
- Logging information about the blob being processed
- Copying the blob to the processed container with a new name prefix using the `Copy-ToProcessedContainer` function
- Using Azure PowerShell Storage cmdlets with managed identity authentication for secure access

```powershell
# Copy to processed container - simple demonstration of an async operation
Copy-ToProcessedContainer -BlobData $InputBlob -BlobName "processed_$blobName"
Write-Host "PDF processing complete for $blobName"
```
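Putting those steps together, a `run.ps1` following this shape might look like the sketch below. The `param` block and `$TriggerMetadata.Name` access follow the standard PowerShell blob trigger model; the body of `Copy-ToProcessedContainer` and the `STORAGE_ACCOUNT_NAME` app setting are assumptions, not the repository's exact implementation.

```powershell
# Sketch of a run.ps1 for this trigger (not the repo's exact code).
param([byte[]] $InputBlob, $TriggerMetadata)

# The trigger metadata carries the blob name from the {name} binding expression.
$blobName = $TriggerMetadata.Name
Write-Host "Processing blob: $blobName ($($InputBlob.Length) bytes)"

function Copy-ToProcessedContainer {
    param($BlobData, [string] $BlobName)

    # Hypothetical helper: writes the bytes to the processed-pdf container using
    # Az.Storage with the signed-in (managed identity) account when running in Azure.
    $ctx = New-AzStorageContext -StorageAccountName $env:STORAGE_ACCOUNT_NAME -UseConnectedAccount
    $tempFile = New-TemporaryFile
    [IO.File]::WriteAllBytes($tempFile.FullName, $BlobData)
    Set-AzStorageBlobContent -File $tempFile.FullName -Container 'processed-pdf' -Blob $BlobName -Context $ctx -Force | Out-Null
    Remove-Item $tempFile
}

# Copy to processed container - simple demonstration of an async operation
Copy-ToProcessedContainer -BlobData $InputBlob -BlobName "processed_$blobName"
Write-Host "PDF processing complete for $blobName"
```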
Now that you've triggered the function, use Azure Storage Explorer or the VS Code Azure Storage extension to check that the `processed-pdf` container in the local storage emulator contains the processed file.
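You can also verify from PowerShell, again assuming the Azurite connection string:

```powershell
# List the blobs now sitting in the processed-pdf container on the emulator
$ctx = New-AzStorageContext -ConnectionString 'UseDevelopmentStorage=true'
Get-AzStorageBlob -Container 'processed-pdf' -Context $ctx | Select-Object Name, Length
```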
Log in to the Azure Developer CLI if you haven't already:

```bash
azd auth login
```

Run this command from the base folder to provision the function app and other required Azure resources, and deploy your code:

```bash
azd up
```

By default, an Azure virtual network is created, a private endpoint is configured on the storage account, and VNet integration is configured on the function app so it can reach the storage account. You can opt out of using a VNet in the sample. To do so, use `azd env` to set `VNET_ENABLED` to `false` before running `azd up`:

```bash
azd env set VNET_ENABLED false
azd up
```

The first time you run `azd up`, you're prompted to supply these required deployment parameters:
| Parameter | Description | 
|---|---|
| Environment name | An environment that's used to maintain a unique deployment context for your app. You won't be prompted if you created the local project using azd init. | 
| Azure subscription | Azure subscription in which your resources are created. | 
| Azure location | Azure region in which to create the resource group that contains the new Azure resources. You can double-check which regions support the Flex Consumption plan using the Azure CLI. | 
The `azd up` command performs the following steps for you:
- Provisions a new resource group and all resources needed for the sample
- Deploys the function app code to the function app
- Runs the post-up script from the `scripts` folder to configure an Event Grid subscription that sends new-blob events from the `unprocessed-pdf` storage container to the new function app (sketched below)
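For context, wiring that subscription by hand looks roughly like the following PowerShell sketch using the Azure CLI. The `runtime/webhooks/blobs` endpoint with the `blobs_extension` system key is the documented webhook shape for Event Grid-sourced blob triggers; the resource names and the exact contents of the repository's post-up script are assumptions.

```powershell
# Sketch: create an Event Grid subscription that targets the function's blobs
# webhook. Resource names are placeholders; the repo's post-up script may differ.
$rg      = 'my-resource-group'
$storage = 'mystorageaccount'
$app     = 'my-function-app'

# The storage account is the event source
$sourceId = az storage account show --resource-group $rg --name $storage --query id --output tsv

# The blobs extension system key authorizes Event Grid calls to the webhook
$key = az functionapp keys list --resource-group $rg --name $app --query 'systemKeys.blobs_extension' --output tsv

$endpoint = "https://$app.azurewebsites.net/runtime/webhooks/blobs?functionName=Host.Functions.ProcessBlobUpload&code=$key"

az eventgrid event-subscription create `
  --name unprocessed-pdf-blob-events `
  --source-resource-id $sourceId `
  --endpoint $endpoint `
  --endpoint-type webhook `
  --included-event-types Microsoft.Storage.BlobCreated `
  --subject-begins-with '/blobServices/default/containers/unprocessed-pdf/'
```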
To test your deployed function, upload PDF files to the `unprocessed-pdf` container and verify that they're automatically processed and copied to the `processed-pdf` container.
If you deployed with VNet enabled (the default), you'll need to add your local IP address to the storage account firewall before you can upload files:
- Get your public IP address by visiting whatismyipaddress.com or running:

  ```bash
  curl ifconfig.me
  ```

- Configure the storage account network access settings using the Azure CLI:

  ```bash
  # Get the storage account name from your deployment
  STORAGE_ACCOUNT=$(azd env get-values | grep "AZURE_STORAGE_ACCOUNT_NAME" | cut -d'=' -f2 | tr -d '"')

  # Enable public network access and set to selected networks
  az storage account update --name $STORAGE_ACCOUNT --public-network-access Enabled --default-action Deny

  # Add your IP to the firewall (replace YOUR_IP with your actual IP)
  az storage account network-rule add --account-name $STORAGE_ACCOUNT --ip-address YOUR_IP
  ```

Alternatively, you can configure this through the Azure portal:
- Navigate to your storage account
- Go to Security + networking > Networking
- Set Public network access to Enabled and Public network access scope to Enable from selected networks
- Add your local IP address to the IPv4 Addresses allowed list
Upload PDF files to the `unprocessed-pdf` container in one of these ways:

- Using the Azure CLI:

  ```bash
  # Upload a test PDF file to the unprocessed-pdf container
  STORAGE_ACCOUNT=$(azd env get-values | grep "AZURE_STORAGE_ACCOUNT_NAME" | cut -d'=' -f2 | tr -d '"')
  az storage blob upload --account-name $STORAGE_ACCOUNT --container-name unprocessed-pdf --name test.pdf --file ./data/Benefit_Options.pdf --auth-mode login
  ```

- Using Azure Storage Explorer:
  - Connect to your Azure storage account
  - Navigate to the `unprocessed-pdf` container
  - Upload PDF files from the `data` folder
- Using the Azure portal:
  - Go to your storage account in the Azure portal
  - Select **Containers** > **unprocessed-pdf**
  - Click **Upload** and select PDF files to upload
After uploading files, check that they were processed:

- Monitor function execution:
  - In the Azure portal, go to your function app.
  - On the **Overview** page, select **Invocations and more** for the `ProcessBlobUpload` function.
  - You should see invocation logs showing the blob processing. Note that these logs can take a few minutes to appear.
- Check processed files:
  - Navigate to the `processed-pdf` container in your storage account.
  - Verify that files appear with the `processed_` prefix.
  - For example, `test.pdf` should become `processed_test.pdf`.

You can also list the processed blobs from the command line, as shown below.
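A quick sketch using the Azure CLI from PowerShell, assuming the same `AZURE_STORAGE_ACCOUNT_NAME` azd environment value used earlier:

```powershell
# Read the storage account name from the azd environment (assumes the
# AZURE_STORAGE_ACCOUNT_NAME output used elsewhere in this README)
$storageAccount = (azd env get-values |
    Select-String 'AZURE_STORAGE_ACCOUNT_NAME' |
    ForEach-Object { ($_ -split '=')[1].Trim('"') })

# List blob names in the processed-pdf container (requires data-plane access)
az storage blob list --account-name $storageAccount --container-name processed-pdf --auth-mode login --query '[].name' --output table
```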
The Event Grid blob trigger should process files within seconds of upload, demonstrating the near real-time capabilities of this approach.
You can run the azd up command as many times as you need to both provision your Azure resources and deploy code updates to your function app.
Note
Deployed code files are always overwritten by the latest deployment package.
When you're done working with your function app and related resources, you can use this command to delete the function app and its related resources from Azure and avoid incurring any further costs:
```bash
azd down
```