---
title: Azure Functions Scenarios
description: Identify key scenarios that use Azure Functions to provide serverless compute resources in an Azure cloud-based topology.
ms.topic: conceptual
ms.custom: devx-track-extended-java, devx-track-js, devx-track-python
ms.date: 05/15/2023
zone_pivot_groups: programming-languages-set-functions-lang-workers
---
We often build systems to react to a series of critical events. Whether you're building a web API, responding to database changes, or processing event streams or messages, you can use Azure Functions to implement them.
In many cases, a function integrates with an array of cloud services to provide feature-rich implementations. The following are a common (but by no means exhaustive) set of scenarios for Azure Functions.
Select your development language at the top of the article.
There are several ways to use functions to process files into or out of a blob storage container. To learn more about options for triggering on a blob container, see Working with blobs in the best practices documentation.
For example, in a retail solution, a partner system can submit product catalog information as files into blob storage. You can use a blob triggered function to validate, transform, and process the files into the main system as they're uploaded.
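Inside such a function, the core work is plain file parsing. The following Python sketch shows one way the validate-and-transform step might look; the CSV layout and field names (`sku`, `price`) are assumptions for illustration, not part of any partner contract.

```python
import csv
import io

def process_catalog_blob(blob_text: str) -> list[dict]:
    """Validate and transform a partner catalog file into records for the main system."""
    reader = csv.DictReader(io.StringIO(blob_text))
    records = []
    for row in reader:
        # Validation: skip rows without a SKU or a parsable price (hypothetical schema).
        if not row.get("sku"):
            continue
        try:
            price = float(row["price"])
        except (KeyError, TypeError, ValueError):
            continue
        # Transformation: normalize fields before handing off downstream.
        records.append({"sku": row["sku"].strip().upper(), "price": round(price, 2)})
    return records

sample = "sku,price\nabc-1,19.99\n,5.00\nxyz-9,not-a-price\n"
print(process_catalog_blob(sample))  # [{'sku': 'ABC-1', 'price': 19.99}]
```

The same logic applies whether the text arrives as a `Stream` in C# or a bytes-like input in Python: read, validate, normalize, and emit only the rows the main system can accept.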
The following tutorials use an Event Grid trigger to process files in a blob container:
::: zone pivot="programming-language-csharp"
For example, using the blob trigger with an event subscription on blob containers:
```csharp
[FunctionName("ProcessCatalogData")]
public static async Task Run(
    [BlobTrigger("catalog-uploads/{name}", Source = BlobTriggerSource.EventGrid, Connection = "<NAMED_STORAGE_CONNECTION>")] Stream myCatalogData,
    string name,
    ILogger log)
{
    log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myCatalogData.Length} Bytes");

    using (var reader = new StreamReader(myCatalogData))
    {
        var catalogEntry = await reader.ReadLineAsync();
        while (catalogEntry != null)
        {
            // Process the catalog entry
            // ...
            catalogEntry = await reader.ReadLineAsync();
        }
    }
}
```
- Upload and analyze a file with Azure Functions and Blob Storage
- Automate resizing uploaded images using Event Grid
- Trigger Azure Functions on blob containers using an event subscription

::: zone-end
::: zone pivot="programming-language-python,programming-language-javascript"
- Upload and analyze a file with Azure Functions and Blob Storage
- Automate resizing uploaded images using Event Grid
- Trigger Azure Functions on blob containers using an event subscription

::: zone-end
Vast amounts of telemetry are generated and collected from cloud applications, IoT devices, and networking devices. Azure Functions can process that data in near real time as the hot path, then store it in Azure Cosmos DB for use in an analytics dashboard.
Your functions can also use low-latency event triggers, like Event Grid, and real-time outputs like SignalR to process data in near-real-time.
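Whatever the language, the debatch-and-transform step boils down to splitting each batched event into individual readings and reshaping them. The following Python sketch illustrates the idea; the batched JSON layout and the field names (`deviceId`, `tempF`) are hypothetical.

```python
import json

def debatch(event_bodies: list[str]) -> list[dict]:
    """Split batched event payloads into individual sensor readings."""
    readings = []
    for body in event_bodies:
        # Each event is assumed to carry a JSON array of readings.
        readings.extend(json.loads(body))
    return readings

def transform(readings: list[dict]) -> list[dict]:
    """Reshape raw readings into output records (field names are hypothetical)."""
    return [
        {"device": r["deviceId"], "celsius": (r["tempF"] - 32) * 5 / 9}
        for r in readings
    ]

batch = [json.dumps([{"deviceId": "d1", "tempF": 212.0}, {"deviceId": "d2", "tempF": 32.0}])]
print(transform(debatch(batch)))
```

In a real function, the trigger delivers the batch and an output binding collects the transformed records, but the hot-path logic in the middle stays this simple.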
::: zone pivot="programming-language-csharp"
For example, using the Event Hubs trigger to read from an event hub, and the output binding to write to another event hub after debatching and transforming the events:
```csharp
[FunctionName("ProcessorFunction")]
public static async Task Run(
    [EventHubTrigger(
        "%Input_EH_Name%",
        Connection = "InputEventHubConnectionString",
        ConsumerGroup = "%Input_EH_ConsumerGroup%")] EventData[] inputMessages,
    [EventHub(
        "%Output_EH_Name%",
        Connection = "OutputEventHubConnectionString")] IAsyncCollector<SensorDataRecord> outputMessages,
    PartitionContext partitionContext,
    ILogger log)
{
    var debatcher = new Debatcher(log);
    var debatchedMessages = await debatcher.Debatch(inputMessages, partitionContext.PartitionId);

    var xformer = new Transformer(log);
    await xformer.Transform(debatchedMessages, partitionContext.PartitionId, outputMessages);
}
```
- Streaming at scale with Azure Event Hubs, Functions and Azure SQL
- Streaming at scale with Azure Event Hubs, Functions and Cosmos DB
- Streaming at scale with Azure Event Hubs with Kafka producer, Functions with Kafka trigger and Cosmos DB
- Streaming at scale with Azure IoT Hub, Functions and Azure SQL
- Azure Event Hubs trigger for Azure Functions
- Apache Kafka trigger for Azure Functions

::: zone-end
::: zone pivot="programming-language-java"
- Azure Functions Kafka trigger Java Sample
- Azure Event Hubs trigger for Azure Functions
- Apache Kafka trigger for Azure Functions

::: zone-end
Besides data processing, you can use Azure Functions to run inference on machine learning models.
For example, a function that calls a TensorFlow model or submits it to Azure AI services can process and classify a stream of images.
Functions can also connect to other services to help process data and perform other AI-related tasks, like text summarization.
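Before calling a summarization or other text-analysis service, long input usually has to be split to respect the service's input limits. Here's a minimal, service-agnostic Python sketch of that chunking step; the character limit is an arbitrary placeholder, not any service's actual quota.

```python
def chunk_text(text: str, max_chars: int = 1000) -> list[str]:
    """Split text into word-preserving chunks of at most max_chars.

    A single word longer than max_chars becomes its own (oversized) chunk.
    """
    chunks, current, length = [], [], 0
    for word in text.split():
        # Flush the current chunk if adding this word (plus a joining space) overflows.
        if current and length + len(word) + 1 > max_chars:
            chunks.append(" ".join(current))
            current, length = [], 0
        current.append(word)
        length += len(word) + (1 if len(current) > 1 else 0)
    if current:
        chunks.append(" ".join(current))
    return chunks

print(chunk_text("aa bb cc dd", max_chars=5))  # ['aa bb', 'cc dd']
```

Each chunk can then be sent to the AI service independently, and the per-chunk results combined in the function before writing the output.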
::: zone pivot="programming-language-csharp"
- Sample: Text summarization using AI Cognitive Language Service

::: zone-end
::: zone pivot="programming-language-javascript"
- Training: Create a custom skill for Azure AI Search
- Sample: Chat using ChatGPT

::: zone-end
::: zone pivot="programming-language-python"
- Tutorial: Apply machine learning models in Azure Functions with Python and TensorFlow
- Tutorial: Deploy a pretrained image classification model to Azure Functions with PyTorch
- Sample: Chat using ChatGPT
- Sample: LangChain with Azure OpenAI and ChatGPT

::: zone-end
Functions enables you to run your code based on a cron schedule that you define.
Check out how to Create a function in the Azure portal that runs on a schedule.
A financial services customer database, for example, might be analyzed for duplicate entries every 15 minutes to avoid multiple communications going out to the same customer.
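The deduplication logic that such a timer-triggered function runs each interval can be sketched in a few lines of Python; matching customers by a normalized email address is an assumption for illustration, not a prescribed rule.

```python
def dedupe_customers(rows: list[dict]) -> list[dict]:
    """Keep the first record per normalized email address (normalization rule is illustrative)."""
    seen = set()
    unique = []
    for row in rows:
        key = row["email"].strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

rows = [
    {"id": 1, "email": "Pat@contoso.com"},
    {"id": 2, "email": "pat@contoso.com "},  # duplicate after normalization
    {"id": 3, "email": "sam@contoso.com"},
]
print(dedupe_customers(rows))  # records 1 and 3 survive
```

A timer trigger with the schedule `0 */15 * * * *` (an NCRONTAB expression: second, minute, hour, day, month, day-of-week) runs this every 15 minutes.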
::: zone pivot="programming-language-csharp"
```csharp
[FunctionName("TimerTriggerCSharp")]
public static void Run([TimerTrigger("0 */15 * * * *")] TimerInfo myTimer, ILogger log)
{
    if (myTimer.IsPastDue)
    {
        log.LogInformation("Timer is running late!");
    }
    log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");

    // Perform the database deduplication
}
```
- Timer trigger for Azure Functions

::: zone-end

::: zone pivot="programming-language-python"
- Timer trigger for Azure Functions

::: zone-end

::: zone pivot="programming-language-javascript"
- Timer trigger for Azure Functions

::: zone-end

::: zone pivot="programming-language-powershell"
- Timer trigger for Azure Functions

::: zone-end

::: zone pivot="programming-language-java"
- Timer trigger for Azure Functions

::: zone-end
An HTTP triggered function defines an HTTP endpoint. These endpoints run function code that can connect to other services directly or by using binding extensions. You can compose the endpoints into a web-based API.
You can also use an HTTP triggered function endpoint as a webhook integration, such as GitHub webhooks. In this way, you can create functions that process data from GitHub events. To learn more, see Monitor GitHub events by using a webhook with Azure Functions.
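Webhook endpoints should verify that each request really comes from the expected sender. GitHub, for example, signs each delivery with an HMAC SHA-256 of the raw request body and puts `sha256=<digest>` in the `X-Hub-Signature-256` header, which you can check inside your function with only the standard library:

```python
import hashlib
import hmac

def is_valid_signature(secret: bytes, body: bytes, signature_header: str) -> bool:
    """Verify a GitHub-style X-Hub-Signature-256 header against the raw request body."""
    expected = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()
    # compare_digest prevents timing attacks on the comparison.
    return hmac.compare_digest(expected, signature_header)

secret = b"my-webhook-secret"  # hypothetical secret, configured in app settings
body = b'{"action": "opened"}'
header = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()
print(is_valid_signature(secret, body, header))      # True
print(is_valid_signature(secret, b"tampered", header))  # False
```

Reject the request (for example, with a 401 response) before processing the payload whenever the check fails.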
For examples, see the following:

::: zone pivot="programming-language-csharp"
```csharp
[FunctionName("InsertName")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
    [CosmosDB(
        databaseName: "my-database",
        collectionName: "my-container",
        ConnectionStringSetting = "CosmosDbConnectionString")] IAsyncCollector<dynamic> documentsOut,
    ILogger log)
{
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    string name = data?.name;
    if (name == null)
    {
        return new BadRequestObjectResult("Please pass a name in the request body json");
    }

    // Add a JSON document to the output container.
    await documentsOut.AddAsync(new
    {
        // create a random ID
        id = System.Guid.NewGuid().ToString(),
        name = name
    });

    return new OkResult();
}
```
- Article: Create serverless APIs in Visual Studio using Azure Functions and API Management integration
- Training: Expose multiple function apps as a consistent API by using Azure API Management
- Sample: Web application with a C# API and Azure SQL DB on Static Web Apps and Functions
- Azure Functions HTTP trigger

::: zone-end

::: zone pivot="programming-language-python"
- Azure Functions HTTP trigger

::: zone-end

::: zone pivot="programming-language-javascript"
- Azure Functions HTTP trigger

::: zone-end

::: zone pivot="programming-language-powershell"
- Azure Functions HTTP trigger

::: zone-end

::: zone pivot="programming-language-java"
- Training: Develop Java serverless Functions on Azure using Maven
- Azure Functions HTTP trigger

::: zone-end
Functions is often the compute component in a serverless workflow topology, such as a Logic Apps workflow. You can also create long-running orchestrations using the Durable Functions extension. For more information, see Durable Functions overview.
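The function-chaining pattern at the heart of many Durable Functions orchestrations can be illustrated outside the runtime with a plain Python generator. This is a simplified simulation of how the framework feeds each activity's result back into the orchestrator, not the Durable SDK API; the activity names `F1`, `F2`, and `F3` are placeholders.

```python
def orchestrator(ctx):
    # Each yield hands control to the framework, which runs the named activity
    # and resumes the generator with its result (the chaining pattern).
    x = yield ("F1", None)
    y = yield ("F2", x)
    z = yield ("F3", y)
    return z

def run(orchestration, activities):
    """Tiny driver standing in for the Durable Functions runtime."""
    gen = orchestration(None)
    try:
        name, arg = gen.send(None)
        while True:
            result = activities[name](arg)
            name, arg = gen.send(result)
    except StopIteration as done:
        return done.value

activities = {
    "F1": lambda _: 1,
    "F2": lambda x: x + 10,
    "F3": lambda y: y * 2,
}
print(run(orchestrator, activities))  # 22
```

The real runtime adds durability on top of this idea: it checkpoints after each activity and replays the generator to recover state, which is why orchestrator code must be deterministic.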
::: zone pivot="programming-language-csharp"
- Tutorial: Create a function to integrate with Azure Logic Apps
- Quickstart: Create your first durable function in Azure using C#
- Training: Deploy serverless APIs with Azure Functions, Logic Apps, and Azure SQL Database

::: zone-end

::: zone pivot="programming-language-javascript"
- Quickstart: Create your first durable function in Azure using JavaScript
- Training: Deploy serverless APIs with Azure Functions, Logic Apps, and Azure SQL Database

::: zone-end

::: zone pivot="programming-language-python"
- Quickstart: Create your first durable function in Azure using Python
- Training: Deploy serverless APIs with Azure Functions, Logic Apps, and Azure SQL Database

::: zone-end

::: zone pivot="programming-language-java"
- Quickstart: Create your first durable function in Azure using Java

::: zone-end

::: zone pivot="programming-language-powershell"
- Quickstart: Create your first durable function in Azure using PowerShell

::: zone-end
There are processes where you might need to log, audit, or perform some other operation when stored data changes. Functions triggers provide a good way to get notified of data changes so you can initiate such an operation.
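The operation itself is usually simple once the trigger delivers the changed document. Here's a Python sketch of building an audit entry from a change; the document shape and field names are hypothetical.

```python
import datetime
import json

def audit_entry(change: dict, operation: str) -> str:
    """Build a JSON audit log line for a changed document (shape is hypothetical)."""
    entry = {
        "documentId": change["id"],
        "operation": operation,
        "changedFields": sorted(k for k in change if k != "id"),
        "auditedAt": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    return json.dumps(entry)

print(audit_entry({"id": "doc1", "name": "Contoso", "tier": "gold"}, "update"))
```

A Cosmos DB or SQL trigger would call this for each change it receives and write the result to a log store or output binding.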
Consider the following examples:
::: zone pivot="programming-language-csharp"
- Article: Connect Azure Functions to Azure Cosmos DB using Visual Studio Code
- Article: Connect Azure Functions to Azure SQL Database using Visual Studio Code
- Article: Use Azure Functions to clean-up an Azure SQL Database

::: zone-end

::: zone pivot="programming-language-javascript"
- Article: Connect Azure Functions to Azure Cosmos DB using Visual Studio Code
- Article: Connect Azure Functions to Azure SQL Database using Visual Studio Code

::: zone-end

::: zone pivot="programming-language-python"
- Article: Connect Azure Functions to Azure Cosmos DB using Visual Studio Code
- Article: Connect Azure Functions to Azure SQL Database using Visual Studio Code

::: zone-end
You can use Functions with Azure messaging services to create advanced event-driven messaging solutions.
For example, you can use triggers on Azure Storage queues to chain together a series of function executions, or use Service Bus queues and triggers for an online ordering system.
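The chaining idea can be sketched with in-memory queues standing in for Azure Storage queues; the order and shipment message shapes are hypothetical.

```python
import queue

orders = queue.Queue()     # stands in for an Azure Storage queue
shipments = queue.Queue()  # stands in for the output queue bound to the next function

def take_order(message: dict) -> None:
    """Step 1 (queue-triggered): validate the order and enqueue it for fulfillment."""
    if message.get("quantity", 0) > 0:
        orders.put(message)

def fulfill_order() -> None:
    """Step 2 (queue-triggered): pick up the order and write to the shipments queue."""
    order = orders.get()
    shipments.put({"orderId": order["orderId"], "status": "shipped"})

take_order({"orderId": "A100", "quantity": 2})
fulfill_order()
print(shipments.get())  # {'orderId': 'A100', 'status': 'shipped'}
```

In Azure, the queue trigger and output binding replace the explicit `get`/`put` calls, so each function only contains its own business logic.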
The following articles show how to work with storage queues in your functions.
::: zone pivot="programming-language-csharp"
- Article: Connect Azure Functions to Azure Storage using Visual Studio Code
- Article: Create a function triggered by Azure Queue storage (Azure portal)

::: zone-end

::: zone pivot="programming-language-javascript"
- Article: Connect Azure Functions to Azure Storage using Visual Studio Code
- Article: Create a function triggered by Azure Queue storage (Azure portal)
- Training: Chain Azure Functions together using input and output bindings

::: zone-end

::: zone pivot="programming-language-python"
- Article: Connect Azure Functions to Azure Storage using Visual Studio Code
- Article: Create a function triggered by Azure Queue storage (Azure portal)

::: zone-end

::: zone pivot="programming-language-java"
- Article: Connect Azure Functions to Azure Storage using Visual Studio Code
- Article: Create a function triggered by Azure Queue storage (Azure portal)

::: zone-end

::: zone pivot="programming-language-powershell"
- Article: Connect Azure Functions to Azure Storage using Visual Studio Code
- Article: Create a function triggered by Azure Queue storage (Azure portal)
- Training: Chain Azure Functions together using input and output bindings

::: zone-end
These articles show how to trigger a function from an Azure Service Bus queue or topic.
::: zone pivot="programming-language-csharp"
- Azure Service Bus trigger for Azure Functions

::: zone-end

::: zone pivot="programming-language-javascript"
- Azure Service Bus trigger for Azure Functions

::: zone-end

::: zone pivot="programming-language-python"
- Azure Service Bus trigger for Azure Functions

::: zone-end

::: zone pivot="programming-language-java"
- Azure Service Bus trigger for Azure Functions

::: zone-end

::: zone pivot="programming-language-powershell"
- Azure Service Bus trigger for Azure Functions

::: zone-end
> [!div class="nextstepaction"]
> Getting started with Azure Functions