Azure Functions is the primary function-as-a-service (FaaS) offering in the Azure suite, enabling the creation of serverless applications. Such applications are popular because, although servers still exist in serverless infrastructure, end users are freed from the complex task of managing them.
Initially, Azure Functions was designed for ephemeral, stateless operations, making it ideal for short-lived tasks. However, this stateless nature posed challenges for scenarios requiring long-running, stateful processes. To address this, Azure Durable Functions were introduced, bringing state management and task orchestration capabilities to the serverless environment.
This post will discuss the use of Azure Functions, the need for Azure Durable Functions in stateful applications, and their different types. We will also explore advanced workflows with durable functions and how to properly manage and monitor them.
With Azure Functions, you can write stateless, event-driven applications with less code in any supported language such as Java, Python, PowerShell, JavaScript, and TypeScript.
In stateless applications, no user information is retained between requests and responses. Consequently, all data necessary to execute an action must be included within the request payload. For example, consider an Azure Function that is triggered by a timer to execute ETL operations every hour. In a stateless architecture, you would have to pass the previous database state within each request so the function has all the context it needs to perform the ETL process accurately.
However, you may need to retain context and history internally, eliminating the need to pass the previous state with each request. This is called stateful architecture, and for that, Azure Durable Functions offer a way to orchestrate long-running stateful workflows in serverless environments.
State management is important in asynchronous workflows, as it ensures consistency across long-running processes. In such scenarios, where tasks are executed over extended periods or require multiple steps, retaining state allows the workflow to resume seamlessly from the last known point.
Stateful architecture also facilitates error handling, improves performance, and enhances overall workflow reliability and efficiency.
Stateful solutions in Azure consist of multiple types of Durable Functions, each playing a different role. There are four main categories, namely client, orchestrator, activity, and entity functions.
The client function is the starting point of any Azure Durable Functions solution and can be any function used to trigger the orchestrator function or the entity function. There is an array of trigger templates in Azure Functions. For example, an HTTP trigger function, a timer trigger function, or even an Azure Blob Storage trigger could be the client function.
In the example below, we use Python to create a timer trigger client function locally in VS Code.
To run Azure Durable Functions locally, there are several prerequisites to install. There are also two programming models for Azure Durable Functions in Python. We will use V2 because it simplifies the file structure with a decorator-based approach, meaning you no longer have to maintain separate "function.json" files:
import logging
import azure.functions as func
import azure.durable_functions as df

myApp = df.DFApp(http_auth_level=func.AuthLevel.ANONYMOUS)

# Timer-triggered function with a Durable Functions client binding
@myApp.timer_trigger(arg_name="timer", schedule="*/30 * * * * *")  # Cron expression: runs every 30 seconds
@myApp.durable_client_input(client_name="client")
async def timer_start(timer: func.TimerRequest, client: df.DurableOrchestrationClient):
    # Start a new orchestration of the orchestrator function
    instance_id = await client.start_new("weather_orchestrator")
    # Log the instance ID of the started orchestration
    logging.info(f"Started orchestration with ID = '{instance_id}'.")
This timer-triggered function runs every 30 seconds, as specified by the cron expression in the @myApp.timer_trigger decorator. It uses the Durable Functions client, configured with the @myApp.durable_client_input decorator, to start a new orchestration instance of the weather_orchestrator function.
The function logs the unique instance ID of each orchestration started, providing a way to track and manage orchestration instances. This setup allows the function to periodically initiate workflows defined in the weather_orchestrator function.
Orchestrator functions are long-running functions responsible for managing the sequencing and coordination of other functions in a Durable Functions workflow. They can manage various activities, including interacting with activity functions, sub-orchestrations, error handling, durable timers, external events, and even entity functions.
To continue the above example, we can create an orchestrator function, weather_orchestrator, which the timer_start client function will start every 30 seconds:
# Orchestrator function to manage parallel task execution
@myApp.orchestration_trigger(context_name="context")
def weather_orchestrator(context):
    # List of cities for which to fetch weather data
    cities = ["Seattle", "Tokyo", "London"]
    tasks = []
    for city in cities:
        # Create a task for each city to fetch its weather using an activity function
        tasks.append(context.call_activity("fetch_weather", city))
    # Wait for all tasks to complete and aggregate their results
    results = yield context.task_all(tasks)
    return results
This orchestrator function manages parallel calls to the fetch_weather activity function for multiple cities. It waits for all tasks to complete and returns their results, demonstrating a coordinated, concurrent execution workflow within Azure Durable Functions.
In real-world scenarios, there would likely be multiple activity functions performing various tasks such as data processing, external API calls, and other business logic operations.
Activity functions contain the business logic operations or network calls within a Durable Functions workflow. For example, to get the live weather of multiple cities through an external API, we could write the business logic in an activity function and return the values to the orchestrator. If you have multiple business logic operations, you can write multiple activity functions.
We can create an activity function for the weather example above, which will simulate retrieving weather information through an API call and returning it to the orchestrator:
# Activity function triggered with a city name as input
@myApp.activity_trigger(input_name="city")
def fetch_weather(city: str):
    # Predefined weather conditions for each city
    weather_data = {
        "Seattle": "Rainy",
        "Tokyo": "Sunny",
        "London": "Cloudy"
    }
    # Retrieve the weather condition for the given city, defaulting to "Unknown" if the city is not in the dictionary
    result = {city: weather_data.get(city, "Unknown")}
    # Log the retrieved weather condition
    logging.info(f"Fetched weather for {city}: {result[city]}")
    return result
In this example, the fetch_weather activity function simulates fetching weather data for a given city. It contains a dictionary with predefined weather conditions for Seattle, Tokyo, and London. When called, it retrieves the weather conditions for the specified city from the dictionary and logs the fetched data.
Note: If you test the above example on a local machine, you may need an Azure Storage emulator. A storage account is essential because Durable Functions require storage to manage orchestration state, track activity functions, and handle retries and checkpoints. The open-source Azurite storage emulator is a good option here.
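With Azurite running locally, pointing your function app at the emulator is a single connection-string setting. A minimal local.settings.json for the examples above might look like this (the "UseDevelopmentStorage=true" value directs the Functions runtime to the local emulator):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python"
  }
}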
Orchestrator functions implicitly store state; however, that state is tied to the execution flow. Durable Functions 2.0 introduced entity functions to explicitly store and modify state in a reliable, scalable manner. Entity functions can be invoked from either a client function or an orchestrator function.
For example, imagine you want to maintain a running average of temperature data collected from various cities. Every time new data is collected, the orchestrator function updates the running average for each city by invoking the entity function. The entity function updates the count and cumulative sum of temperatures with each new data point, recalculates the average, and stores the updated state. This ensures data consistency and durability, facilitating concurrent updates and scalability.
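As a minimal sketch (assuming the same myApp object as the earlier examples; the entity name temperature_average and its operations are illustrative, not a fixed API), such an entity function might look like this in the V2 Python model:

# Entity function maintaining a running average of temperature readings
@myApp.entity_trigger(context_name="context")
def temperature_average(context: df.DurableEntityContext):
    # State holds the count of readings and their cumulative sum
    state = context.get_state(lambda: {"count": 0, "total": 0.0})
    if context.operation_name == "add_reading":
        state["count"] += 1
        state["total"] += context.get_input()
        context.set_state(state)
    elif context.operation_name == "get_average":
        average = state["total"] / state["count"] if state["count"] else None
        context.set_result(average)

An orchestrator could then record a new reading for a city with yield context.call_entity(df.EntityId("temperature_average", city), "add_reading", temp).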
When implementing reliable, scalable long-running processes, you can choose among several workflow patterns depending on the scenario. For instance, you might need to orchestrate a sequence of tasks where each step relies on the output of the previous one; for that, you can use the function chaining pattern, sketched below.
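As a minimal sketch, assuming hypothetical activity functions named step1, step2, and step3 (not part of the weather example), function chaining looks like this:

# Orchestrator demonstrating the function chaining pattern
@myApp.orchestration_trigger(context_name="context")
def chaining_orchestrator(context):
    # Each step receives the output of the previous one
    result1 = yield context.call_activity("step1", None)
    result2 = yield context.call_activity("step2", result1)
    result3 = yield context.call_activity("step3", result2)
    return result3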
In another scenario, you may require high-performance data processing that involves executing multiple tasks in parallel and then consolidating their results; here, the fan-out/fan-in pattern, demonstrated by the weather_orchestrator example above, is ideal.
Other patterns available include async HTTP APIs, monitoring, human interaction, and aggregator (stateful entities), each designed to help developers build robust, efficient, and maintainable serverless applications on the Azure platform.
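To illustrate one of these, here is a minimal sketch of the human interaction pattern, assuming a hypothetical request_approval activity function and an external event named Approval; the orchestrator waits for the approval event or times out after 72 hours:

from datetime import timedelta

# Orchestrator demonstrating the human interaction pattern
@myApp.orchestration_trigger(context_name="context")
def approval_orchestrator(context):
    # Ask a human approver for input via a hypothetical activity function
    yield context.call_activity("request_approval", None)
    # Race an external "Approval" event against a 72-hour durable timer
    approval_task = context.wait_for_external_event("Approval")
    timeout_task = context.create_timer(context.current_utc_datetime + timedelta(hours=72))
    winner = yield context.task_any([approval_task, timeout_task])
    if winner == approval_task:
        timeout_task.cancel()
        return "Approved"
    return "Timed out"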
Despite the clear benefits that Durable Functions bring to managing state in applications, they do have their drawbacks. Developers must consider these carefully before implementing Durable Functions to make sure they're not sacrificing scalability and performance.
The following are factors to keep in mind.
In serverless environments, applications that haven't run recently take longer to start up, a delay commonly known as a cold start. It occurs because Azure must allocate servers for the function app, start the function runtime on the server, and execute the function code, all of which increase latency for functions that haven't been called recently.
Notably, Azure has measures, such as pre-warmed servers, to reduce this latency. Additionally, with careful design of your application architecture, you can further mitigate these delays.
Services that support Durable Functions, such as Application Insights and storage accounts, often cost more than the function executions themselves.
For example, Application Insights offers additional analysis of application logs, with pricing based on the volume of logs ingested. Storage accounts are primarily used for state management and orchestration tracking; their costs grow because every time an orchestrator function starts, executes actions, and completes, it logs these events to the storage account.
You can manage these fees by removing unnecessary operations from your code, disabling Application Insights features you don't need, and reviewing Azure Cost Management closely.
While Durable Functions scale automatically, the rate of scaling might not be as rapid as stateless functions due to their dependency on storage operations. Heavy usage can also lead to increased latency due to rate limits in Azure Storage.
To effectively manage these limitations while still getting the most out of Durable Functions, you will need to monitor and strategically manage your architectural decisions.
Imagine your function app is in production and serving thousands of users. Suddenly, while you are away from work, users begin complaining about unexpected crashes, sluggish performance, or complete unavailability. By the time these complaints reach you, the business may have already suffered lost revenue and diminished trust.
User experience depends on multiple components, such as application health, infrastructure health, and resource utilization. Having clean code and a robust CI/CD setup is great; however, making sure an application performs well in production is critical.
You can leverage various tools to monitor and report on your stateful application's health by integrating Azure Monitor and Application Insights with Azure Functions.
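Beyond the portal views, you can also check orchestration status programmatically from a client function. Here is a minimal sketch, assuming a hypothetical HTTP route with an instance_id parameter and the same myApp object as before:

# HTTP-triggered function that reports the status of an orchestration instance
@myApp.route(route="status/{instance_id}")
@myApp.durable_client_input(client_name="client")
async def get_orchestration_status(req: func.HttpRequest, client: df.DurableOrchestrationClient) -> func.HttpResponse:
    instance_id = req.route_params["instance_id"]
    status = await client.get_status(instance_id)
    # runtime_status will be a value such as Running, Completed, or Failed
    return func.HttpResponse(f"Instance {instance_id}: {status.runtime_status}")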
However, monitoring complexity increases when workloads span multiple orchestrators. External solutions like ManageEngine Site24x7, Datadog, New Relic, and Dynatrace address this problem of observability silos. These tools provide enhanced monitoring and management capabilities, such as real-time insights, synthetic monitoring, and detailed analytics, helping you maintain optimal performance and quickly address any issues that arise.
By leveraging both Azure's integrated tools and external solutions, you can maintain the health and performance of your Durable Functions app, providing a better experience for your users and safeguarding your business's reputation.
Azure Durable Functions provide an advanced framework for orchestrating stateful, long-running workflows in serverless environments.
We have explored the various types of durable functions and their applications in managing complex asynchronous processes. We've also highlighted the importance of state management and the implementation of advanced workflow patterns to enhance application reliability, efficiency, and scalability.
Azure additionally offers Azure Monitor and Application Insights to help guarantee your application’s health and performance. However, for enhanced monitoring and management capabilities, tools such as ManageEngine Site24x7 provide real-time insights, synthetic monitoring, and detailed analytics.
By utilizing Site24x7, you can proactively address potential issues, maintain application health, and deliver exceptional user experiences.
Discover how Site24x7 can elevate your monitoring strategy and ensure the success of your Azure Durable Functions deployments. Start your 30-day free trial today.