Azure Durable Functions: FaaS for Complex Workflows
Function as a Service (FaaS) gives developers a fast, scalable way to execute a task in response to an event without the need to manage infrastructure. A function transforms input to output, making it a lightweight choice for scheduled tasks, reminders, order processing and other actions that don’t require fault tolerance beyond simple retries. Plus, functions run on a cloud host’s infrastructure, scaling in and out on demand, and you pay only for the resources you consume.
But what if your workflow needs more reliability and fault tolerance than a short-lived function provides? What if your scenario includes longer-running tasks and the workflow needs the output of one function as the input to another?
That’s when you need tools designed to orchestrate higher-level workflows. However, you don’t have to give up the ease of FaaS.
Traditionally, when developers orchestrated complex workflows, they used databases and queues to track application state. Whether you run on traditional infrastructure or cloud services, this approach requires you to write, test and manage scalable logic to checkpoint and manage state, handle retries and recover from failures.
Message queues offer a level of durability by decoupling downstream users from the source data. But managing this extra infrastructure — and the intricate logic around it to achieve fault tolerance — adds time and complexity. By comparison, you can use Azure Durable Functions to take care of the details for you, including state tracking, automatic retries, recovery from failure and load balancing tasks. You can focus on the business logic while getting the added benefits of a managed infrastructure, automatic scaling and pay-per-use pricing.
Durable Functions gives you the best of both worlds by extending the FaaS benefits of the Azure Functions platform. In essence, you can write code to build complex, stateful workflows in a serverless compute environment. You can power incredibly sophisticated stateful scenarios without the overhead of traditional development models.
Complex Stateful Executions Simplified
Durable Functions can retain state between function calls. That means you can use Durable Functions to power software patterns that single-purpose functions aren’t designed for. For example, you might want to design an application to wait for human interaction or use parallel processing to improve concurrency and speed processing time.
Fan-out/fan-in is a common pattern, where multiple functions can execute in parallel, fanning out across multiple machines (in addition to the one used by your app) and increasing throughput. Then they fan in, combining the results.
Single-purpose functions can fan out if you have them send multiple messages to a queue. But to fan in, you have to write code to track when the queue-triggered functions finish and then store their outputs, and it quickly gets complicated. With Durable Functions, you can achieve the same result with relatively simple code.
For example, the following code concisely implements the fan-out/fan-in pattern.
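A minimal sketch of that orchestrator, assuming the Python programming model for Durable Functions and hypothetical activity names (GetWorkItems, ProcessItem, SaveTotal) standing in for your own activities:

```python
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # Fan out: fetch a batch of work items, then start one activity per item.
    work_batch = yield context.call_activity("GetWorkItems", None)
    parallel_tasks = [context.call_activity("ProcessItem", item) for item in work_batch]

    # Fan in: wait for all parallel activities to complete, then aggregate.
    results = yield context.task_all(parallel_tasks)
    total = sum(results)

    yield context.call_activity("SaveTotal", total)
    return total

main = df.Orchestrator.create(orchestrator_function)
```

The framework tracks each parallel task for you; task_all resumes the orchestrator only after every activity has finished, with the outputs collected for aggregation.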
Code isn’t the only way to connect the dots. Microsoft offers a powerful low-code orchestrator in Azure Logic Apps, which features an extensive library of prebuilt connectors for integration with other services.
But the advantage of a code-centric solution like Durable Functions is the added control and customization. With some C#, JavaScript, TypeScript, Python, Java or PowerShell code, you can define and execute more intricate workflows and get precise control over concurrency, error handling and more.
Components of a Durable Functions App
The basic units of work in Durable Functions are the activity functions. These short-lived, stateless functions perform a single task, but you can orchestrate them into more complex processes. For example, an e-commerce site can use an activity function to check inventory, another to charge a customer and another to create a shipment. But to process an entire order, you need a high-level workflow.
That’s the role of the orchestrator function. As the name implies, orchestrator functions describe how tasks are executed and in which order. They work like stateful glue, connecting activities and even other orchestrator functions (sub-orchestrations). The underlying Durable Functions framework implicitly maintains the orchestrator’s state, tracking pending and completed tasks and intermediate results. Because the platform maintains orchestrator state, even if a process crashes, the workflow can resume from the point of failure instead of starting over. Needless to say, this is more efficient in both time and cost.
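To make that concrete, here’s a minimal sketch of an orchestrator for the e-commerce example, again assuming the Python programming model and hypothetical activity names (CheckInventory, ChargeCustomer, CreateShipment):

```python
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    order = context.get_input()

    # Each step is a short-lived activity function. The framework checkpoints
    # after every yield, so a crash resumes from the last completed step.
    in_stock = yield context.call_activity("CheckInventory", order)
    if not in_stock:
        return {"status": "rejected", "reason": "out of stock"}

    payment = yield context.call_activity("ChargeCustomer", order)
    shipment = yield context.call_activity("CreateShipment", order)
    return {"status": "completed", "payment": payment, "shipment": shipment}

main = df.Orchestrator.create(orchestrator_function)
```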
Orchestrators can also wait and listen for external events, making them useful for handling human interaction or other external triggers. In exchange for following a few coding constraints, you can use orchestrator functions to run powerful workflows.
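For instance, the human-interaction pattern can be sketched like this (Python again, with hypothetical activity names RequestApproval, ProcessApproval and Escalate, and an ApprovalEvent raised externally when someone responds):

```python
from datetime import timedelta

import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # Ask a human for approval, then wait for their response or a timeout,
    # whichever comes first.
    yield context.call_activity("RequestApproval", None)

    due_time = context.current_utc_datetime + timedelta(hours=72)
    timeout_task = context.create_timer(due_time)
    approval_task = context.wait_for_external_event("ApprovalEvent")

    winner = yield context.task_any([approval_task, timeout_task])
    if winner == approval_task:
        timeout_task.cancel()
        yield context.call_activity("ProcessApproval", approval_task.result)
    else:
        yield context.call_activity("Escalate", None)

main = df.Orchestrator.create(orchestrator_function)
```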
Serverless Stateful Objects: The Entity Function
Durable Functions gives you another elegant way to introduce statefulness to your application: the entity function. Entity functions make it easy to represent stateful objects in a distributed environment. Like objects, each entity instance has a unique identifier that can be used to read and manipulate its internal state explicitly. And like an object’s interface, an entity defines a set of operations that constrain how its internal state is managed.
For example, entity functions can represent a counter or user object. The following code shows the implementation of a simple Counter entity. There are three operations (add, reset, and get) that can be used to manipulate the state of the Counter entity. In a product like a smartwatch, you could have many instances of these Counter entities, each one storing a particular user’s state, such as step count.
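Here’s a minimal sketch of that Counter entity in the Python programming model (the operation names match the description above; everything else is illustrative):

```python
import azure.durable_functions as df

def entity_function(context: df.DurableEntityContext):
    # Load the entity's current state, defaulting to 0 on first use.
    count = context.get_state(lambda: 0)
    operation = context.operation_name

    if operation == "add":
        count += context.get_input()
    elif operation == "reset":
        count = 0
    elif operation == "get":
        context.set_result(count)

    context.set_state(count)

main = df.Entity.create(entity_function)
```

A client or orchestrator addresses a particular user’s counter by its entity ID (for example, df.EntityId("Counter", "user-123")) and signals the add operation as new steps arrive.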
Durable Entities make managing state easy and intuitive. As another example, you can use Durable Entities to persist and manage the state of Internet of Things (IoT) devices. For more scenarios, see a walkthrough of Durable Entities and serverless circuit breakers with Durable Entities.
When to Use Durable Functions
You can mix and match function types to create sophisticated solutions, such as data processing pipelines; high-scale extract, transform, load (ETL); business workflows; asynchronous APIs and batch jobs. For sample architectures, see:
- Automate document identification, classification and search: In this machine-learning scenario, Durable Functions is used to manage document ingestion and workflow orchestration.
- Implement a banking cloud transformation: An orchestration-based saga pattern handles a bank’s distributed transactions using microservices. Durable orchestrator functions provide the workflow programming model and state management.
- Automate infrastructure reconfiguration: This example uses functions to automatically rotate private IPs when a firewall sits in front of container instances. Checking configuration status is a long-running process, so Durable Functions is used to get status updates.
Next-Level Function as a Service
FaaS has been a game-changer for programmers, and Durable Functions gives you even more control over the development and management of your serverless workflows.
As part of the Azure Functions platform, Durable Functions delivers high-performance scaling and Azure-native integration, plus the power to express your workflows as code. You can run activities or sub-orchestrations in parallel, configure automatic retries with backoff for greater resilience, and easily set timeouts for workflows or cancel them altogether — all while Durable Functions automatically manages workflow state for you.
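As a rough illustration of the retry piece (again a sketch against the Python programming model, with a hypothetical ChargeCustomer activity), a single activity call can opt into automatic retries:

```python
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # Retry the activity up to 4 times, waiting 5 seconds before the first retry.
    retry_options = df.RetryOptions(
        first_retry_interval_in_milliseconds=5000,
        max_number_of_attempts=4,
    )
    result = yield context.call_activity_with_retry(
        "ChargeCustomer", retry_options, context.get_input()
    )
    return result

main = df.Orchestrator.create(orchestrator_function)
```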
Together with the scalability of the Azure Functions platform, you can get end-to-end workflows up and running quickly. To see for yourself, try these examples:
- Create your first Durable Functions.
- Fan-out/fan-in scenario in Durable Functions: Cloud backup example.