In cloud-native development, Azure Durable Functions and Durable Entities provide a new and interesting way to build robust, scalable workflows. Moving away from traditional development complexities, this "code-first" approach enhances productivity, streamlines onboarding, and improves code quality. In this article, we'll explore how these tools can transform your business operations, making development faster and more efficient.
Traditional approaches vs. modern cloud-native development
Traditionally, building business processes involves a highly database-centric approach. Development typically begins with a comprehensive database design, setting up schemas and tables before moving on to creating classes and properties. These classes are then mapped to the database schema using Object-Relational Mappers (ORMs) such as Entity Framework, NHibernate or Dapper. This method, while effective, often leads to a tightly coupled system where the database structure heavily influences the application design.
Developers have to manually implement dehydration and hydration logic to save and restore object states. Even when using an ORM, this means writing a lot of boilerplate code to ensure that the application can persist and reload state, adding complexity and increasing the likelihood of errors. State has to be carefully managed to ensure data consistency, especially in long-running processes involving multiple steps and decision points.
Another tried and tested way of implementing business processes is to use an integration and orchestration engine, such as Microsoft BizTalk Server or Mule ESB, where you design your business process using a "point and click" approach. For each step or transition in the process, you then hook up the necessary code or use pre-made plugins to perform the actual work.
In contrast, the modern approach with Azure Durable Functions and Durable Entities represents a shift in mindset. Developers can focus on writing business logic using a code-first approach. This method abstracts away the underlying infrastructure, allowing developers to concentrate on the functionality rather than the setup. The serverless nature of Azure Functions means that there is no need to manage servers; the platform handles automatic scaling and infrastructure concerns, adapting seamlessly to varying loads and demands.
Automatic state management is one of the standout features of Durable Entities. Unlike the manual state management of the past, Durable Entities persist state automatically. They use event sourcing to ensure that the state can be reconstructed from its history without requiring manual intervention. This not only simplifies development but also reduces the potential for errors and makes state management more reliable and efficient.
Example implementation
Here is an example of a business process orchestration using Azure Durable Functions and Durable Entities. The example involves a time report submitted by an employee, which waits for approval from a manager. If denied, the employee can fix errors and resubmit until it gets approved and sent to a payout service.
Step-by-Step Explanation
- Orchestration Function: Manages the workflow of the time report process.
- Durable Entity: Represents the state of the time report.
- Approval Process: Simulates manager approval (external call).
- Payout Process: Simulates sending the approved report to a payout service (external call).
Durable Entity for Time Report
First, let's create a durable entity to hold the state of the time report:
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Newtonsoft.Json;
public class TimeReportEntity
{
    [JsonProperty("status")]
    public string Status { get; set; }

    [JsonProperty("reportDetails")]
    public string ReportDetails { get; set; }

    public void SubmitReport(string reportDetails)
    {
        ReportDetails = reportDetails;
        Status = "Submitted";
    }

    public void ApproveReport()
    {
        Status = "Approved";
    }

    public void DenyReport()
    {
        Status = "Denied";
    }

    [FunctionName(nameof(TimeReportEntity))]
    public static Task Run([EntityTrigger] IDurableEntityContext ctx)
        => ctx.DispatchAsync<TimeReportEntity>();
}
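Because the entity state is persisted automatically, it can also be inspected from outside the orchestration. As a small sketch (the GetTimeReportStatus function and its route are additions for illustration, not part of the process above), a client function can read the current state of a report with IDurableEntityClient.ReadEntityStateAsync:

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class GetTimeReportStatus
{
    [FunctionName("GetTimeReportStatus")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "timereports/{reportId}")] HttpRequest req,
        string reportId,
        [DurableClient] IDurableEntityClient client)
    {
        // Read the current entity state; no manual hydration code is needed
        var entityId = new EntityId(nameof(TimeReportEntity), reportId);
        EntityStateResponse<TimeReportEntity> state = await client.ReadEntityStateAsync<TimeReportEntity>(entityId);

        if (!state.EntityExists)
        {
            return new NotFoundResult();
        }

        return new OkObjectResult(new { state.EntityState.Status, state.EntityState.ReportDetails });
    }
}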
Orchestration Function
Next, we create the orchestration function to manage the process:
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Extensions.Logging;
public static class TimeReportOrchestration
{
    [FunctionName("TimeReportOrchestration")]
    public static async Task RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context, ILogger log)
    {
        string reportId = context.GetInput<string>();
        EntityId entityId = new(nameof(TimeReportEntity), reportId);

        bool approved = false;
        while (!approved)
        {
            // Submit the report
            string reportDetails = await context.CallActivityAsync<string>("GetTimeReportDetails", null);
            await context.CallEntityAsync(entityId, "SubmitReport", reportDetails);

            // Simulate manager approval process
            string approvalStatus = await context.CallActivityAsync<string>("RequestApproval", reportId);
            if (approvalStatus == "Approved")
            {
                await context.CallEntityAsync(entityId, "ApproveReport");

                // Exit the loop
                approved = true;
            }
            else
            {
                await context.CallEntityAsync(entityId, "DenyReport");

                // The employee will correct and resubmit the report.
                // Wait for a period of time before checking for resubmission.
                var retryInterval = TimeSpan.FromHours(1);
                await context.CreateTimer(context.CurrentUtcDateTime.Add(retryInterval), CancellationToken.None);
            }
        }

        // Simulate sending the report to payout service
        await context.CallActivityAsync("SendToPayoutService", reportId);
    }
}
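The hourly timer above is a simple polling approach. A common variation is to let the orchestration pause until the employee actually resubmits, using an external event instead of a timer. As a rough sketch (the "ReportResubmitted" event name is an assumption for illustration), the else branch could instead look like this:

else
{
    await context.CallEntityAsync(entityId, "DenyReport");

    // Suspend the orchestration until the employee resubmits, instead of polling on a timer.
    // The event would be raised from client code, for example:
    // await client.RaiseEventAsync(instanceId, "ReportResubmitted");
    await context.WaitForExternalEvent("ReportResubmitted");
}

The orchestration then continues at the top of the loop, submitting the corrected report and requesting approval again.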
Activity Functions
Finally, we implement the activities for getting time report details, requesting approval, and sending to the payout service:
using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Extensions.Logging;
public static class TimeReportActivities
{
    [FunctionName("GetTimeReportDetails")]
    public static Task<string> GetTimeReportDetails([ActivityTrigger] string name, ILogger log)
    {
        // Simulate getting time report details
        return Task.FromResult("Time report details for employee.");
    }

    [FunctionName("RequestApproval")]
    public static Task<string> RequestApproval([ActivityTrigger] string reportId, ILogger log)
    {
        // Simulate manager approval process
        // This could involve an external API call or manual process
        string approvalStatus = new Random().Next(0, 2) == 0 ? "Approved" : "Denied";
        return Task.FromResult(approvalStatus);
    }

    [FunctionName("SendToPayoutService")]
    public static Task SendToPayoutService([ActivityTrigger] string reportId, ILogger log)
    {
        // Simulate sending the approved report to the payout service
        log.LogInformation($"Report {reportId} sent to payout service.");
        return Task.CompletedTask;
    }
}
Example Usage
Once the functions and entities above are deployed to an Azure Function App, we can start the orchestration using an HTTP-triggered function or any other trigger mechanism, for example a queue message or an incoming file from a time reporting system:
using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
public static class HttpStart
{
    [FunctionName("HttpStart")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req,
        [DurableClient] IDurableOrchestrationClient starter)
    {
        // Generate a unique ID for the time report
        string reportId = Guid.NewGuid().ToString();

        // Start the orchestration, passing the report ID as input.
        // Note: the two-argument string overload of StartNewAsync treats the second argument
        // as the instance ID, so we pass the input explicitly and reuse the report ID as the
        // orchestration instance ID.
        await starter.StartNewAsync("TimeReportOrchestration", reportId, reportId);

        return new OkObjectResult($"Orchestration started with ID = '{reportId}'.");
    }
}
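Since the orchestration is started with the report ID as its instance ID, its progress can also be queried afterwards. Here is a minimal sketch (the HttpStatus function and its route are illustrative additions) that uses IDurableOrchestrationClient.GetStatusAsync:

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class HttpStatus
{
    [FunctionName("HttpStatus")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "orchestrations/{instanceId}")] HttpRequest req,
        string instanceId,
        [DurableClient] IDurableOrchestrationClient client)
    {
        // Query the runtime status of a specific orchestration instance
        DurableOrchestrationStatus status = await client.GetStatusAsync(instanceId);
        if (status == null)
        {
            return new NotFoundResult();
        }

        return new OkObjectResult(new { status.RuntimeStatus, status.CreatedTime, status.LastUpdatedTime });
    }
}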
This example demonstrates how to use Azure Durable Functions and Durable Entities to manage a business process involving a time report submission, manager approval, and subsequent payout. The orchestration function handles the workflow, leveraging durable entities for state management and activity functions for individual tasks. By adopting this approach, we can streamline complex business processes, ensuring they are both maintainable and scalable.
Enhancing Productivity and Onboarding
Developing with Azure Durable Functions and Durable Entities can simplify the implementation process, starting with the use of the async/await syntax in C#. This allows developers to write code that is straightforward to read and maintain, significantly reducing the complexity typically associated with implementing business process orchestration.
Durable Functions take this simplicity further by enabling declarative definitions of workflows and state machines. This means that business logic can be expressed in a clear, concise manner, making it easier for teams to understand and modify the workflow as needed. The declarative approach aligns well with how business processes are typically conceptualized, bridging the gap between technical implementation and business requirements.
Improved code quality is another major benefit of using Durable Functions. The structured nature of these functions makes it easier to write unit and integration tests, ensuring that the code is robust and reliable. Higher test coverage leads to fewer bugs and more confidence in the system's stability. Also, following Azure's best practices promotes a consistent approach across the team, resulting in code that is not only easier to maintain but also more reliable in the long run.
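As a sketch of what such a test can look like (assuming xUnit and Moq, which are not part of the example above), the orchestrator can be exercised against a mocked IDurableOrchestrationContext and verified to reach the payout step:

using System.Threading.Tasks;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Extensions.Logging;
using Moq;
using Xunit;

public class TimeReportOrchestrationTests
{
    [Fact]
    public async Task Approved_report_is_sent_to_payout()
    {
        // Arrange: mock the orchestration context and the activity results
        var context = new Mock<IDurableOrchestrationContext>();
        context.Setup(c => c.GetInput<string>()).Returns("report-123");
        context.Setup(c => c.CallActivityAsync<string>("GetTimeReportDetails", It.IsAny<object>()))
               .ReturnsAsync("Time report details for employee.");
        context.Setup(c => c.CallActivityAsync<string>("RequestApproval", It.IsAny<object>()))
               .ReturnsAsync("Approved");

        // Act: run the orchestrator function directly
        await TimeReportOrchestration.RunOrchestrator(context.Object, Mock.Of<ILogger>());

        // Assert: an approved report ends up at the payout service
        context.Verify(c => c.CallActivityAsync("SendToPayoutService", "report-123"), Times.Once);
    }
}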
Onboarding new developers also becomes easier, thanks to the readability of the process orchestration code. New team members can quickly get up to speed with the code base, assuming they have experience with modern C# and .NET. By leveraging these modern tools and approaches, teams can significantly boost productivity, streamline onboarding, and ensure that their code is of high quality, all while simplifying the development process.
Middleware for Security, Monitoring, and Error Handling
Middleware is a fundamental concept for both ASP.NET web server code and Azure Functions, providing a structured way to handle cross-cutting concerns such as security, monitoring, and error handling.
Security
For security, middleware can centralize authentication and authorization processes. This ensures all functions comply with security policies without requiring individual developers to implement these checks manually. One of the most common approaches is using OAuth for secure authentication and authorization. Azure Functions can integrate with Microsoft Entra ID (previously Azure Active Directory) to provide centralized identity management and role-based access control (RBAC).
For example, Azure Functions can use the Microsoft.Identity.Web library to facilitate authentication and token acquisition, enabling secure API calls on behalf of the authenticated user. Additionally, you can configure the functions to use the JWT Bearer token middleware, which validates tokens and enforces authorization policies before the function executes.
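As a rough sketch of that kind of setup (shown here in ASP.NET Core style registration, assuming the Microsoft.Identity.Web package and an "AzureAd" configuration section; the exact wiring depends on your Functions hosting model):

using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.Identity.Web;

// In Startup.cs or Program.cs: validate incoming JWT Bearer tokens against Entra ID
services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
        .AddMicrosoftIdentityWebApi(configuration.GetSection("AzureAd"));

// Later in the request pipeline
app.UseAuthentication();
app.UseAuthorization();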
Development teams often write their own middleware for custom scenarios, such as logging user activity or enforcing additional security checks beyond what's provided out-of-the-box.
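For the .NET isolated worker model (a different hosting model than the in-process functions shown earlier), such custom middleware can be a small class implementing IFunctionsWorkerMiddleware. Here is a sketch of a middleware that logs every invocation (the class name is illustrative):

using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Middleware;
using Microsoft.Extensions.Logging;

public class UserActivityLoggingMiddleware : IFunctionsWorkerMiddleware
{
    public async Task Invoke(FunctionContext context, FunctionExecutionDelegate next)
    {
        var logger = context.GetLogger<UserActivityLoggingMiddleware>();

        // Log which function is about to execute
        logger.LogInformation("Function {FunctionName} invoked.", context.FunctionDefinition.Name);

        // Continue to the next middleware and ultimately the function itself
        await next(context);
    }
}

// Registered when building the host:
// new HostBuilder()
//     .ConfigureFunctionsWorkerDefaults(builder => builder.UseMiddleware<UserActivityLoggingMiddleware>())
//     .Build();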
Monitoring
Monitoring is another critical aspect where middleware is useful. Azure Monitor collects and analyzes telemetry data from Azure Functions, providing insights into application performance and reliability. Application Insights, which integrates seamlessly with Azure Functions, offers detailed monitoring including request rates, response times, failure rates, and dependency tracking. This integration can be enabled with basically just a configuration setting, ensuring that all telemetry data is automatically collected and available for analysis.
For custom scenarios, developers might implement middleware to log specific business process details or to integrate with other monitoring tools. For instance, if Application Insights doesn't automatically track certain dependencies, developers can use the TrackDependency API to manually log these dependencies, ensuring comprehensive monitoring coverage.
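A minimal sketch of such manual tracking (assuming _telemetryClient is an injected TelemetryClient, as in the monitoring example further down, and "PayoutService" is just an illustrative dependency name):

using System;
using System.Diagnostics;
using Microsoft.ApplicationInsights;

var startTime = DateTimeOffset.UtcNow;
var timer = Stopwatch.StartNew();
bool success = false;
try
{
    // Call the external payout system here (a dependency Application Insights does not track automatically)
    success = true;
}
finally
{
    // Record the call so it shows up in the Application Insights dependency views
    _telemetryClient.TrackDependency("HTTP", "PayoutService", "SendTimeReport", startTime, timer.Elapsed, success);
}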
Error Handling
Middleware also plays a crucial role in error handling. Global exception handling middleware ensures that all exceptions are logged and managed consistently across functions. This simplifies the error-handling process and ensures uniformity. Built-in support for retries and circuit breaker patterns in Azure Functions helps manage transient faults and prevent cascading failures, enhancing the resilience of your application.
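In Durable Functions, the retry support shows up directly in the orchestration API. As a small sketch (the retry settings are illustrative, and context and reportId come from the orchestration shown earlier), the payout call could be wrapped in a retry policy:

// Retry the payout activity up to 5 times, starting with a 10 second delay and doubling the delay each attempt
var retryOptions = new RetryOptions(firstRetryInterval: TimeSpan.FromSeconds(10), maxNumberOfAttempts: 5)
{
    BackoffCoefficient = 2.0
};

await context.CallActivityWithRetryAsync("SendToPayoutService", retryOptions, reportId);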
For example, a custom middleware could be written to catch all unhandled exceptions, log them to a central logging service, and return a standardized error response to the client. This approach ensures that all functions have consistent error-handling behavior without requiring repetitive error-handling code in each function.
Using middleware in these ways not only streamlines development but also ensures that essential aspects like security, monitoring, and error handling are robust and managed efficiently. By leveraging both built-in solutions from Microsoft and custom middleware, developers can focus more on business logic, confident that the foundational concerns are well taken care of.
Example implementations
Centralized Error Handling:
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;

public class ErrorHandlerMiddleware
{
    private readonly RequestDelegate _next;
    private readonly ILogger<ErrorHandlerMiddleware> _logger;

    public ErrorHandlerMiddleware(RequestDelegate next, ILogger<ErrorHandlerMiddleware> logger)
    {
        _next = next;
        _logger = logger;
    }

    public async Task Invoke(HttpContext context)
    {
        try
        {
            await _next(context);
        }
        catch (Exception ex)
        {
            // Log once, centrally, and return a uniform error response
            _logger.LogError(ex, "An unhandled exception occurred.");
            context.Response.StatusCode = 500;
            await context.Response.WriteAsync("An internal server error occurred.");
        }
    }
}

// In Startup.cs or Program.cs
app.UseMiddleware<ErrorHandlerMiddleware>();
Monitoring with Application Insights:
using System.Threading.Tasks;
using Microsoft.ApplicationInsights;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public class FunctionApp
{
    private readonly TelemetryClient _telemetryClient;

    public FunctionApp(TelemetryClient telemetryClient)
    {
        _telemetryClient = telemetryClient;
    }

    [FunctionName("MonitoredFunction")]
    public async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        _telemetryClient.TrackEvent("MonitoredFunctionInvoked");

        // Business logic here

        _telemetryClient.TrackEvent("MonitoredFunctionCompleted");
        return new OkResult();
    }
}
Conclusion
Building with Azure Durable Functions and Durable Entities enables a modern, cloud-native approach to implementing business processes. Compared to traditional development methods, this method offers enhanced productivity, faster onboarding, and improved code quality. By adding middleware for security, monitoring, and error handling, developers can focus on business logic while ensuring robust, secure, and maintainable applications. This shift not only simplifies development but also aligns with the dynamic needs of today's business environments, allowing for rapid adaptation and growth.
Leveraging cloud-native infrastructure also provides advantages. Azure's cloud infrastructure offers built-in redundancy, high availability, and automated scaling. This reduces the operational burden on developers or operations teams, who no longer need to worry about hardware failures or capacity planning. The cloud environment ensures that applications are resilient and can handle varying loads with ease, providing a more robust and reliable foundation for business processes.