Ever needed a "fire and forget" solution where a client calls a C# WebAPI controller that does something long running, but doesn't really care about the result?
For example, say you have a job scheduling system that runs a series of tasks and one of them is a call to a C# WebAPI service to create something like an export of data from a SQL Server table to a file.
Problem
The table is large, so the export ties up the job scheduling system and stops it from running the rest of the tasks in the process, or causes the job scheduling process to time out. And the result of the WebAPI call doesn't matter to the end result of the process, because the WebAPI reports errors another way, such as by email.
Solution
Make the C# WebAPI return an HTTP status code of 200 OK to the calling client, telling it the request is complete, while continuing to work on the request in the background.
For this fire and forget functionality, in your WebAPI controller, do this:
public async Task<IActionResult> Post(string id)
{
    try
    {
        return Ok("Request submitted successfully");
    }
    finally
    {
        Response.OnCompleted(async () =>
        {
            await DoProcessing(id);
        });
    }
}
The key is the "Response.OnCompleted" part, which registers a callback that runs once the response has been sent, so your code keeps executing even after reporting HTTP 200 OK to the client.
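For context, here's a minimal, self-contained sketch of how the whole controller might look, including the usings it needs (the ExportController name and the DoProcessing body are placeholders for illustration, not part of the original example):

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
public class ExportController : ControllerBase   // hypothetical controller name
{
    [HttpPost("{id}")]
    public async Task<IActionResult> Post(string id)
    {
        try
        {
            // The client receives 200 OK as soon as this action returns.
            return Ok("Request submitted successfully");
        }
        finally
        {
            // Response.OnCompleted registers a callback that runs after the
            // response has been sent, so the work continues in the background.
            Response.OnCompleted(async () =>
            {
                await DoProcessing(id);
            });
        }
    }

    // Placeholder for the long-running work (assumed for this sketch).
    private static async Task DoProcessing(string id)
    {
        await Task.Delay(TimeSpan.FromSeconds(30));                    // simulate a slow export
        await File.WriteAllTextAsync($"export-{id}.csv", "col1,col2"); // write the result to a file
    }
}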
Comments
While I think this is a neat solution, it could also be a really good illustration of where message queues are super useful. Receive the request, write a message to a queue, return a response, do the heavy processing asynchronously as the message is dequeued and processed. The nice thing in the message queue scenario is that if processing fails, it's pretty easy to retry by pushing the message back into the active queue from dead letter. In this example if "DoProcessing" fails, it'd probably have to be more hands on to try it again. Definitely has some potentially helpful uses though. Thanks for sharing.
Hi Ryan, thanks for your message. In this part of the system we had a pipeline that executed small tasks after a main process had finished executing to customise the results - and one task we needed to execute was really long running and it wasn't necessary to wait for it to finish (it produces a CSV file) to complete the pipeline. We definitely prefer to use message queues using NServiceBus and RabbitMQ in other parts of the system, for the reason you outlined.
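For anyone curious, here's a rough sketch of the queue-based shape Ryan describes, using an in-process System.Threading.Channels queue purely for illustration (a real setup would use a durable broker such as RabbitMQ with NServiceBus, as mentioned above; all of the type names below are made up):

using System;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Hosting;

public record ExportRequest(string Id);   // hypothetical message type

[ApiController]
[Route("api/[controller]")]
public class QueuedExportController : ControllerBase
{
    private readonly Channel<ExportRequest> _queue;
    public QueuedExportController(Channel<ExportRequest> queue) => _queue = queue;

    [HttpPost("{id}")]
    public async Task<IActionResult> Post(string id)
    {
        // Enqueue the work and return immediately.
        await _queue.Writer.WriteAsync(new ExportRequest(id));
        return Ok("Request queued");
    }
}

// Background consumer that dequeues messages and does the heavy processing.
public class ExportWorker : BackgroundService
{
    private readonly Channel<ExportRequest> _queue;
    public ExportWorker(Channel<ExportRequest> queue) => _queue = queue;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        await foreach (var request in _queue.Reader.ReadAllAsync(stoppingToken))
        {
            // The long-running export goes here; a failed message could be
            // retried or parked in a dead-letter store, as described above.
            await Task.Delay(TimeSpan.FromSeconds(5), stoppingToken);
        }
    }
}

// In Program.cs / Startup, register the shared queue and the worker, e.g.:
// services.AddSingleton(Channel.CreateUnbounded<ExportRequest>());
// services.AddHostedService<ExportWorker>();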
I have a similar situation in my WebAPI. The returned value is an entity, which needs long processing as the SQL query is complex (it could take 10-20 seconds). Is this a viable solution to return a value in the finally branch?
Zoltan, I know this is way after the fact but I thought I'd post anyway. You might be able to get creative and push a response after the fact using something like SignalR. But I'm not totally sure if there would be issues trying to do it within a thread that's associated to an active HTTP pipeline request.
There are some other interesting things you can do by writing to the response stream in stages also. A few years ago I was curious about a multi-stage response where I wanted to render a page back to the end user and continue updating it as different steps were completing before ultimately redirecting the response to somewhere else. This is how I did it.
At the top of my controller I defined these variables:
Notice that the first variable is basically a full static web page and the others are JavaScript blocks that fire functions defined in the static page to add steps to the output, display errors or redirect the browser.
This is what the controller method handling the call looks like:
Basically what's happening in the handler method is: I await Response.WriteAsync(<string here>) every time I want to write something back to the browser. Don't get too confused by the DoAutoEnroll(...) method. It's a method that takes several Func definitions that are fired in the method definition depending on the required logic.
The crux of this working is that the first Response.WriteAsync(...) needs to return a fully functional HTML document, so something complete can be rendered to the end user's browser. Each subsequent Response.WriteAsync(...) actually appends a <script> block to the bottom of the document, and because it contains an immediate function, it fires right after it's rendered, which allows me to inject content back into the main body of the HTML document. Before doing this, I didn't realize that you can actually write <script> blocks after the </html> tag and that they still work fine.
Anyway, this is a weird one, but for this very specific use case I found it useful to keep the end user informed as I performed steps in a process and then finally write back the script block that redirects them to a different location.
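To give a rough idea of the pattern, here's a minimal sketch of a staged response (this is not the original code from the comment above; the markup, route and step contents are invented):

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

public class ProgressController : Controller
{
    // A complete HTML document containing a JS function that later
    // <script> blocks will call to append steps to the page.
    private const string PageShell =
        "<html><body><h1>Working...</h1><ul id='steps'></ul>" +
        "<script>function addStep(t){var li=document.createElement('li');" +
        "li.textContent=t;document.getElementById('steps').appendChild(li);}</script>" +
        "</body></html>";

    [HttpGet("progress")]
    public async Task Progress()
    {
        Response.ContentType = "text/html";

        // First write: a fully functional document the browser can render.
        await Response.WriteAsync(PageShell);
        await Response.Body.FlushAsync();

        // Subsequent writes: <script> blocks appended after </html>; each
        // fires immediately and injects content into the rendered page.
        await DoStepAsync();
        await Response.WriteAsync("<script>addStep('Step 1 complete');</script>");
        await Response.Body.FlushAsync();

        await DoStepAsync();
        await Response.WriteAsync("<script>addStep('Step 2 complete');</script>");
        await Response.Body.FlushAsync();

        // Finally, redirect the browser somewhere else.
        await Response.WriteAsync("<script>window.location='/done';</script>");
    }

    private static Task DoStepAsync() => Task.Delay(TimeSpan.FromSeconds(2)); // stand-in for real work
}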
SignalR is definitely more elegant but gets a little more sticky in a clustered environment. This solution is quick and a little unconventional but came in handy for me.
Hi Zoltan, sorry for the late reply. Unfortunately I don't think so. This method has already returned the result by the time you would finish processing. It's really only for things like exporting data to a file or sending a message to a queue for later processing (where the output of the processing is not returned to the caller).
In my case, the processing time for the long-running task was 1 hour :)
Looks like the ideal solution to what I need. What usings do you need to get these commands (Ok(obj) and Response.OnCompleted) available? I'm in .NET Core 3.1 and have these two (below) already and can't get these methods to build.
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Http;
nm, I figured it out, I was foolishly editing the service file and not the controller file.
It still doesn't work the way it seems like it should: I get the status code back almost immediately, but the response body still takes the same amount of time to return as it normally does.
That's the purpose of this... to get a response back immediately to the caller of the controller with a 200 OK response. At that point the request is done. The API still keeps processing the code though, so it's great for things that don't return a result to the caller, such as outputting data to a file on the network, or processing large database tasks where the database holds the result.
For example, we have used it at work where the request happens as part of a pipeline of requests. The pipeline times out if any of the requests takes too long, and we needed to kick off a request that was very long running - it generates data in a database and runs for many hours. We didn't need to wait for the request to be finished though, because none of the other requests in the pipeline need its output. So we kick off the request, get a 200 OK response back to the caller, and then keep processing the rest of the request pipeline. The API still processes the request though, generating the data in the database.
Wow, even GPT couldn't find this easy solution. I've tried it and it works well. Thanks a lot.
Does this run on a new Threadpool thread after request is served?
I tried this solution, but the API is still waiting for the process to complete before returning the response.