Bernard Wiesner
Laravel as a microservice using GCP PubSub, an alternative to the Laravel queue.

Laravel queue vs PubSub

The Laravel queue is great and easy to use; however, one big disadvantage is that it is not microservice friendly. You need to couple your jobs with your app code and run your queue worker from the same repo.

For example, let's say you have an e-commerce website and you want to process the payment in the background and then send an email to the user if the payment was successful. You would need to write your job inside the same repo as your web app, and when you deploy to production, you would need to deploy both your web app and the batch server running the queue worker with the same repo code.

This use case has several issues. Your web app code is tightly coupled to your background processing jobs, so any change to a job adds risk to your web app release, since they live in the same repo.

By using PubSub we can introduce a scalable microservice architecture that solves the above issues. Let's look at how to implement this.

What is PubSub?

PubSub is a design pattern built around publishers and subscribers. You publish to a topic, and every subscriber listening on that topic gets triggered. As an example:

  • topic: purchase-submitted
    1. subscription: send-processing-email
    2. subscription: charge-user

As you can see above, the topic purchase-submitted is published from your web app when a user completes a purchase. There are two subscriptions subscribed to that topic, and both get triggered the moment the topic is published. The first sends an email notifying the user that their purchase is being processed. The second attempts to charge the user and, if successful, sends a purchase-complete email. The subscriptions are processed in a separate repo, which we can call the PubSub API repo.

Let's see how to set up this architecture using Laravel.

GCP PubSub configuration

You need to have a GCP account and configure your topics and subscriptions on PubSub.

First create a topic:

(Screenshot: pubsub create topic)

Then create the subscriptions and link them to the topic above:

(Screenshot: pubsub create subscription)

Configure your subscription as push and enter your PubSub API endpoint (the dedicated Laravel repo that will process subscriptions):

(Screenshot: pubsub configure subscription)

You can configure additional options for your subscription, such as the acknowledge deadline and retry policy, based on your preference, or leave the defaults.
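If you prefer to script this setup instead of clicking through the console, the official google/cloud-pubsub PHP client can create the same resources. Here is a rough sketch, assuming that library; the project ID, names and endpoint below are placeholders:

    use Google\Cloud\PubSub\PubSubClient;

    $pubsub = new PubSubClient(['projectId' => 'your-gcp-project']);

    // Create the topic the web app will publish to.
    $topic = $pubsub->createTopic('purchase-submitted');

    // Create a push subscription pointing at your PubSub API endpoint,
    // with a 10 second acknowledge deadline.
    $topic->subscribe('send-processing-email', [
        'pushConfig' => [
            'pushEndpoint' => 'https://your-pubsub-api.example.com/api/pubsub/send-processing-email',
        ],
        'ackDeadlineSeconds' => 10,
    ]);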

Web app code

For publishing messages to GCP, you can use GCP's PubSub library for PHP, or you can use my library, which acts as a wrapper around it. The benefit of my library is that it provides a clean, fluent API and lets you easily mock and test your PubSub logic.

I will be using my library in this tutorial. Install it with:

composer require bernardwiesner/laravel-gcp-pubsub

Then you can simply publish to your topic from your app code when a user submits a purchase request:

    use PubSub;
    // ...
    PubSub::topic('purchase-submitted')
        ->publish(['purchase_id' => $purchaseId]);

That is all you need to do in your web app code. There is no need for any of the purchase logic or email-sending logic here; that will all live in your PubSub API, in a different repo.
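Because the wrapper is exposed as a facade (note the use PubSub; import above), you can also mock it in a feature test the usual Laravel way, which is the testability benefit mentioned earlier. A minimal sketch, assuming PubSub behaves like a standard Laravel facade; the test class and the route it hits are just illustrative:

    use PubSub;
    use Tests\TestCase;

    class PurchaseTest extends TestCase
    {
        public function test_it_publishes_a_purchase_submitted_message(): void
        {
            // Swap the facade's underlying instance with a Mockery mock.
            PubSub::shouldReceive('topic')
                ->once()
                ->with('purchase-submitted')
                ->andReturnSelf();

            PubSub::shouldReceive('publish')->once();

            // Hypothetical endpoint that triggers the publish in your web app.
            $this->post('/purchases', ['product_id' => 1])
                ->assertSuccessful();
        }
    }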

PubSub API

You need to create a new repo that acts as an API. This API will be triggered by GCP PubSub when you publish to a topic.

Simply create your API endpoints inside api.php, using the endpoint paths of your subscriptions as configured on GCP:

Route::post('/pubsub/send-processing-email', SendProcessingEmail::class);

Then inside your controller you can process the jobs:

use Illuminate\Http\Request;
use Illuminate\Http\Response as HttpResponse;

class SendProcessingEmail extends Controller
{
    public function __invoke(Request $request): HttpResponse
    {
        // GCP delivers the payload base64-encoded inside the "message" key.
        $message = $request->message;
        $data = json_decode(base64_decode($message['data']), true);
        $purchaseId = $data['purchase_id'];

        // your email sending logic

        // A 2xx response acknowledges the message.
        return response()->noContent(204);
    }
}

As you can see above, you are required to base64-decode the message data; this is how GCP PubSub delivers the payload. Also, on the last line, a successful response code needs to be sent back to GCP to mark the message as processed. If you do not respond with a success response code before the acknowledge deadline configured on your subscription expires, the same message will be sent again by GCP and your job will be triggered again until it succeeds.

For example, if you configured your acknowledge deadline as 10 seconds on your subscription, but your API for sending an email takes more than 10 seconds, your API will be called again by GCP. To avoid duplicate jobs, make sure you configure your ack deadline on GCP PubSub with a reasonable time. Alternatively, you could kill your PHP script if it runs over a certain time, for example:

    protected function registerTimeout(): void
    {
        // Requires the pcntl and posix extensions.
        pcntl_async_signals(true);

        // When the alarm fires, hard-kill the current process.
        pcntl_signal(SIGALRM, function () {
            posix_kill(getmypid(), SIGKILL);
        });

        // Arm the alarm for 10 seconds from now.
        pcntl_alarm(10);
    }

The above code will kill the request if it takes more than 10 seconds, making sure you don't send an email twice. You should match the timeouts on the PHP side with those on the GCP side to avoid duplicate messages.
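For example, you might call the helper at the top of the controller action so the alarm is armed before any slow work starts; whether registerTimeout() lives in the same controller, a base controller or a trait is up to you:

    public function __invoke(Request $request): HttpResponse
    {
        // Arm the 10 second alarm before doing any work, matching the
        // acknowledge deadline configured on the subscription.
        $this->registerTimeout();

        // ... decode the message and send the email as shown earlier ...

        return response()->noContent(204);
    }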

Delay messages

With the Laravel queue you can delay jobs. This is trickier with GCP PubSub, since PubSub triggers all subscriptions at the moment you publish to the topic. However, there are some workarounds.

Using my library, you can use the delaySeconds() API to delay the processing of the job. This adds an additional attribute, available_at, to the message, holding the time at which the job becomes available for processing:

    use PubSub;
    // ...
    PubSub::topic('your-topic')
        ->delaySeconds(30)
        ->publish(['your data'], [
            "your-attribute" => "your-value"
        ]);

Then you need to add a middleware to your PubSub API repo that delays the job if it's not yet ready, by returning a non-success response code:

    public function handle(Request $request, Closure $next)
    {
        // The time set by delaySeconds() on the publisher side.
        $availableAt = (int) ($request->message['attributes']['available_at'] ?? 0);

        // Not ready yet: return a non-success code so GCP redelivers the message later.
        if ($availableAt > time()) {
            return response()->noContent(409);
        }

        return $next($request);
    }


The subscription will keep being retried until available_at has passed and the API returns a success response code.
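For the middleware to actually run, it has to be registered and attached to the pubsub routes. A sketch, assuming a Laravel version that registers route middleware aliases in app/Http/Kernel.php and a middleware class named DelayPubSubMessage (both names are illustrative):

    // app/Http/Kernel.php
    protected $routeMiddleware = [
        // ...
        'pubsub.delay' => \App\Http\Middleware\DelayPubSubMessage::class,
    ];

    // routes/api.php
    Route::post('/pubsub/send-processing-email', SendProcessingEmail::class)
        ->middleware('pubsub.delay');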

Conclusion

By using GCP PubSub and a dedicated Laravel API for processing jobs, you can scale your jobs and benefit from a microservice architecture.

You can also add more subscriptions to your topic and point their push endpoints at any external API you might have, whether it's built with Laravel or any other language. This is great for decoupling and scaling.

Top comments (1)

Matt Skelton • Edited

What is this actually doing? Sending a 409 response to GCP to indicate the task was not consumed? So will GCP immediately push the task again, until the task is completed? Won't this spam your API with GCP requests, or is there a smart delay between attempts?

if ($availableAt > time()) {
     return response()->noContent(409);
}