DEV Community




Firestore (Native/Datastore) triggers Cloud Run


With Firebase's Cloud Firestore, Cloud Functions can be triggered by a write to a document at a specific path and then perform arbitrary processing. I wondered whether something similar could be done by launching Cloud Run instead, and it seemed worth writing about. Unless otherwise stated, I'll be talking about Native mode.

I became interested when I heard Eventarc's presentation at Google Cloud's Open Cloud Summit, which led me to actually try it out.


I want to be able to do at least the same things as Firestore-triggered Functions:

  • Triggers can be set for specific collections.
  • Create/Update/Delete (or Write) as trigger events.


  • Satisfies the above requirements
    • Firestore Data Access Audit Log → Cloud Logging Log Router Sink → Pub/Sub → Cloud Run
  • Does not satisfy them
    • Eventarc's Firestore Write trigger → Cloud Run
  • A workaround I used before the Audit Log was released
    • Firestore-triggered Functions → Cloud Run

I will explain each method (except the last one).


First of all, Firestore's data access audit logs were recently released in preview; the first and second methods build on them. Note that since this is a preview, the specifications may change.

Using Eventarc

I'll start by explaining how I tried this and the problem I ran into. If you are not interested, skip to the next section.

Configuration and Operation

The official blog post about Eventarc is below. In a nutshell, you can use an Audit Log or a Pub/Sub topic as a source (trigger) and have events delivered to Cloud Run in CloudEvents format. CloudEvents is a specification for describing event data in event-driven architectures.

To configure Eventarc, go to the Cloud Run service page, open the triggers tab, and you'll see an "Add Eventarc Trigger" button that opens various settings. You can select the service and method to trigger on from among the many available ones, specify which resource or region to watch, and set the service account used when invoking the service. Finally, you can specify the request path on this Cloud Run service.

This is what I got when I logged the body and headers of the request sent by Eventarc, on a server I set up for testing. I'm omitting or masking information such as YOUR_*. The following is the result of creating a document in /samples/:id from the Firestore (Native) console.

{
  "body": {
    "resource": {
      "labels": {
        "service": "",
        "project_id": "YOUR_PROJECT_ID",
        "method": "google.firestore.v1.Firestore.Write"
      },
      "type": "audited_resource"
    },
    "logName": "projects/YOUR_PROJECT_ID/logs/",
    "protoPayload": {
      "status": {},
      "requestMetadata": {
        "destinationAttributes": {},
        "callerSuppliedUserAgent": "YOUR_UA",
        "requestAttributes": {
          "auth": {},
          "time": "2021-09-17T01:51:32.185016Z"
        },
        "callerIp": "YOUR_IP"
      },
      "authenticationInfo": {
        "principalEmail": "YOUR_EMAIL_ADDRESS"
      },
      "serviceName": "",
      "resourceName": "projects/YOUR_PROJECT_ID/databases/(default)",
      "serviceData": {},
      "methodName": "google.firestore.v1.Firestore.Write",
      "metadata": {
        "@type": ""
      },
      "authorizationInfo": [
        {
          "granted": true,
          "permission": "datastore.entities.create",
          "resourceAttributes": {},
          "resource": "projects/YOUR_PROJECT_ID/databases/"
        },
        {
          "granted": true,
          "resource": "projects/YOUR_PROJECT_ID/databases/",
          "permission": "datastore.entities.update",
          "resourceAttributes": {}
        }
      ],
      "request": {
        "database": "projects/YOUR_PROJECT_ID/databases/(default)",
        "@type": "",
        "writes": [
          {
            "update": {
              "name": "projects/YOUR_PROJECT_ID/databases/(default)/documents/samples/ewMoAiRAA1t43S1PnSAv"
            }
          }
        ]
      }
    },
    "insertId": "-5jrcl3eizimu",
    "severity": "INFO",
    "timestamp": "2021-09-17T01:51:32.162978Z",
    "receiveTimestamp": "2021-09-17T01:51:32.436039856Z"
  },
  "headers": {
    "ce-methodname": "google.firestore.v1.Firestore.Write",
    "accept": "application/json",
    "x-forwarded-proto": "https",
    "ce-id": "projects/YOUR_PROJECT_ID/logs/",
    "ce-dataschema": "",
    "traceparent": "00-8c6fc1640a9b6125909293da6aa41d0f-d0fbedd2fab6765c-01",
    "content-type": "application/json; charset=utf-8",
    "from": "",
    "ce-specversion": "1.0",
    "ce-subject": "",
    "forwarded": "for=\"\";proto=https",
    "host": "",
    "x-cloud-trace-context": "8c6fc1640a9b6125909293da6aa41d0f/15058891269448562268;o=1",
    "content-length": "1566",
    "accept-encoding": "gzip, deflate, br",
    "ce-recordedtime": "2021-09-17T01:51:32.162978Z",
    "ce-time": "2021-09-17T01:51:32.436039856Z",
    "ce-type": "",
    "ce-servicename": "",
    "user-agent": "APIs-Google; (+",
    "x-forwarded-for": "",
    "ce-resourcename": "projects/YOUR_PROJECT_ID/databases/(default)",
    "authorization": "Bearer YOUR_JWT"
  }
}


The body contains the Firestore Audit Log entry. The ce-* headers are specific to CloudEvents. The authorization header carries a JWT, so the application can validate it to verify the caller (if you allow unauthenticated invocations).
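As a sketch of how a handler might pull useful fields out of such a request: the header and body shapes follow the dump above, but the function name and the sample values below are my own, not part of any official API.

```python
def parse_audit_event(headers: dict, body: dict) -> dict:
    """Extract the method name, resource, and written document paths
    from an Eventarc-delivered Firestore Audit Log entry."""
    payload = body.get("protoPayload", {})
    writes = payload.get("request", {}).get("writes", [])
    return {
        # Prefer the CloudEvents header, fall back to the body.
        "method": headers.get("ce-methodname") or payload.get("methodName"),
        "resource": headers.get("ce-resourcename"),
        # Each write's update.name is the full document path.
        "documents": [w["update"]["name"] for w in writes if "update" in w],
    }

# Demo with values shaped like the dump above.
headers = {
    "ce-methodname": "google.firestore.v1.Firestore.Write",
    "ce-resourcename": "projects/p/databases/(default)",
}
body = {"protoPayload": {"request": {"writes": [
    {"update": {"name": "projects/p/databases/(default)/documents/samples/abc"}},
]}}}
event = parse_audit_event(headers, body)
print(event["documents"])
# → ['projects/p/databases/(default)/documents/samples/abc']
```

From here the handler can branch on `event["method"]` and the document path, which matters for the filtering problem discussed next.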

Difficulty points

It is difficult to set a trigger for a specific collection. You might think you can do it by specifying a "Specific resource" on the configuration screen, but this only filters on protoPayload.resourceName in the request body, which for Firestore is the database, not the document. As a result, every document write operation is delivered to a single endpoint. If you're thinking "my service doesn't have many writes, so I'll handle it on the application side," that's fine.
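For that application-side handling, a minimal sketch assuming you only care about top-level documents in a samples collection; the helper name and regex are mine, mirroring the document path format in the dump above:

```python
import re

# Since every document write hits the same endpoint, filter on the written
# document's full resource name instead.
SAMPLES_DOC = re.compile(
    r"^projects/[^/]+/databases/\(default\)/documents/samples/[^/]+$")

def is_samples_write(document_name: str) -> bool:
    """True only for top-level documents in the samples collection."""
    return bool(SAMPLES_DOC.match(document_name))

print(is_samples_write(
    "projects/p/databases/(default)/documents/samples/abc"))  # → True
print(is_samples_write(
    "projects/p/databases/(default)/documents/other/abc"))    # → False
```

The `[^/]+$` anchor also keeps writes to subcollection documents under /samples/:id from matching.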

This may be a problem specific to the current situation, but I think it may be improved in the future.

Using Cloud Logging's Log Router Sink (how it worked)

Since the request body shown above flows through Cloud Logging as an Audit Log, you can create a filter that matches the writes you want to pick up, route them to Pub/Sub, and have Cloud Run receive them from there.

First, enable the Firestore Data Access Audit Log.

Next, create a Pub/Sub topic and a push subscription to it, with the push endpoint set to your own server.

Finally, in Cloud Logging, create a Logging Sink. The topic should be the one you created in 👆.

The filter should look something like 👇 so that it picks up only Creates in /samples/:id. The detailed notation is documented officially. (One annoying thing is that currentDocument.exists may or may not be present, depending on whether the write came from the Admin SDK or the Client SDK, so the filter has to handle both cases.)


protoPayload.methodName = "google.firestore.v1.Firestore.Write" AND
protoPayload.request.writes.update.name =~ "projects\/YOUR_PROJECT_ID\/databases\/\(default\)\/documents\/samples\/[^/]+$" AND
(NOT protoPayload.request.writes.currentDocument:* OR protoPayload.request.writes.currentDocument.exists = false)


Once everything is set up, the following request body will be sent when you create a Firestore document.

{
  "message": {
    "data": "[Audit Log encoded in BASE64]",
    "messageId": "3065956847068101",
    "publishTime": "2021-09-17T05:51:40.637Z",
    "publish_time": "2021-09-17T05:51:40.637Z",
    "message_id": "3065956847068101",
    "attributes": {
      "": "2021-09-17T05:51:39.061628Z"
    }
  },
  "subscription": "projects/YOUR_PROJECT_ID/subscriptions/YOUR_SUBSCRIPTION_ID"
}

That's it. Now you just need to create a sink for each trigger you need. Since that's a lot of manual work, it's better to manage them as code with Terraform and add new ones by copy and paste.


You might be thinking, "Wouldn't it be easier to publish Create/Update/Delete events to Pub/Sub from the Cloud Run application code itself?"
→ That's exactly what I think too (although if you want to run something whenever the data is created by any means, this method is not a bad idea).

What I haven't tried yet

Pub/Sub seems to have a message filtering feature. If you use it, you may not need a sink per trigger; it depends on how expressive the filters are.
