This is a simple DBOS example focused on remote deployment to DBOS Cloud, their hosted solution with a generous free tier for developers.
The GitHub repo sets up: 1) a workflow with an HTTP API endpoint that receives events from Supabase when an INSERT is made to a table, 2) the same workflow archives the event payload to a DBOS-hosted Postgres table, and 3) additional endpoints as utilities to view and delete all events during development.
Prerequisites
- Make sure you have Node.js 21.x
- Sign up for DBOS Cloud (https://www.dbos.dev/dbos-cloud)
- Have a Supabase account and a table you are inserting into
Getting Started
- Clone the repository and navigate to the project directory
- Install the dependencies
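A typical sequence, assuming npm as the package manager (substitute your own clone URL and directory):

```bash
git clone <repo-url>
cd <project-directory>
npm install
```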
Be sure not to commit or hard-code your secrets to a public repo! This setup is meant for local development and direct deployment to the DBOS Cloud service.
To deploy to DBOS Cloud, first log in:

```bash
npx dbos-cloud login
```

Follow the instructions to match the UUID shown in the console with the one in the browser; after that, a standard username/password login applies.
Next, provision a database instance:

```bash
npx dbos-cloud db provision database-instance-name -U database-username
```
Register your app with the database instance:

```bash
npx dbos-cloud app register -d database-instance-name
```
To use secrets in DBOS, export your variables in the CLI like this:

```bash
export PGPASSWORD=put-the-password-you-created-when-you-set-up-the-remote-database-here
```
These will be picked up at build time and substituted into the corresponding ${PGPASSWORD} placeholders in dbos-config.yaml.
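For reference, the password field in dbos-config.yaml references the variable like this (an excerpt only; the rest of the file follows the standard DBOS config layout and will differ per project):

```yaml
database:
  password: ${PGPASSWORD}
```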
You will notice on line #56 of operations.ts that there is a @PostApi decorator which sets up a URL with a randomly generated endpoint. This will be the receiver of events from Supabase. I used a randomly generated endpoint to protect the API (sometimes called security by obscurity), as sketched after the note below.
**Note: this is not a recommended method! It was done as a quick-and-dirty dev hack. For real production use, you will want to use DBOS's authentication/authorization features.**
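For orientation, a handler like that might look roughly like the following. This is a sketch, not the repo's actual code: the class name, method name, and body parameters are illustrative, and the random path is the placeholder used elsewhere in this article. The decorator and context types come from the @dbos-inc/dbos-sdk package.

```typescript
import { HandlerContext, PostApi } from '@dbos-inc/dbos-sdk';

export class SupabaseEvents {
  // The random path segment is the only gate here: obscurity, not real auth.
  @PostApi('/OBstAqG6qOv7cWXCqgg')
  static async receiveEvent(ctxt: HandlerContext, type: string, table: string, record: object) {
    ctxt.logger.info(`Received ${type} on table ${table}`);
    // Hand the payload off to the archiving workflow from here.
  }
}
```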
Take the base URL that DBOS returned when the deploy finished and append the randomly generated endpoint you created (instructions are in the code).
It should look something like this: https://foo-bar-dbos.cloud.dbos.dev/OBstAqG6qOv7cWXCqgg. You'll use this in the next step.
The Supabase settings
On the Supabase side, set up the trigger to publish changes externally: 1) Choose the project you want to use, then in the left-hand menu go to "Database" --> "Webhooks". 2) Create a new webhook, give it a name, choose the table to watch, and select the events (in our case INSERT, but UPDATE and DELETE are also available). 3) Make sure the HTTP request method is set to POST, then enter the URL created by DBOS Cloud plus the randomly generated stub. Hit "Create Webhook".
Staying in the Supabase console, go to the "Table Editor", choose your table, and add a new row with the required inputs.
Almost immediately, you should be able to query your specific DBOS-supplied URL, e.g. https://foo-bar-dbos.cloud.dbos.dev/getEventData, from Insomnia, Postman, or curl and see the event.
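For example, with curl (using the placeholder base URL from above):

```bash
curl https://foo-bar-dbos.cloud.dbos.dev/getEventData
```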
For reference, here is the format of a sample event body sent to DBOS Cloud:

```json
{
  "type": "INSERT",
  "table": "fooData",
  "record": {
    "id": "tempId-e71m1dtxan6b5c0dqxkbhoez",
    "email": "joe@company.com",
    "lastName": "Blow",
    "firstName": "Joe",
    "created_at": "2024-07-25T06:03:10.792595+00:00",
    "updated_at": "2024-07-25T06:03:10.792595"
  },
  "schema": "public",
  "old_record": null
}
```
Record fields will vary based on how your table is structured. For simplicity, this example archives the JSON object in its entirety rather than parsing the record further.
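A minimal sketch of what that archiving step could look like as a DBOS transaction, assuming a Knex data source and a hypothetical table named events with a single JSON payload column (names are illustrative, not the repo's actual schema):

```typescript
import { Transaction, TransactionContext } from '@dbos-inc/dbos-sdk';
import { Knex } from 'knex';

export class EventArchive {
  // Store the entire webhook payload as one JSON blob -- no per-field parsing.
  @Transaction()
  static async archiveEvent(ctxt: TransactionContext<Knex>, payload: object) {
    await ctxt.client('events').insert({ payload: JSON.stringify(payload) });
  }
}
```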
Reference Docs (From Official Repo)
- To add more functionality to this application, modify src/operations.ts, then rebuild and redeploy it.
- For a detailed tutorial, check out our programming quickstart.
- To learn how to deploy your application to DBOS Cloud, visit our cloud quickstart.
- To learn more about DBOS, take a look at our documentation or our source code.
Resources to learn more
- The first article in this series uses cron and the Postmark email service provider.
- The second article in this series aggregates and archives data from Supabase tables.