In this tutorial, we will cover how we can leverage Appwrite’s Cloud Functions feature to execute tasks in response to events that take place on the server. You can find a complete list of available system events here.
In this example, we will demonstrate how we can integrate with a third-party storage provider like Dropbox to create backups of files uploaded to Appwrite. For the sake of this example, we will be using Dropbox’s Python SDK. A similar concept applies to other API providers like Box or Google Drive. So let’s get started.
The first step is to create a Dropbox Developer account and obtain the Access Token. Now it's time to create the Cloud Function in the Appwrite Console. Head over to the Functions section of your console and select Add Function. You can give your function a funky new name and select the preferred environment. We will be using Python for this example.
The next step is to write the code that will be executed and upload it to the Appwrite Console. Create a directory to hold your Cloud Function, then create your main code file and a `requirements.txt`.
```shell
$ mkdir cloud-functions-demo
$ cd cloud-functions-demo
$ touch main.py
$ touch requirements.txt
```
We will be using two dependencies for this example: the Appwrite Python SDK and the Dropbox Python SDK. Add these to your `requirements.txt`.
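Based on the imports we use later, the file only needs the two SDK packages. The versions are left unpinned here; pin them to the releases you tested against if you prefer reproducible builds:

```text
appwrite
dropbox
```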
We would typically run `pip install` at this stage, but that would install the libraries into the global site-packages directory. We need the libraries installed alongside our code so that they can be packaged easily. Run the following command to install them into a local `.appwrite` directory. Appwrite’s Python environment automatically loads modules from that directory without any special configuration.
```shell
$ PIP_TARGET=./.appwrite pip install -r ./requirements.txt --upgrade --ignore-installed
```
Great! It’s time to start editing the `main.py` file. We start by importing the relevant libraries.
```python
import sys
import json
import os

# Dropbox SDK
import dropbox
from dropbox.files import WriteMode
from dropbox.exceptions import ApiError, AuthError

# Appwrite SDK
from appwrite.client import Client
from appwrite.services.storage import Storage
```
Appwrite's API returns the file contents as binary data, whereas the Dropbox SDK expects a file path, so we will write the download to a temporary file during the function's execution. Let's define the relevant variables.
```python
TOKEN = os.environ['DROPBOX_KEY']
FILENAME = 'my-file.txt'
BACKUPPATH = '/my-file.txt'
```
Now it’s time to set up the Appwrite SDK:
```python
# Setup the Appwrite SDK
client = Client()
client.set_endpoint('http://192.168.1.6/v1')  # Your API Endpoint
client.set_project('5fca866c65afc')           # Your project ID
client.set_key(os.environ["APPWRITE_KEY"])    # Your secret API key
```
Note: Within the Cloud Function, you cannot use localhost to refer to your Appwrite server, because localhost refers to the function’s own runtime environment. Use the private IP of your default network interface instead; you can find it with ifconfig (usually eth0 on Linux or en0 on macOS).
When a function is triggered by an event, we can obtain a lot of metadata about the event from special environment variables exposed by Appwrite. A complete list is available here. In our case, we need the ID of the uploaded file in order to fetch it. Appwrite conveniently exposes this information in an environment variable named `APPWRITE_FUNCTION_EVENT_PAYLOAD`. Let’s parse this JSON string to retrieve the file ID.
```python
# Triggered by the storage.files.create event
payload = json.loads(os.environ["APPWRITE_FUNCTION_EVENT_PAYLOAD"])
fileID = payload["$id"]
```
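To make the payload's shape concrete, here is a self-contained sketch with a hand-written payload. Only the `$id` field is relied on by our function; the `name` field and the ID value are assumptions for illustration, and a real payload contains the full file object:

```python
import json
import os

# Illustrative payload only: in production, Appwrite injects this variable
# when the storage.files.create event fires.
os.environ["APPWRITE_FUNCTION_EVENT_PAYLOAD"] = json.dumps(
    {"$id": "5fce8a8315a1f", "name": "my-file.txt"}
)

# Same parsing logic as the function uses
payload = json.loads(os.environ["APPWRITE_FUNCTION_EVENT_PAYLOAD"])
fileID = payload["$id"]
print(fileID)  # → 5fce8a8315a1f
```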
Using the SDK, let’s fetch the file and save it:
```python
# Create an instance of Appwrite's Storage API
storage = Storage(client)
result = storage.get_file_download(fileID)

# Save the file
with open(FILENAME, "wb") as newFile:
    newFile.write(result)
```
We’re almost done. Now we will set up the Dropbox SDK and upload the file.
```python
# Check if we have the access token
if len(TOKEN) == 0:
    sys.exit("ERROR: Looks like you didn't set the DROPBOX_KEY "
             "environment variable. Add it under your function's "
             "variables and try again.")

# Create an instance of a Dropbox class, which can make requests to the API.
print("Creating a Dropbox object...")
with dropbox.Dropbox(TOKEN) as dbx:
    # Check that the access token is valid
    try:
        dbx.users_get_current_account()
    except AuthError:
        sys.exit("ERROR: Invalid access token; try re-generating an "
                 "access token from the app console on the web.")

    # Create a backup of the downloaded file
    backup()

print("Done!")
```
Let’s now take a look at the `backup()` function. Here we use Dropbox’s `files_upload()` method to upload our file and watch for some specific errors. Note that `backup()` references the module-level `dbx` client, so make sure it is defined above the `with` block in your final `main.py`.
```python
# Uploads contents of FILENAME to Dropbox
def backup():
    with open(FILENAME, 'rb') as f:
        # We use WriteMode('overwrite') to make sure that the contents
        # of the file are replaced on upload
        print("Uploading " + FILENAME + " to Dropbox as " + BACKUPPATH + "...")
        try:
            dbx.files_upload(f.read(), BACKUPPATH, mode=WriteMode('overwrite'))
        except ApiError as err:
            # This checks for the specific error where a user doesn't have
            # enough Dropbox space quota to upload this file
            if (err.error.is_path() and
                    err.error.get_path().reason.is_insufficient_space()):
                sys.exit("ERROR: Cannot back up; insufficient space.")
            elif err.user_message_text:
                print(err.user_message_text)
                sys.exit()
            else:
                print(err)
                sys.exit()
```
Before we can deploy our cloud function, we need to ensure that our directory has the following structure.
```
.
├── .appwrite/
├── main.py
└── requirements.txt
```
There are two ways to deploy your function: using the Appwrite CLI, or using the Appwrite Console.
You can easily deploy your functions using the Appwrite CLI. If you have not already installed it, please go through these instructions first. Once installed, run the following command from the directory containing your cloud function to deploy your tag.
```shell
appwrite functions createTag \
    --functionId=<id> \
    --command='python main.py' \
    --code='.'
```
The function ID can be found on the right side of the overview section of your function.
If deploying manually, we first need to package the function by creating a tarball of our folder.
```shell
$ cd ..
$ tar -zcvf code.tar.gz cloud-functions-demo
```
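Before uploading, it can be worth listing the archive's contents with `tar -t` to confirm the layout is what the runtime expects. The sketch below recreates a throwaway copy of the expected structure purely for illustration; against your real archive, only the final command is needed:

```shell
# Illustration only: rebuild the expected layout and package it
mkdir -p cloud-functions-demo/.appwrite
touch cloud-functions-demo/main.py cloud-functions-demo/requirements.txt
tar -zcf code.tar.gz cloud-functions-demo

# Sanity-check: list the archive's contents before uploading
tar -tzf code.tar.gz
```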
We can now upload this tarball on our function’s dashboard by selecting the Deploy Tag > Manual option. Our entry point command, in this case, would be:
```shell
$ python main.py
```
Once created, we need to define a trigger for the function. In our case, we wish to trigger it whenever a new file is uploaded to the Appwrite server, so we are interested in the `storage.files.create` event. The trigger can be enabled under the Settings tab of the function.
Once the triggers are enabled, it’s time for our final step: Function Variables. Appwrite allows you to securely store secret keys using Function Variables, which are available as environment variables to your program. The best part is that these keys are encrypted and stored securely in Appwrite’s internal DB. In this example, we have used two environment variables, namely DROPBOX_KEY (the Dropbox access token we generated earlier) and APPWRITE_KEY (your Appwrite API key), so let’s add them to the Function Variables. Don’t forget to click the Update option once you’re happy with your settings.
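Inside the function, these variables are read like any other environment variables. A minimal sketch is below; the `setdefault` calls and the demo values only simulate what Appwrite injects, so remove them in your real function:

```python
import os

# Simulate Appwrite's injected Function Variables for local testing only.
os.environ.setdefault("DROPBOX_KEY", "demo-dropbox-token")
os.environ.setdefault("APPWRITE_KEY", "demo-appwrite-key")

# Read the secrets; fail fast with a clear message if either is missing.
dropbox_key = os.environ.get("DROPBOX_KEY", "")
appwrite_key = os.environ.get("APPWRITE_KEY", "")

if not dropbox_key or not appwrite_key:
    raise SystemExit("ERROR: DROPBOX_KEY and APPWRITE_KEY must be set.")

print("Both keys loaded")
```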
Great! We’re done with all the setup. All that’s left now is to test the Cloud Function.
Now it’s time to test your shiny new Cloud Function! Head over to the Storage section of Appwrite and create a new file by clicking on the ‘+’ button at the bottom right. Choose a text file (or any other file, but be sure to update the filenames in the code example accordingly) and click Create.
Your Cloud Function should now have been triggered. You can check it out by heading over to
Functions > Your Function Name > Logs
Once the execution is complete, you can check the response from the API.
And in a few simple steps, we successfully deployed our first Cloud Function. The possibilities with Cloud Functions are endless! Stay tuned for more Cloud Function ideas from the Appwrite Team.
- You can find the complete code sample and lots of other demos in our Cloud Functions Demo Repo.
- Check out Appwrite's Github Repo.
- Our Discord Server is the place to be if you ever get stuck.
- You can find all our Documentation here.