Finnian Anderson

OpenFaaS - deploying serverless functions to Docker Swarm via a CLI

OpenFaaS

Functions as a Service, or OpenFaaS (led by Alex Ellis), is a really neat way of implementing serverless functions with Docker. You can build functions in any programming language and then deploy them to your existing Docker Swarm.

In this post we'll look at a CLI for making this process even easier.

Demo

OpenFaaS highlights

  • The only serverless framework for both Docker Swarm and Kubernetes
  • Easy to use - deploy in 60 seconds, UI and CLI built-in
  • Supports code or binaries in any language on Windows or Linux

Getting up and running

To run functions with OpenFaaS, you'll first need to deploy an OpenFaaS cluster to a Docker Swarm. This is really simple: just follow the steps in the deployment guide.

How it works

The diagram below gives an overview of how the OpenFaaS function packages, the Docker image, and the faas-cli deploy command fit together.

The OpenFaaS CLI makes it easy to deploy your functions!

To deploy a function onto an OpenFaaS stack, you first need to write the function itself. This is really easy and you can do it in any language that runs inside Docker (i.e. all of them). At the time of writing, the OpenFaaS CLI ships templates for Python and Node, so for this example we'll use Python.

My very simple fib function is written in Python:

def fib(n):
    # Naive recursive Fibonacci
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

def handle(st):
    # st is the raw request body passed in on stdin
    n = int(st)

    # Print the first n Fibonacci numbers as a comma-separated list
    output = []
    for i in range(n):
        output.append(str(fib(i)))
    print(', '.join(output))

This code is saved in sample/fib/handler.py. It's important to create a requirements.txt file in the same directory (even if your Python code has no pip dependencies), because otherwise pip install will throw an error during the build.
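Before building, the function directory should therefore look something like this (an empty requirements.txt is fine):

$ tree sample/fib/
sample/fib/
├── handler.py
└── requirements.txt

0 directories, 2 files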

Building the function package

OpenFaaS deploy overview

The way FaaS runs the function is by running this inside Python:

from function import handler
handler.handle(stdin)
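The two-line snippet above is a simplification: the generated index.py reads the request body from stdin and hands it to your handler, which writes its response to stdout. A minimal sketch of that idea (not the actual OpenFaaS template, whose details differ):

# index.py - illustrative sketch only; the real template differs in detail
import sys

from function import handler

# Read the whole request body from stdin and pass it to the user's handler;
# whatever the handler prints becomes the function's response
request_body = sys.stdin.read()
handler.handle(request_body)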

When you build a function with the OpenFaaS CLI, it packages up your function (handler.py) and generates the pieces Docker needs to build and run the container.

The last piece is the stack.yml file, which faas-cli uses to build and deploy the function.

provider:
  name: faas
  gateway: http://localhost:8080 # your faas stack url

functions:
  fib:
    lang: python
    handler: ./sample/fib
    image: developius/func_python-fib
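To unpack that file briefly: gateway is the URL of your OpenFaaS gateway, lang picks the language template, handler points at the directory containing handler.py, and image is the name of the Docker image that faas-cli will build (and which we'll push to a registry shortly).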

To build the package, simply run:

$ faas-cli build -f ./stack.yml

Here is an example package for our func_python-fib function, located inside faas-cli's build directory:

$ tree build/fib/
build/fib/
├── Dockerfile
├── function
│   ├── handler.py         <-- called by index.py
│   └── requirements.txt
├── index.py               <-- Docker runs this
└── requirements.txt

1 directory, 5 files

The package includes everything Docker needs to run our function: the Dockerfile, index.py and handler.py.

Deploying the function

Now that our function is built, we need to deploy it. Before we do that, however, we must push the function's image to Docker Hub so that all the nodes in the swarm can pull it.

$ docker push developius/func_python-fib

Now for the deploy! This can be done with one command:

$ faas-cli deploy -f ./stack.yml

When you run this command, the function will be deployed to the remote Docker Swarm. How cool is that?

Testing

The final thing to do is to check that your function deployed successfully. You can do this via the OpenFaaS UI or with curl.

Testing via the FaaS UI

Or, via curl:

$ curl -d "10" http://localhost:8080/function/fib
0, 1, 1, 2, 3, 5, 8, 13, 21, 34

High Availability via autoscaling

FaaS supports Docker Swarm's replication feature, which allows functions to be auto-scaled directly inside Docker. OpenFaaS uses this internally to make sure enough function replicas are available to handle requests. It works by exposing metrics to Prometheus, which fires alerts when thresholds are crossed; the FaaS gateway then scales the functions up when it receives those alerts.

This can be seen perfectly in the screenshot of my Grafana dashboard below.

Autoscaling within OpenFaaS
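If you want to watch this happen on your own stack, a rough way to trigger the autoscaler is to fire a burst of concurrent requests at the function and watch the replica count climb. A hypothetical load generator (the worker count, request count and gateway URL below are my own assumptions, not anything prescribed by OpenFaaS):

# load_test.py - hypothetical load generator, not part of OpenFaaS
from concurrent.futures import ThreadPoolExecutor
import urllib.request

GATEWAY = "http://localhost:8080"  # assumed gateway URL, as in stack.yml above

def invoke(_):
    # POST "10" to the fib function, just like the earlier curl example
    with urllib.request.urlopen(GATEWAY + "/function/fib", data=b"10") as response:
        return response.read()

# Fire a couple of thousand invocations concurrently so the request-rate
# metrics climb past the alert thresholds
with ThreadPoolExecutor(max_workers=20) as pool:
    list(pool.map(invoke, range(2000)))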

Recap

In this post we've seen how to package and deploy a function using the FaaS framework: how to write a function, build it into a function package, and deploy it to a remote Docker Swarm. Pretty neat stuff.

I've built a number of other functions and hope to expand this post to include more soon. I wrote a function to shorten URLs, using a service I built, and it received a PR from Richard Gee to merge in an armhf Dockerfile for use on Raspberry Pis. This was a big milestone for me, as I've never accepted a PR on one of my projects before!

You can find out more about the projects I've been doing with Docker on my blog at finnian.io/blog - including my DockerCon 2017 writeup! Feel free to get in touch on Twitter @developius and I'd love to hear your thoughts below.

Props to Alex Ellis for creating such an awesome tool - I look forward to participating in the development of FaaS in the future. It's worth noting that FaaS is currently considered experimental, but you can expect more language templates amongst other neat features. First... learn Go. 🙈

Top comments (4)

Ben James

Awesome article! Love the diagrams - they make it super clear :)

Musale Martin

Getting to understand sockets more and more... Nice article.

wannaknowmo

Love the article.
How would you pass multiple variables with curl to a function?

Finnian Anderson

I think the easiest way to do it would be to send the variables as JSON - like below - and then decode it in the function handler:

$ curl -d '{"var1":"thing","var2":"other thing"}' http://localhost/function/fib
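A quick sketch of the handler side of that approach (the key names simply mirror the curl example above):

# handler.py - illustrative sketch of decoding a JSON request body
import json

def handle(st):
    # Parse the JSON body sent by curl
    data = json.loads(st)
    var1 = data["var1"]
    var2 = data["var2"]
    print("Got: " + var1 + " and " + var2)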