In one of my previous posts I discussed how to deploy Google Cloud Functions using Azure Pipelines. However, many developers use Firebase Functions, which are the same Google Cloud Functions in disguise, but deployed using the Firebase CLI. I'd like to see how we can do that in the context of Azure DevOps. This time, though, let's make it closer to reality.
Usually we have many environments: Development, Staging, Production, and deployment to each one is governed by multiple rules: UAT tests need to pass, various approvals need to be received, etc. In Azure DevOps this problem is solved using "stages". For a long time "classic" Release Pipelines have supported stages that target deployments to different environments. A fairly new feature is multi-stage YAML pipelines. I wanted to evaluate this feature from several standpoints: whether it supports approvals for stages, whether I can see where the current build is deployed, whether I can re-deploy a previous build, and whether unit test and coverage results can be displayed.
With that in mind, let's build a multi-stage pipeline to deploy a Node.js-based Firebase function to GCP.
In comparison to pure Google Cloud Functions, deploying to Firebase is a breeze. We only need to install the firebase-tools NPM package globally and issue one command, as you'll see below in the pipeline source. The only things we need are a GCP project name and a Firebase authentication token.
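In essence, the deployment boils down to something like this (`<PROJECT_ID>` and `<TOKEN>` are placeholders for your GCP project ID and the CI token generated below):

```shell
# Install the Firebase CLI globally, then deploy only the functions
npm install -g firebase-tools
firebase deploy --only functions --project <PROJECT_ID> --token <TOKEN>
```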
This command generates a token that can be used in CI/CD pipelines:
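Assuming the Firebase CLI is already installed locally:

```shell
# Opens a browser window to authenticate with your Google account and
# prints a token that can be stored as a secret pipeline variable
firebase login:ci
```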
With that information we are ready to move on to Azure DevOps.
Here is the full pipeline source; after that we can discuss its different parts.
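A condensed sketch of what such a multi-stage pipeline can look like (the stage names, artifact name, test script, and variable names like `firebase-project` and `firebase-token` are illustrative assumptions):

```yaml
trigger:
- master

stages:
- stage: Build
  jobs:
  - job: build
    pool:
      vmImage: ubuntu-latest
    steps:
    - script: npm ci
      displayName: Install dependencies
    - script: npm test
      displayName: Run unit tests
    - task: PublishTestResults@2
      inputs:
        testResultsFiles: '**/test-results.xml'
    # Publish the build output so that all deployment stages
    # use exactly the same package
    - publish: $(System.DefaultWorkingDirectory)
      artifact: functions

- stage: DeployStaging
  dependsOn: Build
  jobs:
  - deployment: deploy_staging
    pool:
      vmImage: ubuntu-latest
    environment: staging
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: functions
          - script: |
              npm install -g firebase-tools
              firebase deploy --only functions --project $(firebase-project) --token $(firebase-token)
            workingDirectory: $(Pipeline.Workspace)/functions
            displayName: Deploy to Firebase

- stage: DeployProduction
  dependsOn: DeployStaging
  jobs:
  - deployment: deploy_production
    pool:
      vmImage: ubuntu-latest
    environment: production
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: functions
          - script: |
              npm install -g firebase-tools
              firebase deploy --only functions --project $(firebase-project) --token $(firebase-token)
            workingDirectory: $(Pipeline.Workspace)/functions
            displayName: Deploy to Firebase
```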
As you can see, the pipeline hierarchy is Stage/Job/Steps/Task.
The first stage builds and tests the project and produces a deployment artifact. There is not much of interest here, other than that this stage represents a typical Continuous Integration pipeline in its entirety. The next stages correspond to the Continuous Deployment part.
Here things get more exciting. We can see that a "deployment job" can have several additional parameters, such as:
- environment name - corresponds to the names on the "Environments" page. We'll talk about it later in the "Approvals" section.
- strategy - e.g. runOnce or matrix, which are useful for single- or multi-platform deployments. For every environment we could create a separate stage and provide different variables and deployment options.

One of the first steps is to download the artifact that was published by the "build" stage. That way we can ensure that each deployment stage uses exactly the same build package.
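Zooming in, the skeleton of a deployment job with these parameters looks roughly like this (the environment and artifact names are illustrative):

```yaml
- deployment: deploy_staging
  environment: staging        # must match a name on the "Environments" page
  strategy:
    runOnce:                  # a simple single-platform deployment strategy
      deploy:
        steps:
        - download: current   # fetch the artifact published by the "build" stage
          artifact: functions
```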
Very often we don't want to hard-code environment-specific parameters in the pipeline. That is where variables come into play.
Variables can be specified in several ways. One of them is "pipeline" variables. When editing the pipeline there is a "Variables" tab, as you can see in the screenshot below, where you can add and edit variables. In the pipeline you refer to them via a special syntax, for example $(firebase-project).
Another way is to specify variables in "Variable Groups" on the "Library" page. The advantage of this method is that we can secure access to different groups so that, for example, developers would not be able to see the Production variables.
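A stage can then pull in such a group with a `variables` section (the group names here are hypothetical):

```yaml
- stage: DeployProduction
  variables:
  - group: firebase-production   # variable group defined on the "Library" page
  jobs:
  # ... deployment jobs can now reference the group's variables
```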
In this post we used the new YAML-based multi-stage pipelines to produce a combined CI/CD pipeline for deploying Google Firebase functions. We achieved all of the goals:
- unit testing
- test results
- reproducible builds
- environment-specific variables
- deployment approvals
- viewing where the build is deployed
However, YAML pipelines still have not reached parity with the "classic" release pipelines. For example, the pipeline is stuck in the "Pending" status until the final stage is approved and deployed, which feels awkward and leads to a ridiculous displayed build time of one, two, or more days. Also, at a glance we can only see which stages were completed; we need to drill down to view the environment.
Hopefully, multi-stage pipelines will grow and mature as the cat in the image below did.