One of my recent projects involved an application that had to be hosted on Google Cloud App Engine. I hadn't used App Engine before, but I had managed apps on Heroku and OpenShift and was interested to see what Google Cloud's PaaS had to offer.
It was a fairly standard Node.js application with most of the configuration done with environment variables. Soon it became clear that this could be a problem – App Engine does not support configurable environment variables.
Intentional or not, App Engine has only one way of defining those variables: in the app.yaml configuration file. This file describes App Engine settings (runtime, URL mappings, etc.), including an env_variables section that instructs App Engine to set environment variables on deployment.
Example app.yaml file content:
runtime: nodejs10

handlers:
  - url: /api/.*
    script: auto
    secure: always

  - url: /.*
    static_files: index.html
    upload: index.html
    secure: always
    http_headers:
      X-Frame-Options: deny
      X-DNS-Prefetch-Control: off
      X-XSS-Protection: 1; mode=block
      X-Permitted-Cross-Domain-Policies: none

env_variables:
  VAR1: 'VALUE1'
  VAR2: 'VALUE2'
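For context, once the app is deployed these values show up to the Node.js process as ordinary environment variables (VAR1/VAR2 being the placeholder names from the example above):

// Values from the env_variables section are exposed via process.env at runtime.
const value1 = process.env.VAR1; // 'VALUE1' after deployment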
Our deployment pipeline was already fully automated, so I needed to store the app.yaml file somewhere to supply it to the build server before pushing code to Google Cloud. Ideally, that would be the application's code repository. However, keeping environment variables in the app.yaml file caused an issue: we either needed to commit application configuration to the repository or leave the whole file untracked. Neither of these options was suitable, so I started looking for other ways of dealing with this App Engine limitation.
As a side note, Heroku and OpenShift (at least its previous incarnation) have an option to set environment variables from the web/command line interface, which simplified application configuration management.
My search brought up some disappointing results:
- Store application configuration in Google Cloud Datastore and read it on application startup (link).
- Encrypt configuration values with Cloud KMS and commit them together with the rest of the app.yaml configuration (link).
- Use separate app.yaml files for different environments (and, I guess, commit them all to the repository?) (same link).
Option #1 meant an additional component in our infrastructure and vendor lock-in to the Google Cloud Datastore database, which was far from ideal.
Option #2 solved the security part of the problem, but would mean hardcoding encrypted environment-specific values into the codebase. It would also require updating the code for each new environment or any change to the existing environment variables. Not ideal.
Option #3 didn't solve the problem at all – the code would still store information about its environments, and application secrets would be available right in the code repository...
Extra step in deployment pipeline
Eventually, I came up with an approach that involved compiling the app.yaml file from a template file during the build process. At that point we used Google Cloud Build as the build server for CI/CD, but quickly moved to GitLab CI since Cloud Build does not support environment variables either.
To be fair, I should mention that Cloud Build supports "substitutions", which are remotely similar to environment variables. Unfortunately, the only way to pass substitutions to the build job is through command line arguments, which means managing environment variables somewhere outside. And this brings us back to the original problem...
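For illustration, passing a substitution to a build this way looks roughly like the following (_VAR1 is just a placeholder; user-defined substitution names in Cloud Build must start with an underscore):

# The value still has to come from somewhere outside the repository,
# e.g. a wrapper script or whoever runs the command.
gcloud builds submit --config cloudbuild.yaml --substitutions=_VAR1=VALUE1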
The application was already using the EJS library, so I used it to compile the template:
# app.tpl.yaml
runtime: nodejs10

env_variables:
  SSO_CLIENT_ID: <%= SSO_CLIENT_ID %>
  SSO_SECRET: <%= SSO_SECRET %>
with a script like this:
// bin/config-compile.js
const fs = require('fs');
const ejs = require('ejs');

// Render the template using the current process environment as template data
// and write the result to the app.yaml that will be deployed.
const template = fs.readFileSync('app.tpl.yaml').toString();
const content = ejs.render(template, process.env);
fs.writeFileSync('app.yaml', content);
and a GitLab CI step similar to this:
# .gitlab-ci.yml
config-compile:
  stage: build
  image: node:10
  script:
    - node bin/config-compile.js
  artifacts:
    paths:
      - app.yaml
    expire_in: 1 day
    when: always
This enabled us to manage application configuration in GitLab environment variables.
The approach can easily be adapted to any other templating library, programming language or build server; the code above is just an example.
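For completeness, a later deploy job can then pick up the generated app.yaml artifact. The job below is only a sketch: the stage name, image and GCP_PROJECT variable are assumptions, and authentication to Google Cloud (e.g. activating a service account) is omitted:

# .gitlab-ci.yml (sketch of a deploy job consuming the compiled app.yaml)
deploy:
  stage: deploy
  image: google/cloud-sdk:slim
  dependencies:
    - config-compile   # pulls in the app.yaml artifact from the previous stage
  script:
    - gcloud app deploy app.yaml --project "$GCP_PROJECT" --quiet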
A note on security
I found it interesting that one of the oldest Google Cloud products doesn't support such common functionality. However, I accept there could be valid security reasons for this, e.g. exposure to a dependency vulnerability similar to the one discovered in rest-client, or typosquatting like in the npm registry. This is especially relevant as App Engine does not provide options to limit or manage outgoing connections from the environment.
Alternative solutions not covered in this post
- "Secrets in Google App Engine" by Stuart Leitch
- "How to use Environment Variables in GCloud App Engine" by Gunar Gessner
Top comments (9)
Just adding to the Cloud Build part; substitutions don't have to be passed as command line args. You can also define them within a Cloud Build Trigger in your project.
The Trigger can then be set to run automatically on repo changes, or manually using 'gcloud beta builds triggers run TRIGGER'.
There's still a problem here of course, that the substitutions are read into your cloudbuild.yaml file and not your app.yaml file... but in a similar way to how you used templates here, this can be achieved in Cloud Build using the envsubst community builder, like so:
cloudbuild.yaml:
steps:
  - name: 'gcr.io/$PROJECT_ID/envsubst'
    args: ['app.yaml']
    env: ['_ENVIRONMENT=${_ENVIRONMENT}']
[...]
app.yaml:
[...]
env_variables:
  ENVIRONMENT: ${_ENVIRONMENT}
In Cloud Build I don't see an option to connect with GitLab, it's only for GitHub or Bitbucket.
For those without EJS installed, you can use a plain render function instead.
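A minimal, dependency-free stand-in might look roughly like this (a sketch, not the commenter's original snippet):

// Replaces <%= NAME %> placeholders with matching values from the supplied
// object (e.g. process.env); a very rough substitute for ejs.render.
function render(template, vars) {
  return template.replace(/<%=\s*(\w+)\s*%>/g, (_, name) => vars[name] || '');
}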
This is a great tip, thanks!
I'm not clear on something, though: what do you use to re-template the app.yaml file as your build moves through environments?
Hi Adam,
Each environment has its own set of environment variables set in the CI, so when I trigger a build for a specific environment it picks up the relevant environment variables and builds the code with them.
Let me know if that makes sense.
The thing is though, in an ideal world you shouldn't be building for each environment, which is what I was driving at - build once and move that same build through environments. I've done this on CircleCI, but I'm far from a devops expert and how to move the same build through environments using triggers escapes me. If anybody can provide any insight, that'd be super!
It is generally a good idea to build an artifact once and then move it across environments (e.g. a Docker image).
However, it is not always possible, e.g. if you use Create React App environment variables to point to the correct API endpoint (see the example below), etc.
In the case of this article, we are also limited by what the App Engine Standard Environment provides us with.
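For instance, Create React App only inlines variables prefixed with REACT_APP_ at build time, so a value like the following (the variable name here is just an example) gets baked into the bundle and cannot be changed per environment afterwards:

// Resolved when the bundle is built, not when it is served.
const apiUrl = process.env.REACT_APP_API_URL;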
I am thinking of just using Docker and Google Cloud Run, but I am not totally sure if it is worth it?
I've just recently used it on one of my projects – it is definitely an improvement over App Engine in terms of environment variable setup.
Cloud Run is a relatively new service on GCP, so some features are still not available on it, e.g. connecting it to a load balancer (or providing similar functionality).
I found this repository by @ahmetb useful when working with it: github.com/ahmetb/cloud-run-faq