Steve Frank

Migrate from Heroku to Azure

  • The overall goal was to get our staging site running 100% in Azure. This will all work for production, but you'll want to first do the work for a dev/qa/staging environment, and then use something like terraform to deploy production.
  • I went into this with zero experience in Azure. Please leave a comment for anything I should fix.
  • I'm writing this blog a week after I got things working, and I didn't take notes, so some parts are light on details.
  • A few days into the work, I truly understood the simplicity I get from Heroku, and the extra $ you spend there is well worth it if you don't have a dedicated DevOps team.

Heroku

Here's our setup in Heroku:

  • Nodejs v14
    • Dynos for Web Requests
    • Dynos for background workers
  • Postgres + Redis
  • VueJS frontend
  • Custom domain, taking advantage of Heroku managing certificates
  • CodeshipCI for builds
  • Heroku Scheduler for cron

Example Procfile:

```
release: node_modules/.bin/sequelize db:migrate && node_modules/.bin/sequelize db:seed:all
web: node --optimize_for_size --max_old_space_size=1048576 index.js
worker: env WEB_MEMORY=1024 WEB_CONCURRENCY=2 node --optimize_for_size --max_old_space_size=1048576 worker.js
worker-pm: env WEB_MEMORY=2048 WEB_CONCURRENCY=2 node --optimize_for_size --max_old_space_size=2621440 worker.js
worker-p14gb: env WEB_MEMORY=8192 WEB_CONCURRENCY=2 node --optimize_for_size --max_old_space_size=8388608 worker.js
```

Azure

  • I was given an Azure subscription to use, so I don't have instructions or experience with signup.
  • I was nudged toward using a Web App (even though at the time I had no clue what that meant), so that was the path I took.
  • In Heroku, we have 2 independently scalable systems: web and workers. I want to keep that when moving to Azure, thus each will have its own Web App.

Azure Resources

Notes:

  • Keep track of the region you start with and make sure to create all of your resources in that same region.
  • There is an Application Insights feature that I didn't look into, but I did enable it on everything I created. It's probably easier to turn on at creation time than later...

Provisioning:

  1. Create a resource group for all the infrastructure to live in
  2. Create a Web App for each service; in our case we'll have one for web and one for workers. As part of the creation, you will need to create an App Service Plan for each, which is basically the equivalent of the dyno type from Heroku. Choose one similar to what you had. (You can change the size and type later if you get this wrong.)
    • Just like with Heroku, you'll be able to tail logs and ssh in to help with debugging and monitoring.
    • I've only used the portal to do this; I have not tried the Azure CLI yet (there's a rough CLI sketch after this list if you want to script it). You'll see "SSH" and "Stream Logs" in the left menu of the selected Web App.
    • Under Settings / Configuration, add a new Application Setting: OPENSSL_CONF = /etc/ssl/ to deal with OpenSSL 1.1.1d failing on the eventually provisioned Web App containers.
  3. Provision a new Azure Database for PostgreSQL, appropriately sized and the correct version (or use latest). When and how to load the data from Heroku's database is up to you. Heroku's managed postgres databases don't give access to replicate, which is why I performed a final backup and then a restore. I'm also moving over our staging site so I can be a bit more cavalier with the process:
    • Perform a backup, then use pg_restore to restore it into the new Azure instance (the commands I used are sketched after this list). If this were production, you'd want to put your app in maintenance mode before taking that backup.
    • On the Networking page of the new instance, you can grant your home's IP access to the db.
    • pg_restore --verbose --no-owner -h <server name>.postgres.database.azure.com -U psqladmin -d <db_name> heroku_database.dump
    • Note: If your provisioned username contains an @ like pgadmin@servername and you need to use that in a URI connection string, then encode it to pgadmin%40servername, so the resulting connection string is postgres://pgadmin%40servername:password@servername:port/dbname
  4. Provision Azure Cache for Redis
    • The Access Keys page has the connection string info
  5. Provision a Key Vault, and add your secrets
    • We'll use this for very secret stuff.
    • You can run heroku config -j --app <name> to get a JSON dump of your secrets from Heroku.
    • There are a few ways to get secrets to your Web Apps:
      • You can browse to the Web App, go to the Configuration page, and manually add or import the JSON (make sure to merge and not replace, since Azure puts some settings there).
      • Or you can have the release pipeline we'll make later pull settings from the key vault and push them into the Web App.
      • For simplicity, I went with merging them into the Web Apps by hand (though I did use the Advanced edit view that exposes the JSON so it can be copy/pasted). The secrets sketch after this list shows the CLI route as well.
      • Web Apps have a special section for storing connection strings (like for the database or redis) and for simplicity of this initial transition I chose to skip using it.
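Since I only used the portal, here's a rough Azure CLI equivalent of the provisioning above, for anyone who'd rather script it. I haven't run this exact sketch: all of the names, the region, and the SKU are placeholders, and the Node runtime string varies by CLI version (check `az webapp list-runtimes`), so treat it as a starting point rather than the real thing.

```bash
# Placeholder names, region, and SKU -- substitute your own.
az group create --name myapp-staging-rg --location eastus

# One App Service Plan + Web App per independently scalable service (web and worker).
az appservice plan create --name myapp-web-plan --resource-group myapp-staging-rg \
  --is-linux --sku P1V2
az webapp create --name myapp-staging-web --resource-group myapp-staging-rg \
  --plan myapp-web-plan --runtime "NODE|14-lts"

az appservice plan create --name myapp-worker-plan --resource-group myapp-staging-rg \
  --is-linux --sku P1V2
az webapp create --name myapp-staging-worker --resource-group myapp-staging-rg \
  --plan myapp-worker-plan --runtime "NODE|14-lts"

# The OPENSSL_CONF workaround from step 2, applied to each Web App.
az webapp config appsettings set --name myapp-staging-web --resource-group myapp-staging-rg \
  --settings OPENSSL_CONF=/etc/ssl/
```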
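And here's roughly what moving the data and secrets looked like (reconstructed from memory, so the app, vault, and file names are placeholders). The Heroku commands download the most recent backup as latest.dump; the Key Vault line is only needed if you want secrets in the vault rather than pasted straight into the Web App configuration.

```bash
# Take a fresh backup on Heroku and download it (arrives locally as latest.dump).
heroku pg:backups:capture --app my-heroku-staging-app
heroku pg:backups:download --app my-heroku-staging-app

# Restore into the new Azure Database for PostgreSQL (same command as step 3 above).
pg_restore --verbose --no-owner -h <server name>.postgres.database.azure.com \
  -U psqladmin -d <db_name> latest.dump

# Dump all Heroku config vars as JSON so nothing gets missed.
heroku config -j --app my-heroku-staging-app > heroku-config.json

# Push the truly secret values into Key Vault (dashes only -- secret names can't have underscores).
az keyvault secret set --vault-name myapp-staging-kv --name DB-PASSWORD --value '<value>'
```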

Azure DevOps - Build Pipeline

We're now going to replicate most of the magic that happens when you git push to Heroku.

  • Note: At pretty much every step going forward I will just assume you'll press the "Save" button.
  1. Switch over to Azure DevOps, https://dev.azure.com/, and create a project if you don't have one yet.
  2. Push up your existing code. You're adding a new remote in your local git repo so you can still git push heroku main as well as now git push azure main (or whatever you call the remotes).
  3. Now we need a new Build pipeline. This is all the magic that Heroku does for you in their buildpacks. Create an azure-pipelines.yml file in the root of your repo and push that up to Azure so you can create the Build pipeline from it (a stripped-down example of the file follows this list).
  • This pipeline handles a few things:
    • Caching two different npm installs (i.e. two node_modules directories) because the backend and the client side live in the same repo.
    • Installing postgres and redis as services so our unit tests have real servers to work against.
    • It handles running and publishing mocha unit tests.
    • Dealing with phantomjs nonsense: the build image already contains phantomjs, so npm install doesn't download the binary, which later leaves the Web App without phantomjs at all. This was a full day of my life wasted, so you're welcome, anyone else who stumbles on this craziness.
    • And finally, we zip everything up and publish it, which I guess is similar to Heroku generating a 250MB+ slug. I take an extremely simplistic approach and zip it all. You could exclude tests for example. ALL of this will wind up on your Web App, so if you really can't have certain data on public facing servers, this is your chance to prune before archiving.
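I'm not going to paste our real file, but a stripped-down azure-pipelines.yml along these lines should give you the shape of it. The cache keys, paths, test command, and JUnit reporter are assumptions on my part, as is the postgres/redis service-container setup, so check the Azure Pipelines docs and adjust for your repo before trusting any of it.

```yaml
# A minimal sketch, not our actual file -- adjust paths, versions, and test commands.
trigger:
  - main

resources:
  containers:
    - container: postgres
      image: postgres:12
      env:
        POSTGRES_PASSWORD: postgres
      ports:
        - 5432:5432
    - container: redis
      image: redis:5
      ports:
        - 6379:6379

jobs:
  - job: build_and_test
    pool:
      vmImage: ubuntu-latest
    # Real postgres/redis servers for the mocha tests to hit.
    services:
      postgres: postgres
      redis: redis
    steps:
      - task: NodeTool@0
        inputs:
          versionSpec: '14.x'

      # Two caches because backend and client node_modules live in the same repo.
      - task: Cache@2
        inputs:
          key: 'npm-server | "$(Agent.OS)" | package-lock.json'
          path: '$(System.DefaultWorkingDirectory)/node_modules'
      - task: Cache@2
        inputs:
          key: 'npm-client | "$(Agent.OS)" | client/package-lock.json'
          path: '$(System.DefaultWorkingDirectory)/client/node_modules'

      - script: npm ci
        displayName: Install server deps
      - script: npm ci && npm run build
        workingDirectory: client
        displayName: Install client deps and build static assets

      # Assumes your mocha run writes JUnit XML (e.g. via mocha-junit-reporter).
      - script: npm test
        displayName: Run mocha tests
      - task: PublishTestResults@2
        inputs:
          testResultsFormat: JUnit
          testResultsFiles: '**/test-results.xml'

      # (The phantomjs workaround goes here: zip the binary into bin/phantomjs.zip
      # so the Web App startup command can unzip it later.)

      # Zip everything -- prune anything you don't want on a public-facing server first.
      - task: ArchiveFiles@2
        inputs:
          rootFolderOrFile: '$(System.DefaultWorkingDirectory)'
          includeRootFolder: false
          archiveType: zip
          archiveFile: '$(Build.ArtifactStagingDirectory)/app.zip'
      - task: PublishBuildArtifacts@1
        inputs:
          PathtoPublish: '$(Build.ArtifactStagingDirectory)/app.zip'
          ArtifactName: drop
```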

Azure DevOps - Release Pipeline

Now that we have the Build generating a giant zip file containing our entire codebase + node_modules + any static files created from npm build of our client, we can now deploy it to our Web Apps.

  • I'd love to share a json file to import the pipeline, but the one it generates is like 20k lines long and contains tons of unique guids, so instructions and screenshots it is!
  • Final pipeline: (screenshot)
  1. Create a new empty Release pipeline

Artifacts

  1. Add an artifact (screenshot)

Variables

Our database migration task below needs access to our database to run migrations and seeds. This lets us bring in secrets from our vault.

  1. Switch over to Variables and then select "Variable groups" (screenshot)
  2. Click "Manage variable groups" and bring in the vars you need from the vault.
    • Key Vault secret names only allow letters, digits, and dashes (no underscores), so you'll see in my examples stuff named DB-HOSTNAME whereas the env var I actually need is DB_HOSTNAME.
  3. Click "Link variable group"
    • I didn't realize I needed to do this and lost a few hours wondering why my variables weren't accessible. I want to give some Azure PM the evil-eyes for this.
  4. Switch to "Pipeline Variables" and add what you need. Screenshot

Note: When you are done with the next few steps you can come back and scope variables to certain stages.

Database Migration Stage

  1. Create a new stage called "Database Migration" (screenshot)
  2. Click "Agent Job" and change Specification to "ubuntu-latest"
  3. Create an "Extract Files" tasks Screenshot
  4. Create a Node Installer tasks (just in case) Screenshot
  5. Create a Bash task. You'll need to tweak env vars to what your app expects Screenshot
  6. Create another Bash task (yes, you could just && with the task above instead of making a new one. I like having it be clear exactly which part broke.) Screenshot
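For reference, here's roughly what those two Bash tasks boil down to in my setup, reconstructed from memory. The extracted folder name ("app") is a placeholder for whatever destination you gave the Extract Files task, and the DB_* exports exist because the DB-HOSTNAME-style group variables can't be read as underscore env vars directly.

```bash
# Bash task 1 (inline script). $(System.DefaultWorkingDirectory) and $(DB-HOSTNAME) etc.
# are macro-expanded by the pipeline before the script runs.
cd "$(System.DefaultWorkingDirectory)/app"
export DB_HOSTNAME="$(DB-HOSTNAME)" DB_USERNAME="$(DB-USERNAME)" DB_PASSWORD="$(DB-PASSWORD)"
node_modules/.bin/sequelize db:migrate

# Bash task 2 is the same cd/exports followed by the seed command -- kept separate on
# purpose so it's obvious which half broke.
node_modules/.bin/sequelize db:seed:all
```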

Deploy Stages

We are going to deploy to web and workers in parallel.

  1. This stage's agent can run on windows-latest (screenshot)
    • Startup command
    • The image Azure uses doesn't work with phantomjs, so we need to install some packages, and unzip our cached phantomjs from the build.
    • Again, you're welcome for knowing how to install custom fonts
```
apt-get update -qq && apt-get install libfontconfig libssl-dev -yqq && cp -r /home/site/wwwroot/.fonts /root/ && unzip -oqq /home/site/wwwroot/bin/phantomjs.zip -d /home/site/wwwroot/bin && node --optimize_for_size --max_old_space_size=2621440 index.js
```
  2. Add the Azure App Service deploy task for the web Web App (screenshot). If you ever want to push the zip by hand instead, there's a CLI sketch right after this list.
  3. Now duplicate this stage for workers. You may need to tweak the js file used in the startup command.
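Not part of the pipeline itself, but worth knowing: the CLI has a zip-deploy command that does more or less what the App Service deploy task does for you, which can be handy for sanity-checking a build by hand. The resource group and app name below are placeholders.

```bash
# Manual zip deploy of the build artifact to the web Web App.
az webapp deployment source config-zip \
  --resource-group myapp-staging-rg \
  --name myapp-staging-web \
  --src app.zip
```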

Create a Release

(screenshot)

Cron

I'll give migrating the Heroku Scheduler to Azure its own section. Since I used Linux Web Apps for everything, I guess I'm not allowed to create WebJobs? Kinda lame. Anyway, I saw a few approaches, most notably installing cron on your Web App during startup. Since I have more than 1 instance running, I don't want cron running more than once. I chose to use a Function App which allows me to define a function that runs on a schedule. This turned out to be rather useful.

  1. Provision a new Function App
  2. Create a new function, e.g. DailyTrigger, with a schedule like 0 0 6 * * * (timer triggers use six-field NCRONTAB expressions, seconds first) to run daily at 6am UTC.
  3. Write some code to do what you need. I went with C# for simplicity. Triggering cron for us is simply a call to an internal endpoint. (There is code to ignore the cert because I haven't set up SSL certificates on my Web App yet.)
```csharp
// This runs inside the body of the timer-triggered function.
using (var httpClientHandler = new System.Net.Http.HttpClientHandler())
{
    // Accept any certificate for now, since the Web App doesn't have SSL set up yet.
    httpClientHandler.ServerCertificateCustomValidationCallback = (message, cert, chain, sslPolicyErrors) => {
        return true;
    };

    var webhookURL = "https://....";
    using var client = new System.Net.Http.HttpClient(httpClientHandler);
    client.DefaultRequestHeaders.Add("Authorization", "********");
    // Hit the internal endpoint that kicks off the scheduled work.
    client.GetAsync(webhookURL).Wait();
}

log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
```
