Originally published at https://www.serverless.com on February 22nd, 2017
My last post showed you how to use Mocha to automate endpoint testing for a multi-method service created and deployed using the Serverless Framework. That proved Test-Driven Development (TDD) can be practiced in the serverless world from the command line on a developer’s local machine. But what if you have a team of developers who are constantly merging branches back into master, and you want to set up automated testing and deployment using a Continuous Integration/Continuous Deployment (CICD) toolchain? Keep reading and you’ll find out!
Here we’re still using the same Todo list example the folks at the Serverless Framework created as our codebase, but with some variations so that it more cleanly supports automated testing and the CICD toolchain used here: AWS CodePipeline.
Code Differences From The Original Todo
In Part 1, I neglected to get into the details of what I had to change in the original Todo codebase to make it function more cleanly for automated testing. Let’s explore that here.
First, two of the five methods in our service perform writes: specifically create.js and update.js. The issue with automating the testing, especially for create, is that the original version wasn’t returning the UUID for the newly created Todo. That meant that to verify the write occurred correctly, testing code would have to list all the Todos and scan for matching content.
The first change, then, is to return the entire JSON of the newly created Todo. For clarity, I kept the old code commented out, so the new lines 38–44 look like this:
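The repo has the exact diff; as a rough sketch (the surrounding handler shape follows the standard Serverless Todo example, so details here may differ from the actual file), the end of create.js now does something like this:

```javascript
dynamoDb.put(params, (error) => {
  if (error) {
    console.error(error);
    callback(new Error("Couldn't create the todo item."));
    return;
  }

  // Old version: returned a bare success response, so the caller
  // never saw the generated UUID.
  // callback(null, { statusCode: 200 });

  // New version: echo back the full item, UUID included, so the
  // test code can verify the write directly.
  const response = {
    statusCode: 200,
    body: JSON.stringify(params.Item),
  };
  callback(null, response);
});
```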
For consistency’s sake, the same was done for update.js.
Next, the original code hard-coded the DynamoDB table name in every method handler, and again in serverless.yml where that table is created. Within a single AWS account, it’s possible to deploy multiple versions of your service from different branches: one is your working copy that you run via the local execution method, while the other is built from the master branch by AWS CodePipeline. You just need to be a little more creative with the table naming mechanic.
In the method handlers, in the constant set up to pass parameters to DynamoDB, you’ll see a change similar to this one found in the create handler:
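Roughly like this (the Item fields follow the standard Todo example and are illustrative):

```javascript
const params = {
  // Previously hard-coded, e.g. TableName: 'todos'. Now resolved
  // from the environment so every stage gets its own table.
  TableName: process.env.TABLE_NAME,
  Item: {
    id: uuid.v1(),
    text: data.text,
    checked: false,
    createdAt: timestamp,
    updatedAt: timestamp,
  },
};
```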
So now, the database table name gets pulled from the TABLE_NAME environment variable, which is getting set in the serverless.yml file based on the stage defined for the deployment:
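A sketch of the relevant serverless.yml sections (the service and resource names here are illustrative; the repo has the real file):

```yaml
provider:
  name: aws
  runtime: nodejs4.3
  environment:
    # Stage-qualified table name: "dev" and "cicd" deployments of
    # the same service get separate DynamoDB tables.
    TABLE_NAME: ${self:service}-${opt:stage, self:provider.stage}

resources:
  Resources:
    TodosDynamoDbTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: ${self:provider.environment.TABLE_NAME}
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        KeySchema:
          - AttributeName: id
            KeyType: HASH
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
```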
I’m really liking the relatively new serverless.yml syntax that lets a single evaluation fall back across multiple variable references (the ${opt:stage, self:provider.stage} above), BTW.
Creating the CodePipeline and Explaining the AWS CodeBuild buildspec.yml file
I chose to use AWS CodePipeline together with AWS CodeBuild, which was newly announced at AWS re:Invent in December. The CodePipeline Execution readme in my repo describes how you can set that up step by step. Future versions will automate this setup, but the toolchain is new enough, and the OAuth integration with GitHub wasn’t straightforward to script, so for now I’ve got a lot of screenshots for a manual process instead.
At the center of the automation is AWS CodeBuild and its buildspec.yml file. In our example, that file looks like this:
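A sketch of that file’s shape (the exact commands and the name of the post_build test script are assumptions here; the repo has the authoritative version, and buildspec version 0.1 was the current format at the time):

```yaml
version: 0.1
phases:
  install:
    commands:
      # Install the Serverless Framework CLI plus the service's
      # own npm dependencies.
      - npm install -g serverless
      - npm install
  build:
    commands:
      # Deploy to a dedicated "cicd" stage and capture the output,
      # which includes the generated endpoint URLs.
      - serverless deploy --stage cicd | tee deploy.out
  post_build:
    commands:
      # Hypothetical script name: recovers the endpoint from
      # deploy.out and runs the Mocha tests against it.
      - bash ./run-endpoint-tests.sh deploy.out
```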
Here we’ve defined three of the standard phases that CodeBuild supports: install, build, and post_build. If you followed the local execution steps from last time, the commands in each phase should look familiar. The various dependencies are set up during install.
During build, the Serverless Framework command line deploys our service with a stage called “cicd”, which shouldn’t clash with the default “dev” stage most likely used during local execution. The results are piped to deploy.out so that the endpoint name can be picked up by the post_build testing script, which then runs the same Mocha tests as before.
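That pick-up step might look something like the following (purely illustrative: the file name, regex, and environment variable are assumptions, not the repo’s actual script):

```javascript
// Recover the deployed endpoint from the captured "serverless deploy"
// output and expose it to the Mocha tests via an environment variable.
const fs = require('fs');

const deployOut = fs.readFileSync('deploy.out', 'utf8');
const match = deployOut.match(/https:\/\/[\w.-]+\/cicd\/todos/);
if (!match) {
  throw new Error('No endpoint URL found in deploy.out');
}
process.env.TODOS_ENDPOINT = match[0];
```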
Results and Gotchas
CodeBuild provides excellent, detailed logging via CloudWatch, although it takes a couple of clicks to get there. The most likely causes of failure involve CloudFormation failing for one reason or another. I found that when that happens, an unfortunate side effect is that you have to manually delete the CloudFormation stack, and possibly the DynamoDB table, before the next run.
Once over that hump, though, you can simply check changes into the branch you associated with your CodePipeline and all the automation kicks in to test and deploy your service!