
OpenAPI with Terraform on AWS API Gateway

Rolf Streefkerk ・10 min read

TL;DR

If you just want to dig into the source code, look here on Github. The README.md contains instructions on how to run it. See this section for a brief breakdown of the solution.

If you're interested in how I deployed this solution with Terraform, go here.

If you're interested in the testing aspects, go here.

Thanks for reading and till next week!


Last week we discussed the whys of OpenAPI and its tooling support. Today, I'll demonstrate the how, answering the questions posed in each of the sections below.

The key principles here are;

  • Automation: we want to codify as much as possible; any manual interaction introduces possible errors due to inconsistency.
  • Single source of truth: the fewer documents we need to maintain, the more likely the information is current and consistent.

A brief introduction to the code base

If you'd like to follow along with the code, please look here on Github. The README.md contains instructions on how to run it.

The structure is as follows:

  • ./env/dev is where the environment specific Terraform code resides. This is for the development environment. Similarly, you would duplicate these files for test and production.
    • This way you can store the state files in their own AWS accounts specific to the environment, and you can create different configurations for each environment (e.g. in the dev.tfvars file).
  • ./modules contains all the base Terraform modules that are composed in the ./services folder
  • ./node contains:
    • All the AWS Lambda handler/lib Node.js 12 code in ./node/src
    • Node command-line programs in ./node/cmd
    • The JSON Schema file generated from the OpenAPI specification in ./node/schema
  • ./services contains Terraform files that specify the services this solution is composed of, using the base AWS Cloud services in ./modules.
    • The OpenAPI document is located in ./services/api, and that is the base document for AWS API Gateway and the JSON Schema file discussed earlier.
  • ./gulpfile.js contains the code to create the Zip file artifacts from the ./node/src codebase and deploy them with Terraform.

What is the OpenAPI specification?

Let's get right into it with the OpenAPI document structure before I explain the individual parts in more detail. Or skip to the next subsection.

OpenAPI documents will have at least these major parts (top of the hierarchy):

  • openapi, indicates the version,
  • info, describes general information about the owner and license agreements (ToS),
  • servers, where to connect to,
  • paths, the available endpoints,
  • components, specific models (JSON Schema) that apply to the endpoints and security definitions.
  • tags, used to create a human readable structure (you can group endpoints together with the same tag) out of the document with descriptions.
  • There's also a concept of extensions (e.g. custom x- parameters) in the specification, which allows anyone to extend the official OpenAPI document without breaking its validation. We'll see examples of that shortly; more on this here.

Now that we have a general sense of the document layout, let's look at how we create paths (endpoints).

How to create a REST API endpoint?

Let's consider the following endpoint:

POST /identity/authenticate

paths:
  /identity/authenticate:
    post:
      operationId: identityAuthenticate
      description: Authenticate user (either login, or continue session)
      requestBody:
        required: true
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/Authenticate'
      x-amazon-apigateway-integration:
        uri: "arn:aws:apigateway:${region}:lambda:path/2015-03-31/functions/${lambda_identity_arn}/invocations"
        passthroughBehavior: "when_no_match"
        httpMethod: "POST"
        timeoutInMillis: ${lambda_identity_timeout}
        type: "aws_proxy"
      responses:
        200:
          $ref: '#/components/responses/lambda'
        500:
          $ref: '#/components/responses/500APIError'
      tags:
        - Identity

There's a lot to unpack here, so let's start from the top:

  • operationId is used to differentiate between the API operations. This name has to be unique within the document. In this code base it's used in AWS Lambda to detect which operation is being executed. More on this when I explain the custom AWS x- parameter.
  • requestBody specifies which POST object it expects and if it's required.
    • The $ref refers to the JSON Schema further down this document. This keyword allows you to reference other parts of the specification without duplicating the same content.
  • x-amazon-apigateway-integration is a custom AWS parameter that is used to define the integration with, in this case, AWS Lambda.
    • This parameter is required for each operation.
    • It specifies which AWS Lambda function it's integrated with via the ${lambda_identity_arn} parameter that is set by the Terraform scripting.
    • It specifies the maximum timeout of the integration with ${lambda_identity_timeout} parameter.

Let's say that we want to execute this API from another domain, how do we do that?

What about CORS?

This is a typical nuisance with REST APIs: how to get cross-origin requests to work.

With OpenAPI and AWS API Gateway this is relatively simple; there are two steps:

  • First, add an options method to the endpoint paths that you want to execute cross-domain:
    options:
      responses:
        200:
          $ref: '#/components/responses/cors'
        500:
          $ref: '#/components/responses/cors'
      x-amazon-apigateway-integration:
        responses:
          default:
            statusCode: "200"
            responseParameters:
              method.response.header.Access-Control-Max-Age: "'7200'"
              method.response.header.Access-Control-Allow-Methods: "'OPTIONS,HEAD,GET,POST,PUT,PATCH,DELETE'"
              method.response.header.Access-Control-Allow-Headers: "'Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token'"
              method.response.header.Access-Control-Allow-Origin: "'*'"
        passthroughBehavior: "when_no_match"
        timeoutInMillis: 29000
        requestTemplates:
          application/json: "{ \"statusCode\": 200 }"
        type: "mock"
  • Second, in the AWS Lambda handler code we need to make sure the response contains the following header:
{
  isBase64Encoded: false,
  statusCode: 200,
  headers: {
    'Access-Control-Allow-Origin': '*'
  },
  body: JSON.stringify(message)
}

Now we should be able to create an endpoint that is cross origin executable. How does the validation and specification of input and output work?

Component schema specification

The component section of the specification is all about the data: what comes in and what goes out.

There are three variants to specify:

  • parameters: These are used in GET requests, for instance; they appear as query parameters in the URL, e.g. /user?userID=2
  parameters:
    userID:
      description: User identifier
      in: query
      name: userID
      schema:
        type: string
      required: true
  • schemas: Used in POST requests, for example, and often referenced in an API response. The required, type, and pattern definitions can all be used to validate your API input and output.
    • More on this later when we cover JSON Schema validation
  schemas:
    Register:
      title: Register
      type: object
      description: Registration form
      required:
        - email
        - password
        - username
        - firstName
        - lastName
      properties:
        email:
          type: string
          example: "user@business.com"
          pattern: "^[_A-Za-z0-9-\\+]+(\\.[_A-Za-z0-9-]+)*@[A-Za-z0-9-]+(\\.[A-Za-z0-9]+)*(\\.[A-Za-z]{2,})$"
        password:
          type: string
          example: "123$SFF22l"
        username:
          type: string
          example: "testUsername"
        firstName:
          type: string
          example: "Rolf"
        lastName:
          type: string
          example: "Streefkerk"
  • responses: Below you can see the schema reference, or, if we want, we can explicitly define the JSON Schema here like in schemas.
  responses:
    user:
      description: User
      content:
        application/json:
          schema:
            $ref: '#/components/schemas/User'

How do we take care of secured endpoints?

All of what we have seen so far can be accessed by the public at large; how do we secure an endpoint with authentication?

In this example I've used AWS Cognito as the authentication service and it integrates really well with API Gateway.

Specify the following details:

  • Name the security definition, e.g. example-CognitoUserPoolAuthorizer
  • Auth type is cognito_user_pools
  • Authorizer is the actual AWS Cognito instance identified by its unique ARN (Amazon Resource Name), which is supplied by Terraform at deploy time.
securitySchemes:
  example-CognitoUserPoolAuthorizer:
    type: "apiKey"
    name: "Authorization"
    in: "header"
    x-amazon-apigateway-authtype: "cognito_user_pools"
    x-amazon-apigateway-authorizer:
      providerARNs:
      - "${cognito_user_pool_arn}"
      type: "cognito_user_pools"

Now, to enable a secured endpoint, you will have to add the security parameter as follows:

/user:
  get:
    operationId: getUser
    description: get User details by ID
    parameters:
      - $ref: '#/components/parameters/userID'
    security:
    - example-CognitoUserPoolAuthorizer: []
    ...

How to deploy OpenAPI endpoints with Terraform to AWS?

If you're new to Terraform please stay tuned for a more in-depth Terraform how-to.

We briefly discussed the directory layout in the introduction. A good structure is crucial with Terraform to ensure there's as little code duplication as possible, and the right information is stored in the right places.

My preferred structure is this:

  • env: code and state specific to the environment we deploy to.
    • We do not want to mix state from production into a development account, for instance.
    • We definitely do not want to commit state to Git. Use encrypted shared storage such as AWS S3.
  • modules: to reuse code, create modules for each of the major services used, and preferably version these so that different projects can run different versions and upgrade independently. For this demo, they're just included in the same repo.
  • services: here the infrastructure is composed with modules, and we store the OpenAPI definition file here as part of our solution.
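For the encrypted shared state storage mentioned above, a minimal sketch of an S3 backend block in the env directory could look like this (the bucket and key names are hypothetical; adjust them to your own accounts):

```hcl
terraform {
  backend "s3" {
    # Hypothetical bucket name; one state bucket per environment/account
    bucket  = "example-terraform-state-dev"
    key     = "openapi-tf-example/terraform.tfstate"
    region  = "us-east-1"
    encrypt = true # server-side encryption for the state file
  }
}
```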

The API Gateway module

To build the OpenAPI integration, we need to feed the document into an aws_api_gateway_rest_api resource. This is done in the apigateway module as follows (an excerpt):

data "template_file" "_" {
  template = var.api_template

  vars = var.api_template_vars
}

resource "aws_api_gateway_rest_api" "_" {
  name           = "${local.resource_name_prefix}-${var.api_name}"
  api_key_source = "HEADER"

  body = data.template_file._.rendered
}

When this module is called in the ./env/dev/main.tf Terraform file, we need to specify the template variables that we've already seen in the OpenAPI document (e.g. ${cognito_user_pool_arn}):

module "apigateway" {
  source            = "../../modules/apigateway"
  resource_tag_name = var.resource_tag_name
  namespace         = var.namespace
  region            = var.region

  api_name                   = local.api_name
  api_throttling_rate_limit  = var.api_throttling_rate_limit
  api_throttling_burst_limit = var.api_throttling_burst_limit
  api_template               = file("../../services/api/${local.api_name}.yml")
  api_template_vars = {
    region = var.region

    cognito_user_pool_arn = module.cognito.cognito_user_pool_arn

    lambda_identity_arn     = module.identity.lambda_arn
    lambda_identity_timeout = var.lambda_identity_api_timeout

    lambda_user_arn     = module.user.lambda_arn
    lambda_user_timeout = var.lambda_user_api_timeout
  }

  lambda_zip_name = local.lambda_zip_name
  dist_file_path  = local.dist_file_path
}

The Services integration

You'll notice that there is mention of module.identity.lambda_arn and module.user.lambda_arn. These are the two services that have been integrated with the Identity and User endpoints.

When GET /user is called, the AWS Lambda defined in the module module.user will be executed.

Furthermore, for any access a Lambda function requires to an AWS service such as Cognito, appropriate permissions need to be set. These are defined in their own JSON files. E.g. for the User service, they're located in ./services/user/policies/lambda.json
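As a hedged illustration (the exact statements in the repo's lambda.json may differ), a policy granting a Lambda function access to a Cognito user pool could look roughly like this. The ${cognito_user_pool_arn} placeholder follows the same Terraform templating convention used elsewhere in this solution:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCognitoUserAccess",
      "Effect": "Allow",
      "Action": [
        "cognito-idp:AdminGetUser",
        "cognito-idp:AdminUpdateUserAttributes"
      ],
      "Resource": "${cognito_user_pool_arn}"
    }
  ]
}
```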

The AWS Lambda handler function

Now that the Terraform side has been set up, it still needs executable code. Handlers are the AWS Lambda entry points in our Node code. They look like this (see ./node/src/handlers):

const middy = require('middy')
const { httpErrorHandler, httpSecurityHeaders } = require('middy/middlewares')
const { ErrorResponse } = require('../../lib/response')
const standards = require('../../lib/standards')
const getUser = require('./operations/getUser')

const handler = middy(async (event, context) => {
  const params = standards.getParams(event)
  const operation = standards.getOperationName(event)

  switch (operation) {
    case 'getUser': {
      return getUser.handler(params, operation)
    }

    default: {
      console.error('Unsupported operationName: ' + operation)
      return new ErrorResponse('Unsupported operationName: ' + operation)
    }
  }
}).use(httpErrorHandler()).use(httpSecurityHeaders())

module.exports = { handler }
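How does standards.getOperationName resolve the operation? A minimal sketch (an assumption on my part; the repo's lib may resolve it differently) is to map the httpMethod and resource path from the API Gateway proxy event back to the operationId declared in the OpenAPI document:

```javascript
// Hypothetical sketch: resolve an OpenAPI operationId from an
// API Gateway proxy event (HTTP method + resource path).
const operationMap = {
  'POST /identity/authenticate': 'identityAuthenticate',
  'GET /user': 'getUser'
}

function getOperationName (event) {
  // API Gateway's Lambda proxy event carries both fields
  const key = `${event.httpMethod} ${event.resource}`
  return operationMap[key] || null
}

// Example proxy event, trimmed to the fields we use
const event = { httpMethod: 'GET', resource: '/user' }
console.log(getOperationName(event)) // 'getUser'
```

The switch statement in the handler above then dispatches on the returned name.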

This is where we can use the JSON schema to validate our REST endpoint input data.

The source file is (re)created by running this node script:

npm run convert

This will take the OpenAPI yml file and convert it to a JSON Schema v4 compatible file.
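A hedged sketch of what that conversion produces (assuming the lookup shape used by the handler code below; the actual convert script may differ): walk the parsed OpenAPI paths and collect each operation's requestBody schema into a per-path, per-method map. $ref resolution is omitted for brevity:

```javascript
// Hypothetical sketch: build a { '/path': { post: { body: <schema> } } }
// lookup from a parsed OpenAPI document object. $ref resolution and
// response schemas are left out for brevity.
function buildSchemaMap (openapi) {
  const map = {}
  for (const [path, methods] of Object.entries(openapi.paths || {})) {
    map[path] = {}
    for (const [method, op] of Object.entries(methods)) {
      const body = op.requestBody &&
        op.requestBody.content &&
        op.requestBody.content['application/json'] &&
        op.requestBody.content['application/json'].schema
      if (body) map[path][method] = { body }
    }
  }
  return map
}
```

With a document containing the POST /identity/authenticate path shown earlier, the result would expose map['/identity/authenticate'].post.body, matching how the schema file is consumed below.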

In turn, this is used by the library ./node/src/lib/jsonSchema and initialized in a handler function as follows:

const JsonSchema = require('../../../lib/jsonSchema') // our library
const schema = require('../../../schema/example.json') // our JSON Schema
const input = new JsonSchema(schema['/identity/register'].post.body) //the Path in that JSON Schema we want to validate against

const results = input.validateInput(params) // validate the input

if (results.valid) {
  return new SuccessResponse('Success, user with email address: ' + params.email + ' registered successfully')
}
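Under the hood, validateInput checks the input against the schema's required, type, and pattern keywords. A minimal hand-rolled sketch of that idea (the repo's ./node/src/lib/jsonSchema very likely delegates to a full JSON Schema library instead; the schema and function here are illustrative only):

```javascript
// Hypothetical sketch: validate an object against a tiny subset of
// JSON Schema (required fields, per-property type and pattern).
function validateInput (schema, params) {
  const errors = []
  for (const field of schema.required || []) {
    if (params[field] === undefined) errors.push(`${field} is required`)
  }
  for (const [field, rules] of Object.entries(schema.properties || {})) {
    const value = params[field]
    if (value === undefined) continue
    if (rules.type && typeof value !== rules.type) {
      errors.push(`${field} should be of type ${rules.type}`)
    }
    if (rules.pattern && !new RegExp(rules.pattern).test(value)) {
      errors.push(`${field} should match pattern ${rules.pattern}`)
    }
  }
  return { valid: errors.length === 0, errors }
}

// Simplified email schema for illustration
const schema = {
  required: ['email'],
  properties: { email: { type: 'string', pattern: '^[^@]+@[^@]+$' } }
}
console.log(validateInput(schema, { email: 'user@business.com' }).valid) // true
```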

How to set up contract testing with OpenAPI?

We have a working OpenAPI specification and we've deployed the solution with Terraform. Now we can also generate the basis of our contract test suite with Postman.

Here's how to do that:

  • Open Postman, go to the APIs tab on the left side of the application, and hit +New API.
  • Add Schema > Import file, navigate to \openapi-tf-example\services\api\example.yml.
  • In the imported document, replace the servers URL ( servers: - url: http://example.com/ ) with the api_url value that's output at the command line when deployment of the Example REST API has finished.
    • Recommended: create a domain name, link it to your REST API, and have Terraform automatically fill it into the OpenAPI yml file.
  • Click on Generate Collection, give it a name (e.g. Example) and then add to Contract Test.
  • When you navigate to your Collections (left hand side) you should see it listed there.
  • Go to your Example collection, open identity > register > POST identity register, and hit Send.
  • You should see error messages coming in at the bottom panel:
    • e.g. "message": "should match pattern \"^[_A-Za-z0-9-\+]+(\.[_A-Za-z0-9-]+)@[A-Za-z0-9-]+(\.[A-Za-z0-9]+)(\.[A-Za-z]{2,})$\""
  • If you get this message, that confirms the API is working correctly, and you can design your contract tests with documentation under the Test tab in the APIs tab.

You can use this as the base to create test cases. Look at this article, for instance, which discusses in more detail how you can create test cases and run them from the Postman CLI with Newman.

What's next?


We covered a lot of ground in this article, let me know in the comments if it was useful and what you'd like to see more of.

Any questions or ideas? Shoot!

Thanks for reading and till next week!

