It is no longer a mystery: APIs are eating the world. Many companies today offer their APIs as the primary medium to interact with a system, with the user interface merely a byproduct or, in any case, not the primary product being sold. Companies such as Stripe and Twilio led the vanguard of this movement in 2012.
Given this shift (which, by the way, is still happening), APIs have started to get more and more complicated, and so the API development process needed to evolve as well. APIs have become a product, therefore all the typical team members and methodologies have been employed to make that product successful.
The increased relevance of API-based products requires new tooling to support the people and methodologies implementing them. How would you design an API so that you can review and iterate on the design before building it for real? When designing a mobile application, you have tools to easily create a mockup and make sure everybody in your organization is on the same page before committing resources to building the real app.
For years, this wasn’t a thing for APIs. Until the OpenAPI Specification (and a bunch of other standards) came out.
In this article, we’ll review what OpenAPI can offer from a security standpoint, how you can embrace these features today, and showcase some companies that are doing really cool stuff in this space!
OpenAPI — formerly known as Swagger — started out in 2010 as a simple, open source specification for designing RESTful APIs and, despite other API specification formats that came out during the following years (such as RAML and API Blueprint), the Swagger project became the most popular one.
In 2015, the Swagger project was donated to the Linux Foundation and renamed to OpenAPI Specification, with Microsoft and IBM joining the foundation to help move the format forward. Their first release was OpenAPI 2.0, which is nothing more than the original Swagger format rebranded; a couple of years later, OpenAPI 3.0 was released with some important updates.
Today the OpenAPI initiative includes more than 10 companies that recognize the high value and importance of joining efforts to produce a standardized document to describe an API.
OpenAPI 2.0 is nothing more than the good old Swagger format, rebranded after the donation to the Linux Foundation.
OpenAPI 2.0 documents can be written as either a JSON or a YAML file, and they allow you to define what your exposed API looks like in terms of endpoints, accepted and returned payloads, media types, returned status codes, and servers where the API can be reached.
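As a point of reference, here is a minimal, purely illustrative OpenAPI 2.0 document covering a single endpoint (the host and path are made up for the example):

```yaml
swagger: "2.0"
info:
  title: Petstore (trimmed)
  version: "1.0.0"
host: api.example.com
basePath: /v1
paths:
  /pets:
    get:
      produces:
        - application/json
      responses:
        "200":
          description: A list of pets
```

Everything in the rest of this article plugs into a skeleton like this one.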
We aren’t going to look at all these parts here. Instead, we will focus on the security features that the specification offers.
OpenAPI 2.0 offers a dedicated section to declare the security features and requirements of your API and then use these where appropriate in your paths and operations.
```yaml
api_key:
  type: apiKey
  name: api_key
  in: header
petstore_auth:
  type: oauth2
  authorizationUrl: http://swagger.io/api/oauth/dialog
  flow: implicit
  scopes:
    write:pets: modify pets in your account
    read:pets: read your pets
```
In this example we’ve declared two security definitions: the first looks for an API key in the specified header (api_key in this case), while the second declares an implicit OAuth2 flow requiring some particular scopes.
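Once declared (under the top-level securityDefinitions key), these definitions are referenced by name from your paths and operations through a security requirement list. A minimal sketch — the /pets path and the scope assignments are illustrative, borrowed from the PetStore example:

```yaml
paths:
  /pets:
    get:
      # Callers need a token granting the read:pets scope
      security:
        - petstore_auth:
            - read:pets
    post:
      # An API key is enough here; apiKey schemes take no scopes
      security:
        - api_key: []
```

Note that multiple schemes inside the same array element must all be satisfied together, while separate array elements are alternatives.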
OpenAPI 2.0 supports another security definition type called basic, which is essentially plain old HTTP Basic Authentication.
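A basic definition carries no extra parameters; here is a sketch, with basic_auth as an illustrative name (it sits under securityDefinitions alongside the others):

```yaml
basic_auth:
  type: basic
```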
OpenAPI 2.0 has no other built-in security definitions, and it’s not possible to define custom ones without using vendor extensions. While these three options cover a good chunk of real-world use cases, they might not be enough for special cases.
OpenAPI 3.0 was released in 2017 with the intention of addressing some of the drawbacks and limitations version 2.0 was suffering from.
We are not going to cover the OpenAPI 3.0 Specification in full or highlight all the differences between the two formats: there are plenty of resources on the internet detailing the changes. Instead, we are going to focus on the security changes introduced in this new version of the specification.
In the same way OpenAPI 2.0 has a dedicated part of the document to declare security definitions, OpenAPI 3.0 has one too. The difference is that OpenAPI 3.0 has changed the terminology to "security schemes." The spec also standardized the way to declare all the parts that can be reused across multiple paths: while previously sharing components was something left to the common sense of developers, now they are all grouped under the components key.
Moreover, OpenID Connect support has been added, as well as the ability for a security definition to include multiple OAuth2 flows (a common requirement today).
To give an idea of how the security declarations have changed from OAS 2.0 to OAS 3.0, here is the same example we saw before, written for OAS 3.0:
```yaml
securitySchemes:
  api_key:
    type: apiKey
    name: api_key
    in: header
  petstore_auth:
    type: oauth2
    flows:
      implicit:
        authorizationUrl: http://swagger.io/api/oauth/dialog
        scopes:
          write:pets: modify pets in your account
          read:pets: read your pets
      authorizationCode:
        authorizationUrl: http://swagger.io/api/oauth/dialog
        tokenUrl: https://swagger.io/api/oauth/token
        scopes:
          read:pets: Grants read access
          write:pets: Grants write access
```
You can see that the OAuth2 definition now supports multiple flows, each declared under the new flows key in the security scheme.
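The new openIdConnect type mentioned earlier simply points at a discovery document; here is a minimal sketch, where the URL is a placeholder for your own provider's tenant:

```yaml
securitySchemes:
  oidc:
    type: openIdConnect
    openIdConnectUrl: https://YOUR_TENANT.auth0.com/.well-known/openid-configuration
```

All scopes and endpoints are then resolved from the discovery document rather than declared inline.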
Let’s now go hands-on and try to create a document representing an API that’ll be exposed on the public internet. We will go through the design phase — as well as the implementation code and contract testing for it.
As most of the tooling around OpenAPI still targets the 2.0 version, that is the version we are going to use. However, all the concepts we show here apply to 3.0 as well.
OpenAPI specification documents are nothing more than YAML or JSON files. Although this effectively lowers the barrier for new folks willing to write docs (everybody can write a JSON or YAML file), the format's simplicity hides the complexity of the specification itself. This is where a visual designer or other tools can really be handy.
There are plenty of these on the market, namely Stoplight, Apiary, and SwaggerHub. Which one to choose is really up to you and your needs. In any case, all of them offer a free plan, so you do not have to pay to get started.
We are going to write a simple API that is a trimmed-down version of the well-known PetStore example. This API is part of the official examples for both OAS 2.0 and 3.0. I’ve chosen it for the simplicity of its content.
Writing the code, in the world of APIs, is really the most trivial part. Building an API is more conceptual work and a collaboration effort than committing code to your repository.
For this reason, we will stick to a very basic example, whose code is hosted on Glitch. For those who do not know it, Glitch is a free hosted service for NodeJS applications. It’s perfect for prototyping and shipping small applications in a very fast way.
Now what we really want to do is keep checking, as we develop the server, that the implementation stays in sync with the specification document in our repository. This is where contract testing tools come into play: they spin up an instance of your server and, going through the OpenAPI document, send HTTP requests to it, checking the status codes as well as the returned payloads. If any of these do not match, the CI/CD step fails and the API is not deployed. You can also see this as TDD, where initially you have just an OpenAPI document and all tests fail because you do not have any code yet.
As you develop the API and add more and more endpoints, you will see more parts of the tests pass until you’re on green (which means you have covered the entire API surface declared in the OpenAPI document).
The setup for these tools is usually straightforward and does not require any particular effort.
The first thing we’re going to do is to download the Prism server on the machine:
curl -L https://github.com/stoplightio/prism/releases/download/v2.0.16/prism_linux_amd64 -o prism && chmod +x ./prism
Once done, we need to start our application server locally and then run Prism providing the listening URL to send the requests to and the OpenAPI document to use as a source of truth:
npm start
prism conduct ./sl/tests.scenarios.yml -e host=http://localhost:$PORT
Prism will now read the OpenAPI specification files, go through all the paths, craft HTTP requests following the provided examples (or creating payloads using the JSON Schema as a reference), and send them to your application server. Once the response comes back, Prism verifies the status code and that the response's shape matches the declared one. If any of these do not match, it will report an error and mark the test as a failure.
Note: these commands should be suitable in a CI/CD environment as well.
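For instance, a Travis-style configuration could wire the same steps together. This is a sketch under the assumption that your app boots with npm start and that PORT is set in the environment:

```yaml
language: node_js
node_js:
  - "8"
before_script:
  - curl -L https://github.com/stoplightio/prism/releases/download/v2.0.16/prism_linux_amd64 -o prism
  - chmod +x ./prism
script:
  # Boot the app in the background, give it a moment, then run the contract tests
  - npm start &
  - sleep 5
  - ./prism conduct ./sl/tests.scenarios.yml -e host=http://localhost:$PORT
```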
Now that we have an API whose design specification matches the implementation, it’s time to deploy it. Most likely we want to protect the API, and Auth0 is clearly one of the choices.
Not all the flows are currently supported by Auth0. Here’s a breakdown of the situation currently:
oAuth2: all the flows are supported, and this is the recommended way to deal with authentication/authorization in general. It is also possible to emit JSON-based tokens instead of opaque ones, enabling other scenarios (such as passing the token around web apps)
openIdConnect: supported by Auth0 with all the standard claims
basic/http: supported as a particular type of OAuth 2.0 flow. Any client that has a client password may send it using the HTTP Basic Authentication scheme. To be fair, this is more a way to send the client password than full support for the scheme, as you won’t get any prompt in your browser while trying to hit the target path
apiKey: not supported. This is not formalized in any standard per se, but it is a common way to send a preshared credential on the wire. The nearest authentication method resembling an API key is a Client Credentials grant request.
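In OAS 2.0 terms, such a Client Credentials grant is declared as an oauth2 definition with the application flow. Here is a sketch, where the definition name is illustrative and the token URL is a placeholder for your Auth0 tenant:

```yaml
petstore_m2m:
  type: oauth2
  flow: application  # "application" is OAS 2.0's name for Client Credentials
  tokenUrl: https://YOUR_TENANT.auth0.com/oauth/token
  scopes:
    read:pets: read your pets
```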
You can see that, as long as you’re using OAuth 2.0 or OpenID Connect, there’s nothing to worry about. You might get into trouble if you want to stick with some "older" methods, which can only be emulated to a certain degree.
At this stage, we have a backend server implementing our API and an Identity Provider (in this case Auth0) that is storing all our users and providing the necessary infrastructure to authenticate and authorize them in our platform. We are missing the glue between the two pieces: how can a user willing to authenticate in my application get redirected to Auth0’s services?
You can do that in your application code, but then it’s no longer declarative, and a change in the security requirements of your OpenAPI document will require a change in your code too.
It is possible to automate this stage with the help of an API Gateway. Given that this piece of software is usually configured in a declarative manner, and that most of the time your OpenAPI document matches the exposed API, there’s almost a perfect match between the spec and the API Gateway configuration. Unfortunately, as of today, none of the gateways on the market has reached such a level of integration.
For people interested in this topic, I gave a presentation last December whose video is published online.
We have gone through the security features that are part of the OpenAPI 2.0 and 3.0 specifications and how they help API users and developers set clear expectations when consuming and building an API. Then we went a little into the API lifecycle and, as you probably noticed, it’s more about communication, being on the same page, and deciding what you want to expose than about simply writing the code; that’s just a single step in the whole process. In the end, we explored the opportunities of a runtime integration with an Identity Provider, which is not there yet.