
Look Ma', No Lambda! Lambdaless APIGateway and DynamoDB integration (with CDK)

Serverless is awesome! On the other hand, it can take some time to get used to the tons of AWS services out there which can be used to build an application, and it is easy to end up with an architecture that is unnecessarily complex and costly.

Think of a simple CRUD API to manage a user.

(diagram: API Gateway + Lambda + DynamoDB)

What we normally see in thousands of tutorials is API Gateway + Lambda + DynamoDB (with the Lambda being either a single function for each endpoint/method, or a Fat Lambda / Lambdalith handling multiple endpoints/methods). This is also the approach shown in the official - and very useful - git repo AWS CDK Samples.

What if I told you

you might not even need a Lambda function?

Yes, often that Lambda function does nothing more than receive the payload and (maybe after some validation) stuff it into DynamoDB (in case of a creation or edit), or simply forward the GET or DELETE commands to DynamoDB (see again the content of the Lambda in the official sample).

If that is the case, then yes, you don't need the Lambda at all, because API Gateway comes with lots of different integrations with other AWS services (like DynamoDB or SQS) that can help us streamline our application, simplify the infrastructure and the code base, and even reduce costs!

There are some tradeoffs, but let's first look at how to create, for example, a RESTful API which puts and retrieves items to and from DynamoDB.

AWS Service Integrations

Here you can see how to set up an API Gateway with a Service Proxy Integration directly from the UI Console, which is nice to get our hands dirty and understand the steps, but we don't want to set up our infrastructure with thousands of clicks, do we? No, we want Infrastructure as Code, so let's see how to do that using AWS CDK.

First we need to create our Table, as usual.

 const dynamoTable = new Table(this, 'users', {
      partitionKey: {
        name: 'userId',
        type: AttributeType.STRING
      },
      tableName: 'users',
      removalPolicy: isProduction ? RemovalPolicy.RETAIN : RemovalPolicy.DESTROY,
      timeToLiveAttribute: 'ttl',
    });

Then, and again here nothing is different from the solution with the Lambda in place, we create the RestApi and its resources:

const api = new RestApi(this, 'CRUDUserAPI', {
            description: 'API to manage users',
            restApiName: 'usersapi',
        })
const usersResource = api.root.addResource('users')
const userResource = usersResource.addResource('{userId}')

This time, rather than creating the integration with Lambda and granting our Lambda the permissions necessary to read and write to DynamoDB (like
dynamoTable.grantReadWriteData(myLambda)
), we create a policy for our DynamoDB table and a role for API Gateway, attach that policy to that role, and then assign the role to our endpoint methods.


const dbPolicy = new Policy(this, 'dbPolicy', {
            statements: [
                new PolicyStatement({
                    actions: ['dynamodb:PutItem', 'dynamodb:GetItem', 'dynamodb:DeleteItem'],
                    effect: Effect.ALLOW,
                    resources: [dynamoTable.tableArn],
                }),
            ],
        })

    const apiRole = new Role(this, `${id}APIRole`, {
            roleName: `${id}APIRole`,
            assumedBy: new ServicePrincipal('apigateway.amazonaws.com'),
        })


        apiRole.attachInlinePolicy(dbPolicy)


Something we need to set up are the MethodOptions responses and their templates. This is especially necessary because in a Lambda we would return the body and the status code ourselves. In the case of a direct DynamoDB integration we need to map the errors and the successful response so that meaningful data is returned to the client.

  const errorResponses = [
            {
                selectionPattern: '400',
                statusCode: '400',
                responseTemplates: { 
'application/json': `{
                        "error": "Bad input!"
                    }`,
                },
            },
            {
                selectionPattern: '5\\d{2}',
                statusCode: '500',
                responseTemplates: {
                    'application/json': `{
                        "error": "Internal Service Error!"
                    }`,
                },
            },
        ]
        const integrationResponses = [
            {
                statusCode: '200',
            },
            ...errorResponses,
        ]


Here I am mapping any error code in the 400 range to a simple Bad Input message and all the errors in the 500 range to Internal Service Error. Of course you can go into more detail and customize the response however you prefer.

You can read more about the mapping template reference here

Now we can create a Validator and a Model where we define our JSON Schema - much like in your Lambda you would validate the payload of the incoming request somehow, possibly with a middleware like Middy and JSON Schema.

The concept is similar: the JSON schema specifies which properties are required and which are optional, and what they should look like.

const requestValidator = new RequestValidator(this, "MyPayloadValidator", {
            restApi: api,
            requestValidatorName: `${name}-${env}-payload-validator`,
            validateRequestBody: true,
        })

const payloadModel = new Model(this, "ValidationModel", {
            restApi: api,
            contentType: "application/json",
            description: "json schema to validate api payload",
            schema: {
                type: JsonSchemaType.OBJECT,
                properties: {
                    userId: {
                        type: JsonSchemaType.STRING
                    },
                    userName: {
                        type: JsonSchemaType.STRING
                    },
                    email: {
                        type: JsonSchemaType.STRING
                    },
                    lang: {
                        description: "Language Code / Locale like it_IT or de_DE",
                        type: JsonSchemaType.STRING
                    }
                },
                required: [
                    "userId",
"userName",
                    "email"
                ]
            }
        })


Now add the CRUD methods to our endpoints and provide all the pieces we created above.

The GET method to retrieve a user by its id is very straightforward.

        userResource.addMethod(
            'GET',
            new AwsIntegration({
                service: 'dynamodb',
                action: 'GetItem',
                options: {
                    credentialsRole: apiRole, 
                    integrationResponses,
                    requestTemplates: {
                        'application/json': `{
                             "Key": {
                                "userId": {
                                    "S": "$method.request.path.userId"
                                }
                              },
"TableName": "${dynamoTable.tableName}"
                        }`,
                    },
                },
            },
            ),
            {
                methodResponses: [
                    {
                        statusCode: '200',
                        responseModels: {
                            'application/json': Model.EMPTY_MODEL,
                        },
                    },
                    {statusCode: '400'},
                    {statusCode: '500'},
                ],
            },
        )

The most interesting thing here is that instead of a Lambda Integration we have an AwsIntegration, and we specify to which AWS service we are proxying the request and which action to invoke:

new AwsIntegration({
            service: 'dynamodb',
            action: 'GetItem'})

Within the integration options we assign the role we created above (credentialsRole: apiRole) - without it our request would fail due to missing permissions to access the table.

Mapping Templates

Another thing to note in the above code is how the mapping template grabs the id from the path parameters with $method.request.path.userId (imagine a request made to myapigateway/users/12345) and assigns it to the Key used to search the table.

Mapping templates are very handy because they allow us to map, manipulate and override whatever we receive in the request.
For example, we can rename properties in case we don't want to expose the property/column names of our database. You will understand better what I mean in the next section.
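The same trick works on the way out. As an illustration (this template is my own sketch, not from the method above, and assumes the attribute names used in the table below), a responseTemplates entry for the GetItem integration could flatten DynamoDB's typed response and rename an attribute for the client:

```typescript
// Hypothetical responseTemplates entry for the GetItem integration.
// DynamoDB returns {"Item": {"userId": {"S": "..."}, ...}}; $input.path
// flattens it into plain JSON, and "locale" is renamed to "language_code".
const getItemResponseTemplate = {
    'application/json': `{
        "userId": "$input.path('$.Item.userId.S')",
        "userName": "$input.path('$.Item.userName.S')",
        "email": "$input.path('$.Item.email.S')",
        "language_code": "$input.path('$.Item.locale.S')"
    }`,
}
```

You would plug such a template into the integrationResponses entry for status 200 instead of returning the raw DynamoDB shape.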

Let's now see the method in charge of the creation of a user.
(I won't go into the details of how the userId is generated - that is out of scope for this tutorial, so to simplify I am just concatenating the unique id of the request with the user name.)

 usersResource.addMethod('POST', new AwsIntegration({
            service: 'dynamodb',
            action: 'PutItem',
            options: {
                credentialsRole: apiRole,
                integrationResponses: [
                    {
                        statusCode: '200',
                        responseTemplates: {
                            'application/json': `{
                "requestId": "$context.requestId"
              }`,
                        },
                    }, ...errorResponses,
                ],
                requestTemplates: {
                    'application/json': 
`#set($ttlEpoch = $context.requestTimeEpoch / 1000 + ${deleteAfterSeconds})
    {
        "Item":
        {
            "userId":
            {
"S": "$context.requestId-$input.path('$.user_name')"
            },
            "correlationId":
            {
                "S": "$context.requestId"
            },
            "status":
            {
                "S": "to_be_confirmed"
            },
            "userName":
            {
                "S":"$input.path('$.user_name')"
            },
            "email":
            {
                "S": "$input.path('$.email')"
            },
            "locale":
            {
                "S": "$input.path('$.language_code')"
            },
            "ttl":
            {
                "N": "$ttlEpoch"
            }
        },
        "TableName": "${dynamoTable.tableName}"
    }`,
                },
            },
        }), {
            requestValidator,
            requestModels: {'application/json': payloadModel},
            methodResponses: [{
                statusCode: '200', responseModels: {
                    'application/json': Model.EMPTY_MODEL,
                },
            },
            {statusCode: '400'},
            {statusCode: '500'}],
        })

Wow, this is quite a lot of stuff. Let's try to break it up.

Again, the response model is necessary to map the response from DynamoDB (remember that we don't have Lambda code to do that), in case we want to hide some values or rename some properties. We also want to map the errors that we might receive if something goes wrong.

And now the juicy part: the request mapping, where we tell API Gateway to use the Model and Validator, and where we specify how to manipulate the payload to rename some properties or create new ones.

This is not strictly necessary: if we have control over the client and we agreed on a specific payload, we can just use it as is. But it might happen that we need to adjust it, like in my example, where user_name was coming from an old PHP application and I want to rename it to userName, keep track of the requestId to facilitate tracing the request and correlated operations in different parts of the infrastructure, or take advantage of the TimeToLive functionality of DynamoDB to delete the item after some time.

#set($ttlEpoch = $context.requestTimeEpoch / 1000 + ${deleteAfterSeconds})

Here, for example, I am taking the requestTimeEpoch from the context of the request to generate the timestamp at which the item will be automatically deleted.
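To make the arithmetic explicit: $context.requestTimeEpoch is in milliseconds, while DynamoDB's TTL attribute expects an epoch value in seconds, hence the division by 1000 before adding the retention window. The same computation as a small sketch (the helper name is mine):

```typescript
// Mirrors the #set($ttlEpoch = ...) line of the mapping template:
// convert the request time from milliseconds to seconds, then add
// the number of seconds the item should live before TTL deletes it.
function ttlEpoch(requestTimeEpochMs: number, deleteAfterSeconds: number): number {
    return Math.floor(requestTimeEpochMs / 1000) + deleteAfterSeconds;
}
```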

Again, this is not really necessary, but whenever I dropped a Lambda in favour of a direct integration I found myself struggling a bit to manipulate the payloads or add properties with VTL, so I thought it might be useful to have a couple of samples.

The last method I will show you is the Deletion.

 userResource.addMethod('DELETE', new AwsIntegration({
      service: 'dynamodb',
      action: 'DeleteItem',
      options: {
           credentialsRole: apiRole, 
           requestTemplates: {
              'application/json': `{
                     "TableName": "${dynamoTable.tableName}",
                     "Key": {
                        "userId": {
                           "S": "$method.request.path.userId"
                            }
                        }
                    }`,
                },
            },
        }),
        {
            methodResponses: [
                {
                    statusCode: '200',
                    responseModels: {
                        'application/json': Model.EMPTY_MODEL,
                    },
                },
                {statusCode: '400'},
                {statusCode: '500'},
            ],
        })


Not much to say here; I think by now the idea is clear and you can basically continue with a PATCH method on the users/{userId} endpoint and so on.
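For completeness, here is a sketch of what such a PATCH method could look like (my own example, not from the steps above, updating only the userName attribute; note that the inline policy we attached to apiRole would also need the dynamodb:UpdateItem action):

```typescript
// Hypothetical PATCH integration in the same style, proxying to UpdateItem.
// ExpressionAttributeNames avoids clashes with DynamoDB reserved words.
userResource.addMethod('PATCH', new AwsIntegration({
    service: 'dynamodb',
    action: 'UpdateItem',
    options: {
        credentialsRole: apiRole,
        integrationResponses,
        requestTemplates: {
            'application/json': `{
                "TableName": "${dynamoTable.tableName}",
                "Key": {
                    "userId": { "S": "$method.request.path.userId" }
                },
                "UpdateExpression": "SET #n = :n",
                "ExpressionAttributeNames": { "#n": "userName" },
                "ExpressionAttributeValues": {
                    ":n": { "S": "$input.path('$.user_name')" }
                }
            }`,
        },
    },
}), {
    methodResponses: [
        {
            statusCode: '200',
            responseModels: { 'application/json': Model.EMPTY_MODEL },
        },
        {statusCode: '400'},
        {statusCode: '500'},
    ],
})
```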

Pros and Cons

Debugging

If you follow this approach, and I really suggest you do, there is only one problem: the testability and debuggability of what is going on.

When you have your logic and validation in the Lambda that proxies the request from API Gateway to DynamoDB, you can run it locally, debug it from your IDE, and have unit tests on the validation and execution logic.

When you have Velocity templates... it is just magic. At least for me: I haven't found a way to test and debug them.
(There is a way of evaluating templates for AppSync, but so far I have not found one for API Gateway.)

What helps, though, is activating API Gateway logs, so that everything that goes on is logged to CloudWatch.

In the CDK it is just a matter of adding a couple of parameters to your API Gateway deployOptions:

deployOptions: {
                stageName: stage,
                loggingLevel: MethodLoggingLevel.INFO,
                dataTraceEnabled: true,
            },

Then in CloudWatch you will be able to see the method request headers, the request body before transformation, the request query string or path parameters (if available), and the request body after transformation (the one produced by your VTL).
You can also see the invocation of your DynamoDB table and its response.
If you have mappings on the response you can take a look at those as well, and understand why something does not work as expected.
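Once the stage is deployed with logging enabled, you can follow the execution logs from the terminal. The log group name below assumes the standard API-Gateway-Execution-Logs naming convention, with a placeholder REST API id and stage:

```shell
# Tail the execution logs of a deployed stage (AWS CLI v2).
# Replace abcde12345 with your REST API id and prod with your stage name.
aws logs tail "API-Gateway-Execution-Logs_abcde12345/prod" --follow
```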

This logging will store a lot of data in CloudWatch though, so be careful because it could get very expensive. Disable it after you have run some tests and you are sure everything is working as expected.

Otherwise, the only way to get a similar result is manually testing the API from the console (that means deploying your stack, then opening your API Gateway in the browser, going to the Integration and Mapping Templates, editing your template there and running Test).

Less is more

Since you don't have the Lambda running (cold-starting or not) you will have lower latency and lower costs (one service less to be counted in your bill). But is it less code?

Not really... especially if you need to use Velocity templates to map your requests and responses and do data transformation, you still need a lot of code. It will live in your IaC rather than in your deployed Lambda, but it is still code that needs to be written and maintained.

Conclusion

I really love how lean and simple this approach is, but I must say that I found working with VTL quite hard: cumbersome to implement and hard or impossible to test (despite the documentation being very extensive). So, as in many circumstances, we need to weigh costs and benefits.

If you need advanced validation, you need to do lots of data manipulation in your requests or responses, and you are not handling millions of requests every day (or second) where latency and costs would really make a difference, I think keeping a Lambda in the system can still be a good idea, since you (and anyone in your team, no matter how junior) can write the logic in your language of preference (Node, Java or Python) and you can write and run unit tests on it.

On the other hand, if your Lambda really does nothing more than validating and proxying requests to DynamoDB (or another service like Step Functions or SQS), if you can keep your mapping templates to a minimum, or if you must shave cents and milliseconds, this approach is definitely the way to go.

Simple, less expensive, with lower latency and fewer pieces of infrastructure to maintain and monitor.
