In my last post I covered creating a lambda & API Gateway to accept a POST request. Now let's save that request to a table in DynamoDB and send it to a queue in SQS for processing.
Creating the Orders Table
We create a new table by updating the template.yaml file. Before defining the new resource, let's define a few common properties that we can leverage for all resources. Add a Parameters section to the template under Description, with the following values -
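A minimal sketch of that section; the parameter name and default here are assumptions, so adjust them to match your project -

```yaml
Parameters:
  AppName:
    Type: String
    Default: serverless-arch-example
    Description: Prefix used when naming resources
```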
These parameters are used when setting up environment variables for various resources.
Under Resources, we add a resource for the DynamoDB table -
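A sketch of the table resource, assuming a simple string id partition key and on-demand billing -

```yaml
  OrdersTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: !Sub "${AppName}-orders"
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
      BillingMode: PAY_PER_REQUEST
```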
Rather than defining environment variables for each function, we are going to define them under Globals so that they are available to all the lambda functions we create.
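For example, with ORDERS_TABLE as the assumed variable name -

```yaml
Globals:
  Function:
    Environment:
      Variables:
        ORDERS_TABLE: !Sub "${AppName}-orders"
```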
We also need to update the properties for the Create function so that it has the required permissions to insert records into the DynamoDB table. We add an AWS managed policy, AmazonDynamoDBFullAccess -
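Assuming the function's logical ID is CreateFunction, the addition looks like -

```yaml
  CreateFunction:
    Type: AWS::Serverless::Function
    Properties:
      # ...existing properties from the last post...
      Policies:
        - AmazonDynamoDBFullAccess
```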
Creating the Orders Queue
We create an SQS queue for the orders by adding another resource to the template.yaml file -
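A sketch, with the queue name derived from the AppName parameter as an assumption -

```yaml
  OrdersQueue:
    Type: AWS::SQS::Queue
    Properties:
      QueueName: !Sub "${AppName}-orders-queue"
```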
We need to tell the lambda function about the queue, so add another environment variable under Globals > Function > Environment > Variables -
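Since !Ref on an AWS::SQS::Queue resource returns the queue URL, the variable can be set like this -

```yaml
Globals:
  Function:
    Environment:
      Variables:
        ORDERS_TABLE: !Sub "${AppName}-orders"
        SQS_QUEUE: !Ref OrdersQueue
```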
We also need to give the Create lambda access to add orders to the queue. Add another policy to the Policies list for the function -
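One option is SAM's SQSSendMessagePolicy policy template, which grants send-message access to a single queue; a sketch -

```yaml
      Policies:
        - AmazonDynamoDBFullAccess
        - SQSSendMessagePolicy:
            QueueName: !GetAtt OrdersQueue.QueueName
```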
The template.yaml should look like this -
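A consolidated sketch combining the snippets above; the logical IDs, image packaging details, and /create API path are assumptions carried over from the last post -

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31
Description: serverless-arch-example

Parameters:
  AppName:
    Type: String
    Default: serverless-arch-example

Globals:
  Function:
    Timeout: 30
    Environment:
      Variables:
        ORDERS_TABLE: !Sub "${AppName}-orders"
        SQS_QUEUE: !Ref OrdersQueue

Resources:
  CreateFunction:
    Type: AWS::Serverless::Function
    Properties:
      PackageType: Image
      Policies:
        - AmazonDynamoDBFullAccess
        - SQSSendMessagePolicy:
            QueueName: !GetAtt OrdersQueue.QueueName
      Events:
        Create:
          Type: Api
          Properties:
            Path: /create
            Method: post
    Metadata:
      Dockerfile: Dockerfile
      DockerContext: ./src
      DockerTag: v1

  OrdersTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: !Sub "${AppName}-orders"
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
      BillingMode: PAY_PER_REQUEST

  OrdersQueue:
    Type: AWS::SQS::Queue
    Properties:
      QueueName: !Sub "${AppName}-orders-queue"
```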
DB Helper
Under the src directory, create a new directory called “db” and create the following 4 files under it -
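Based on how they are used later in this post, those are likely the following (the __init__.py, which makes the directory importable as a package, is an assumption) -

- __init__.py
- docker-compose.yml
- init_db.py
- db_helper.py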
To test database changes locally, we need to spin up a local instance of DynamoDB and manually create a table in it.
We can use docker to spin up an instance of DynamoDB. Update docker-compose.yml with the following -
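A minimal sketch using the official amazon/dynamodb-local image -

```yaml
version: "3.8"
services:
  dynamodb-local:
    image: amazon/dynamodb-local
    container_name: dynamodb-local
    ports:
      - "8000:8000"
```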
In the terminal, go to the db folder and spin up a DynamoDB instance -
cd src/db
docker-compose up -d
The output should look like -
Creating dynamodb-local ... done
To create the Orders table locally, update the init_db.py file with the following code -
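A sketch, assuming the string id key from the template; the dummy credentials just satisfy boto3, since DynamoDB local ignores them -

```python
import boto3

# Point boto3 at the local docker instance rather than the real service
dynamodb = boto3.resource(
    "dynamodb",
    endpoint_url="http://localhost:8000",
    region_name="us-east-1",
    aws_access_key_id="local",
    aws_secret_access_key="local",
)

table = dynamodb.create_table(
    TableName="serverless-arch-example-orders",
    KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",
)

# Wait until the table is ready, then report its status
table.meta.client.get_waiter("table_exists").wait(
    TableName="serverless-arch-example-orders"
)
table.reload()
print(f"Table status: {table.table_status}")
```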
Add boto3 to the requirements.txt file and run pip install again -
cd ..
pip install -r requirements.txt
Run init_db.py to create the table on the local DynamoDB instance -
python db/init_db.py
If the table is created, you should see the following output -
Table status: ACTIVE
**Note:** You can use NoSQL Workbench to view the tables in your local DynamoDB instance.
Update db_helper.py with the following code -
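A sketch of the helper; the function names and the DB_ENDPOINT variable (set only for local runs to point at the docker instance) are assumptions -

```python
import os

import boto3

# ORDERS_TABLE comes from the template's Globals; DB_ENDPOINT is assumed to be
# set only for local runs, pointing at the docker DynamoDB instance
ORDERS_TABLE = os.environ.get("ORDERS_TABLE", "serverless-arch-example-orders")
DB_ENDPOINT = os.environ.get("DB_ENDPOINT") or None

dynamodb = boto3.resource("dynamodb", endpoint_url=DB_ENDPOINT)
table = dynamodb.Table(ORDERS_TABLE)


def save_order(order):
    """Insert (or overwrite) an order record."""
    table.put_item(Item=order)


def update_order_status(order_id, status):
    """Update the status attribute of an existing order."""
    table.update_item(
        Key={"id": order_id},
        UpdateExpression="SET #s = :s",
        ExpressionAttributeNames={"#s": "status"},  # status is a reserved word
        ExpressionAttributeValues={":s": status},
    )
```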
Putting it all together
Update create.py to send the request to the sqs queue and update the order status in the table -
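A sketch of the updated handler; the request fields and any validation from the last post are omitted, and the db_helper functions are the assumed ones from above -

```python
import json
import os
import uuid

import boto3

from db import db_helper

sqs = boto3.client("sqs")
SQS_QUEUE = os.environ.get("SQS_QUEUE")


def lambda_handler(event, context):
    body = json.loads(event.get("body") or "{}")

    # Save the incoming request to the orders table first
    order = {"id": str(uuid.uuid4()), "status": "Created", **body}
    db_helper.save_order(order)

    # Hand the order off to SQS for processing, then mark it as queued
    sqs.send_message(QueueUrl=SQS_QUEUE, MessageBody=json.dumps(order))
    db_helper.update_order_status(order["id"], "Queued")

    return {
        "statusCode": 200,
        "body": json.dumps({"id": order["id"], "status": "Queued"}),
    }
```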
For this to work in a container, we need to add another line to the Dockerfile -
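Assuming the db directory sits next to create.py under src and the image is built from one of the AWS Lambda python base images, the added line would look something like -

```dockerfile
# Copy the db helper package into the image alongside the handler
COPY db/ ${LAMBDA_TASK_ROOT}/db/
```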
Testing locally
Build the app locally -
sam build
Since we have specified environment variables in our template, we need to provide those env vars when running the code locally.
Create a new file under the tests folder called “env.json” with the following values -
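Something like the following, assuming the function's logical ID is CreateFunction and using host.docker.internal so the lambda container can reach the local DynamoDB container -

```json
{
  "CreateFunction": {
    "ORDERS_TABLE": "serverless-arch-example-orders",
    "DB_ENDPOINT": "http://host.docker.internal:8000",
    "SQS_QUEUE": ""
  }
}
```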
**Note:** SQS_QUEUE is empty since we can't host an SQS queue locally. We will update that value with the deployed queue URL once the changes are deployed.
Run the app locally -
sam local start-api --env-vars ./tests/env.json
Test the create endpoint from Postman like before -
Since we did not set up the queue locally, an error is expected. We should, however, be able to see a new record created in the DynamoDB table for the request -
Deploy the app
Run the following command to deploy the app to AWS -
sam deploy
The output should look like this -
Test the changes by making the POST call to create a request from Postman like before -
We can verify the request was saved to the table by going to AWS console > DynamoDB > Tables > Explore items > serverless-arch-example-orders
The request status is also set to "Queued", indicating that the request was sent to SQS.
We can validate that as well by going to AWS console > Amazon SQS and confirming that the value under "Messages available" is now 1.
You can view the message that was sent by clicking on the queue name > Send and receive messages > Poll for messages
Click on the message ID to view the message
Source Code
Here is the source code for the project created in this post.
Next: Part 4: Web Scraping with Selenium & AWS Lambda