Let's tackle the second part of our little project, which started here.
In the first part we managed to invoke the GeoIP lambda successfully in a local environment, giving us insights about our location. But the job is not complete until we can execute that function in the cloud, so I am going to give you a fair solution. It is not the most optimal one, but at least it points us in the right direction.
Prerequisites
Remember that you will need a few things installed and working on your machine before starting:
- Docker installed on your machine
- An active AWS account
- AWS SAM CLI working on your machine
- AWS CLI
Create our lambda function in AWS
There is plenty of detailed literature out there about creating a lambda function in AWS, so I am going to skip a few steps:
- Visit the Lambda dashboard
So, now we are ready to upload our code to this lambda function.
Manual upload
Assuming we are at the root of the project, we can execute a simple zip command to bundle the entire project: zip function.zip * -r
Now we return to the lambda function dashboard and upload this zip file as the code.
After uploading the zip file, an info banner will catch your attention:
Do not worry about that right now; let's check that the functionality is correct. To do so, we run the function again using the Test button in the upper right corner of the UI:
That error was expected, since we built the test event without taking the function logic into account. Now we create a new test event and paste in our current event.json, built for the local environment:
And now we execute the function again:
It works, but this is not the best way to do it; it is far from a continuous deployment pipeline. We are going to go one step further and build a command to deploy the code to the cloud without using the console.
AWS CLI ready for service!
Add a new NPM script to your package.json:
{
...
"deploy": "npm prune --production && zip function.zip * -r && aws lambda update-function-code --function-name geoIPNode --zip-file fileb://function.zip --profile personal && rm function.zip && npm install",
...
}
We are cleaning the node_modules folder to upload only production packages, building the zip file, updating the lambda function, and re-installing the packages deleted in the prune step.
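Spelled out step by step, the chained deploy script runs the equivalent of the following commands (function name and AWS profile are the ones from the script above; adjust them to your setup):

```shell
# 1. Remove devDependencies so the bundle only ships production packages
npm prune --production

# 2. Bundle the whole project (code + node_modules) into a zip
zip function.zip * -r

# 3. Push the new bundle to the existing lambda function
aws lambda update-function-code \
  --function-name geoIPNode \
  --zip-file fileb://function.zip \
  --profile personal

# 4. Clean up the local artifact and restore the pruned dev packages
rm function.zip
npm install
```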
You can safely use npm run deploy to update your lambda whenever you want.
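After a deploy, you can also verify the function from the terminal instead of the console's Test button. A quick check, reusing the same event.json and the personal profile from the deploy script:

```shell
# Invoke the deployed lambda with the same test event used locally
# (function name and profile are assumptions; adjust to yours)
aws lambda invoke \
  --function-name geoIPNode \
  --payload fileb://event.json \
  --profile personal \
  response.json

# Inspect the result returned by the function
cat response.json
```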
This is still not the best way to handle the process, because we can and should improve it much further, but that will not happen in this post. I will leave you some ideas to improve memory and time usage for the function:
- Upload the GeoIP database (which is the biggest file) to S3 and load it within the code using the AWS SDK for JS.
- Upload the GeoIP database to EFS (Amazon Elastic File System), which can be shared across any lambda function.
Both ideas would allow you to edit the code in the lambda editor, since the uploaded package would no longer exceed the size limit.
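As a sketch of the first idea, the database could be pushed to S3 with the CLI. The bucket name and database filename here are assumptions; adjust them to your setup:

```shell
# Upload the GeoIP database to a bucket (hypothetical names)
aws s3 cp ./GeoLite2-City.mmdb s3://my-geoip-assets/GeoLite2-City.mmdb \
  --profile personal
```

The lambda would then download the file at cold start with the AWS SDK instead of shipping it inside the zip.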
API Gateway
We just need an endpoint to reach the lambda function from the Internet. Visit the API Gateway dashboard and let's start:
With your API built, you may access the operational URL served by AWS with your query string, such as:
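For instance, a call could look like the following. The invoke URL, region, and parameter name are placeholders; yours will differ depending on the API and the function logic:

```shell
# Hypothetical invoke URL generated by API Gateway
curl "https://abc123.execute-api.eu-west-1.amazonaws.com/default/geoIPNode?ip=8.8.8.8"
```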
The end
I wanted to give you a broader view of using a simple AWS service, such as Lambda, to build a solution around something useful. I hope you enjoy the content as much as I enjoyed writing it for everyone. Here I link the repository used.