Looking up logs in the AWS Console is tedious work.
The CloudWatch interface has its limitations.
Using shell commands and tools is much more productive.
One of the lesser-known tools is saw.
In this article I will show you some typical use cases for developing and debugging Lambda functions.
By "debugging" I mean the simplest kind, also known as "print line" debugging ;) where you put some
console.log statements in your code, deploy it, execute it, and watch the output logs.
Assuming you have already downloaded and installed the saw tool (instructions on GitHub), let's prepare the environment so that saw can connect to AWS.
Set the following environment variables:
export AWS_ACCESS_KEY_ID=<put your AWS access key here>
export AWS_SECRET_ACCESS_KEY=<put your AWS secret key here>
export AWS_REGION=<put your AWS region here, e.g. eu-west-1>
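Before watching anything, it is worth checking that saw can actually reach your account. A minimal sketch, assuming the credentials above are exported: saw's groups subcommand lists the CloudWatch log groups visible to those credentials, and the --prefix value here is just an example that narrows the listing to Lambda functions.

```shell
# List CloudWatch log groups, narrowed to Lambda functions.
# Requires the AWS credentials exported above.
saw groups --prefix /aws/lambda
```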
saw watch "/aws/lambda/my-lambda-function"
The above command will fetch logs for the Lambda function named
my-lambda-function, similar to how the
tail -f filename command works in Linux, printing out lines added to a file in real time.
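When a function is chatty, tailing everything gets overwhelming. As far as I know, saw watch also accepts a CloudWatch filter pattern, so you can narrow the live stream to error events only (the function name below is the same example as above):

```shell
# Tail only the log events matching the given CloudWatch filter pattern
saw watch "/aws/lambda/my-lambda-function" --filter ERROR
```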
saw get "/aws/lambda/my-lambda-function" \
  --start 2019-09-17T10:52:00.00Z \
  --stop 2019-09-17T23:00:00.00Z \
  --prefix '2019/09/19/[$LATEST]dc80521928524b82837eae6ee718d217' \
  > function.log
The above command will fetch logs from 2019-09-17, starting at 10:52:00 UTC and ending at 23:00:00 UTC, and redirect them to the file
function.log on your local machine.
In addition, it will filter out logs from a particular log stream (via the
--prefix option). These arguments are optional, but I would recommend using them unless you want to fetch all the logs for a particular function, which can be a huge amount of data.
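If you do not know which stream prefix to pass, saw can list the log streams in a group as well. A hedged sketch: the streams subcommand prints the streams of the given group, and the date prefix here is an example that narrows the listing to streams created that day.

```shell
# List log streams for the function, filtered by a date prefix
saw streams "/aws/lambda/my-lambda-function" --prefix 2019/09/19
```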
Having the logs in a local file is a big advantage, as you can further parse them or search with tools like
grep, open them in an editor, etc.
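For example, once function.log is on disk, standard Unix tooling applies. A quick sketch: the printf simply fabricates a few sample lines in the shape of Lambda output, standing in for a real saw get download, so the grep commands have something to work on.

```shell
# Fabricate a tiny sample log (stand-in for a real saw get download)
printf '%s\n' \
  'START RequestId: abc-123 Version: $LATEST' \
  'ERROR Task timed out after 3.00 seconds' \
  'END RequestId: abc-123' > function.log

# Count error lines
grep -c 'ERROR' function.log   # → 1

# Show each error with one line of context before and after
grep -B1 -A1 'ERROR' function.log
```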
How would you know which log stream to query? You will find the answer to this question in my other article: Tracking failed SQS messages.