DEV Community

Discussion on: Create your first AWS Billing Alarm

 
Rehan van der Merwe

Sorry, I guess this is a bit off topic from your post. The only reason @andrewbrown suggested putting SQS in front of DynamoDB is that they are most likely running their tables with the Provisioned Capacity billing mode, and with that they are using auto-scaling. Most AWS services auto-scale the same way: by creating multiple CloudWatch alarms on a metric for that service. If the consumed Write Capacity Units (WCU, basically the rate at which data is being written to DynamoDB) exceed a threshold, the alarm triggers and performs an action. In this case it increases the provisioned Write Capacity Units (up to the configured maximum) so that writes do not get throttled.
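Roughly what that looks like if you set it up with the CDK (just a minimal sketch to illustrate; the table name, capacity numbers and utilization target are placeholders, not their actual settings):

```typescript
import { Stack, StackProps } from 'aws-cdk-lib';
import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';
import { Construct } from 'constructs';

export class OrdersStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Provisioned billing mode: you reserve (and pay for) RCU/WCU up front.
    const table = new dynamodb.Table(this, 'OrdersTable', {
      partitionKey: { name: 'pk', type: dynamodb.AttributeType.STRING },
      billingMode: dynamodb.BillingMode.PROVISIONED,
      readCapacity: 5,
      writeCapacity: 5,
    });

    // Under the hood this creates CloudWatch alarms on the consumed WCU
    // metric; when they fire, Application Auto Scaling adjusts the
    // provisioned write capacity between min and max.
    table
      .autoScaleWriteCapacity({ minCapacity: 5, maxCapacity: 100 })
      .scaleOnUtilization({ targetUtilizationPercent: 70 });
  }
}
```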

So by controlling the rate at which data is written into DynamoDB, they are limiting the number of times the CloudWatch alarms trigger auto-scaling actions. This reduces their bill; as he explains, these alarms once accumulated costs of around $30. That's a lot, you can basically run an EC2 t3.medium for an entire month for that price.

Rehan van der Merwe

He can only do this because when he writes, he writes to the queue, then returns immediately and does not wait until the data is written into DynamoDB. Their systems are async and most likely subject to eventual consistency. If you don't design your systems to be async from the start it requires some redesign; that is all I meant.
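As a very rough sketch of that async write path (the queue URL, table name and handler names below are made up for illustration, not their actual code):

```typescript
import { SQSClient, SendMessageCommand } from '@aws-sdk/client-sqs';
import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBDocumentClient, PutCommand } from '@aws-sdk/lib-dynamodb';
import type { SQSHandler } from 'aws-lambda';

const sqs = new SQSClient({});
const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

// Producer side (e.g. in an API handler): enqueue and return immediately,
// without waiting for the item to reach DynamoDB.
export async function enqueueWrite(item: Record<string, unknown>): Promise<void> {
  await sqs.send(new SendMessageCommand({
    QueueUrl: process.env.QUEUE_URL,     // hypothetical env var
    MessageBody: JSON.stringify(item),
  }));
}

// Consumer side: a Lambda subscribed to the queue. The event source
// mapping's batch size and concurrency effectively cap the write rate
// into DynamoDB, keeping WCU consumption smooth.
export const handler: SQSHandler = async (event) => {
  for (const record of event.Records) {
    await ddb.send(new PutCommand({
      TableName: process.env.TABLE_NAME, // hypothetical env var
      Item: JSON.parse(record.body),
    }));
  }
};
```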

On the reading side I'm sure you get why a cache helps, but again you need to design that logic into your application. With something like ElastiCache (Redis) you can do a read-through cache (and most likely write-through); this logic sits at your application level. Alternatively you can use DAX, which is an AWS-managed cache built specifically for DynamoDB that intercepts the API calls to DynamoDB, acting as both a read-through and write-through cache. DynamoDB is a whole can of worms that probably shouldn't be opened in a comment on billing alarms, DM me if you want to know more.
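If it helps, here is a minimal read-through sketch at the application level using ioredis and the DynamoDB DocumentClient (the key shape, table name and TTL are assumptions for the example):

```typescript
import Redis from 'ioredis';
import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBDocumentClient, GetCommand } from '@aws-sdk/lib-dynamodb';

const redis = new Redis(process.env.REDIS_URL!); // ElastiCache endpoint
const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

// Read-through: check Redis first, fall back to DynamoDB on a miss,
// then populate the cache so the next read is served from Redis.
export async function getItem(pk: string): Promise<unknown | null> {
  const cached = await redis.get(pk);
  if (cached !== null) return JSON.parse(cached);

  const result = await ddb.send(new GetCommand({
    TableName: process.env.TABLE_NAME,   // hypothetical table
    Key: { pk },
  }));
  if (!result.Item) return null;

  await redis.set(pk, JSON.stringify(result.Item), 'EX', 300); // 5 min TTL
  return result.Item;
}
```

With DAX the idea is the same, except the caching happens in front of the DynamoDB API instead of in your own code.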

Andrew Brown šŸ‡ØšŸ‡¦

I didn't suggest ElastiCache because it has to be in the same VPC, which makes ElastiCache a pain; as such it's likely a much more expensive solution than using DAX (again, I have yet to compare the pricing). I also wasn't certain how ElastiCache would work with DynamoDB Global Tables, which is a common upgrade path for DynamoDB, so I did not want to suggest an expensive and technical dead-end solution.

Not arguing, just more food for thought, and taking this conversation well off course.

Rehan van der Merwe

Oh I see, sorry about that. From what I can remember, I did a quick comparison when DAX had just come out and it was more expensive for our use case at the time. But that was back then.

Hehe, but if you hadn't taken it off course I wouldn't have learned something new. Thanks!

Haseeb Burki

This has been really informative. I have a production-level app deployed on AWS (API, DB, frontend). I've learned all the services I've had to encounter from Google, blogs, and documentation. Unfortunately, I have not had the need to tinker with DynamoDB and a number of other important services.

I have started a learning path for the AWS Solutions Architect certification and will be writing more blogs as a means of keeping notes and sharing what I learn.

Denis

Hi Andrew, can you recommend a video or article where I can find a step-by-step implementation of that process? I mean putting the record onto SQS, processing it with Lambda, and saving it to DynamoDB to reduce the cost of the DB? Thank you in advance!