Maximizing Speed, Costs, UX - AWS ElastiCache Serverless

System performance is directly correlated with revenue. That's a mantra worth memorizing, and if you have any doubts, a few examples make the case. Walmart found that every 1-second improvement in page load time increased conversions by 2%. COOK found that reducing page load time by 0.85 seconds produced a 7% increase in conversions. Mobify found that each 100 ms improvement in their homepage's load time resulted in a 1.11% increase in conversions. Finally, a study by HubSpot found that even a few milliseconds can significantly impact user experience (UX), conversion rates, and ultimately revenue. There you go; I'm sure you get the idea now.

This post is based on the presentation I did for DevOps 2024. If you don't feel like reading, you can watch the video here.

Just to be clear, the effort to increase performance is not a new problem. For decades, developers have tried many things, and one that has worked like magic is caching frequently accessed data. Caching is a mechanism, in hardware or software, that stores frequently accessed data for faster retrieval than the original source (typically a database) can offer, resulting in high performance and low-latency access. Like anything in life, this solution has its pros and cons.

Primary advantages:

  • Low Latency: Enables real-time responses.
  • High Throughput: Supports a significant volume of data processing.
  • High Scalability: Easily scales to handle increasing workloads.

Potential disadvantages:

  • Data Consistency: Caching poses challenges in maintaining consistency between cached and original data, leading to potential errors if updates occur in the backend.
  • Cache Invalidation: Determining when to refresh or invalidate cached data is critical to prevent outdated information, requiring effective strategies to sync with changes in the source.
  • Resource Overhead: Managing caches introduces additional resource overhead, potentially increasing storage costs and resource contention, impacting overall system performance.
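
To make the consistency and invalidation trade-offs concrete, here is a minimal cache-aside sketch in Python. The in-process dictionary stands in for ElastiCache, and fetch_product_from_db, the one-minute TTL, and the product ID are all illustrative assumptions; the point is that every read must tolerate data that is stale until the TTL expires or the key is invalidated on write.

```python
import time

# Illustrative in-process cache; in production this would be ElastiCache.
_cache = {}                 # key -> (value, expires_at)
CACHE_TTL_SECONDS = 60      # assumed TTL; tune to how much staleness you accept

def fetch_product_from_db(product_id):
    """Hypothetical slow source of truth (a database call)."""
    time.sleep(0.2)
    return {"id": product_id, "name": f"Product {product_id}"}

def get_product(product_id):
    """Cache-aside read: try the cache first, fall back to the database."""
    entry = _cache.get(product_id)
    if entry and entry[1] > time.time():
        return entry[0]                               # cache hit
    value = fetch_product_from_db(product_id)         # cache miss
    _cache[product_id] = (value, time.time() + CACHE_TTL_SECONDS)
    return value

def update_product(product_id, new_value):
    """Write path: update the source of truth, then invalidate the cached copy."""
    # ...write new_value to the database here...
    _cache.pop(product_id, None)                      # explicit invalidation

print(get_product("AERD10001"))   # slow: cache miss, reads the database
print(get_product("AERD10001"))   # fast: served from the cache
```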

AWS in memory cache options

AWS has already provided great options to support in-memory caches, and you have most probably been using them to speed up your systems. It's already widely used across the industry. The three most famous AWS in-memory cache options are:

- ElastiCache for Memcached - Simple, non-persistent caching
- ElastiCache for Redis - Adds persistence, replication, and more capabilities
- MemoryDB for Redis - Optimized for ultra-low, sub-millisecond-latency applications

Challenges in Server-Based In-Memory Implementations

While we have been using and are happy with these in-memory cache options, developers still face plenty of challenges:

  • Managing Capacity: Capacity management in traditional server-based in-memory implementations relies on peak points, causing performance impacts during spikes—an inherent challenge.
  • Cost Overhead: Implementations may suffer from either over-provisioning or under-provisioning, leading to cost inefficiencies.
  • Scaling Complexity: Scaling traditional in-memory databases requires intervention and careful capacity planning, introducing complexity.
  • Infrastructure Management Burden: Managing servers for in-memory databases involves significant operational tasks, including provisioning, patching, and monitoring.
  • Operational Overhead: Operational tasks in traditional in-memory databases can be time-consuming, diverting focus from development efforts.
  • Development Slowdown: Initial setup and ongoing maintenance efforts in traditional approaches may impede development speed.
  • Manual High Availability Setup: Ensuring high availability in traditional in-memory databases necessitates manual implementation of redundancy and failover mechanisms.

Challenges in Capacity Management

Getting capacity management right is difficult. Either you are under-provisioned, and during spikes latency climbs beyond your threshold and hurts the end-user experience, or you are over-provisioned, leaving a lot of unused resources and unnecessary cost. Remember: under-provisioning impacts performance, and over-provisioning wastes money. Most probably, you will never get it exactly right.
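
To put rough numbers on that trade-off, here is a tiny sketch. Every figure in it is hypothetical (node price, peak size, peak duration); the point is only that sizing for peak means paying for idle capacity most of the day.

```python
# All numbers below are hypothetical, purely to illustrate the trade-off.
node_hourly_cost = 0.15   # assumed on-demand price per cache node, per hour
peak_nodes = 8            # nodes needed during the daily traffic spike
baseline_nodes = 2        # nodes needed the rest of the day
peak_hours = 3            # hours per day spent at peak load

# Cost if capacity could magically track demand vs. provisioning for peak 24/7.
ideal_cost = (peak_nodes * peak_hours
              + baseline_nodes * (24 - peak_hours)) * node_hourly_cost
peak_provisioned_cost = peak_nodes * 24 * node_hourly_cost

print(f"Capacity tracking demand: ${ideal_cost:.2f}/day")
print(f"Provisioned for peak:     ${peak_provisioned_cost:.2f}/day")
print(f"Idle capacity paid for:   ${peak_provisioned_cost - ideal_cost:.2f}/day")
```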

Now there is a way around this cat-and-mouse problem.

Amazon ElastiCache Serverless

Yes, you now have a Serverless option, where you don't have to worry about provisioning capacity; AWS takes care of it. Naturally, you get all the usual serverless benefits with ElastiCache Serverless, along with many others:

Rapid Deployment

ElastiCache Serverless lets you create a cache in under a minute, streamlining the process of adding a caching layer and reducing time-to-market for applications.

Automatic Capacity Management

With AWS ElastiCache, users are relieved from the intricacies of capacity management, as the service dynamically adjusts to accommodate varying workloads, ensuring optimal performance without the need for manual intervention.

Consistently Low Latency

Leveraging AWS ElastiCache provides users with remarkable latency performance, achieving an impressive 700 microseconds at p50 and 1.3 milliseconds at p99, ensuring that applications deliver responsive and timely experiences to end-users.

Scalable Storage Capacity

ElastiCache Serverless scales storage automatically, supporting up to 5 terabytes of data per cache, so there is ample capacity for growing data requirements and expanding workloads.

Cost-Efficient Pay-Per-Use Model

Operating on a pay-per-use model, AWS ElastiCache ensures cost efficiency by allowing users to pay only for the resources consumed, making it a financially prudent choice for organizations seeking flexibility in managing their caching costs.

High Availability Assurance

AWS ElastiCache boasts a robust 99.99% availability Service Level Agreement (SLA), assuring users of a highly reliable caching solution that minimizes downtime and ensures uninterrupted access to cached data.

Simplified Endpoint Management

Users benefit from a streamlined experience with AWS ElastiCache, as it provides a single endpoint for caching operations, simplifying the management and interaction with the caching layer in the application architecture.

Compliance and Security Standards

AWS ElastiCache adheres to stringent security standards, being PCI-DSS, SOC compliant, and HIPAA eligible, providing users with a secure and regulatory-compliant caching solution for handling sensitive data in various industries.

Pricing for ElastiCache

AWS has introduced a simple pricing model based on data stored and on ElastiCache Processing Units (ECPUs). Let's go through it in a little more detail.

Data stored: Pay for ElastiCache Serverless based on data stored, measured in gigabyte-hours (GB-hrs). Continuous monitoring calculates hourly averages, and each cache is metered for a minimum of 1 GB.

ElastiCache Processing Units (ECPUs): Pay for requests in ECPUs, covering vCPU time and data transfer. Each read or write consumes 1 ECPU per kilobyte (KB) transferred. Additional vCPU time or data transfer over 1 KB scales ECPUs proportionally.
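
To see how the two dimensions add up, here is a back-of-the-envelope estimate. The per-GB-hour and per-million-ECPU rates below are placeholders, not quoted prices; check the ElastiCache pricing page for your region before relying on any of this.

```python
# Rough monthly estimate for ElastiCache Serverless.
# Rates are placeholders -- look up the real ones for your region.
price_per_gb_hour = 0.125         # assumed $ per GB-hour of data stored
price_per_million_ecpu = 0.0034   # assumed $ per million ECPUs

avg_data_gb = 3.0                 # average data stored (metered minimum is 1 GB)
requests_per_second = 2_000
avg_request_kb = 2                # each KB read or written consumes 1 ECPU

hours_per_month = 730
storage_cost = max(avg_data_gb, 1.0) * hours_per_month * price_per_gb_hour

ecpus_per_month = requests_per_second * avg_request_kb * 3600 * hours_per_month
ecpu_cost = ecpus_per_month / 1_000_000 * price_per_million_ecpu

print(f"Storage:  ${storage_cost:,.2f}/month")
print(f"Requests: ${ecpu_cost:,.2f}/month")
print(f"Total:    ${storage_cost + ecpu_cost:,.2f}/month")
```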

Time to get our hands dirty - getting started with AWS ElastiCache Serverless.

  • Navigate to Amazon ElastiCache: Access the AWS Management Console and locate the Amazon ElastiCache service.
  • Choose Memcached and Create Cache: Within the ElastiCache dashboard, select the Memcached option to create a Memcached cache cluster.

AWS ElastiCache Serverless Step 1

  • Select Serverless Configuration: Choose the Serverless configuration option, allowing AWS to dynamically manage capacity based on the application's workload. This eliminates the need for manual capacity planning and adjustments.

AWS ElastiCache Serverless Step 2

  • Provide a Descriptive Name: Assign a meaningful and descriptive name to your Memcached cache cluster. This name serves as an identifier for easy management and tracking within your AWS environment.
  • Initiate Cache Creation: Once the configuration is complete, proceed to create the Memcached cache cluster. You can initiate the creation process by selecting the "Create" button.
  • Monitor Cache Creation Progress: AWS ElastiCache will start creating the Memcached cache cluster. You can monitor the progress directly within the AWS Management Console. The creation process typically takes less than a minute, offering a quick and efficient setup.

AWS ElastiCache Serverless Step 3

  • Access and Utilize the Cache: Once the Memcached cache is successfully created, you can access it using the assigned name and begin utilizing it in your applications. Leverage the provided endpoint to connect your applications to the caching layer, enhancing performance and reducing latency.
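
If you prefer to script the same steps, the sketch below uses boto3. It assumes a recent boto3 release that includes the ElastiCache Serverless APIs (create_serverless_cache, describe_serverless_caches), credentials with the right permissions, and an illustrative cache name.

```python
import time

import boto3

# Assumes a boto3 version with the ElastiCache Serverless APIs and
# AWS credentials/permissions already configured; the name is illustrative.
elasticache = boto3.client("elasticache")

elasticache.create_serverless_cache(
    ServerlessCacheName="product-cache",
    Engine="memcached",
    Description="Serverless cache for product lookups",
)

# Poll until the cache is available, then print the endpoint to connect to.
while True:
    cache = elasticache.describe_serverless_caches(
        ServerlessCacheName="product-cache"
    )["ServerlessCaches"][0]
    if cache["Status"] == "available":
        endpoint = cache["Endpoint"]
        print(f"Connect to {endpoint['Address']}:{endpoint['Port']}")
        break
    time.sleep(10)
```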

Finally, let's access what we have created.

AWS ElastiCache Serverless Step 4

Let's dive into a little more detail about how to access the cache. It's as simple as a few commands:

| Command | Description |
| --- | --- |
| `/usr/bin/openssl s_client` | Initiates a secure (TLS) connection to the Memcached server using OpenSSL. |
| `-connect <Memcached endpoint>:11212` | Specifies the endpoint and port (11212) of the Memcached server. |
| `-crlf` | Translates line feeds into CR+LF pairs, as the Memcached text protocol expects. |
| `set product_id 0 0 9` | Stores the key `product_id` with flags 0, an expiration of 0 (never expires), and a data size of 9 bytes. |
| `AERD10001` | The 9-byte value assigned to the key `product_id`. |
| `STORED` | Server response confirming the data was stored successfully. |
| `get product_id` | Retrieves the value of the key `product_id`. |
| `VALUE product_id 0 9` | Server response indicating that `product_id` has flags 0 and a data size of 9 bytes, followed by the value. |
| `AERD10001` | The value stored under `product_id`. |
| `END` | Marks the end of the response. |
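
The same interaction from application code looks like this. The sketch uses pymemcache as an example client (any Memcached client with TLS support will do); ElastiCache Serverless requires TLS, and the endpoint and port placeholders must be replaced with the values shown in the console.

```python
import ssl

from pymemcache.client.base import Client  # pip install pymemcache

# ElastiCache Serverless only accepts TLS connections.
tls_context = ssl.create_default_context()

# Replace the endpoint/port with the values from your cache's detail page.
client = Client(("<your-cache-endpoint>", 11212), tls_context=tls_context)

client.set("product_id", "AERD10001", expire=0)  # expire=0 -> never expires
print(client.get("product_id"))                  # b'AERD10001'
```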

Finally, let's go through a few Anti-Patterns to Avoid:

AWS ElastiCache Serverless Anti-Patterns to Avoid

Before wrapping up, let's go through some best practices you should follow.

AWS Serverless Best Practices to Follow

That's a wrap. Enjoy your journey into the Serverless world of ElastiCache.

Need further details? Check out https://aws.amazon.com/pm/elasticache/
