Task Statement 3.3: Determine high-performing database solutions.
📘 AWS Certified Solutions Architect – Associate (SAA-C03)
What is Caching?
Caching is a way to store frequently used data in a fast storage layer so that applications can access it quickly instead of repeatedly querying slower databases. Think of it as a shortcut for data retrieval.
Why use caching?
- Reduce latency: access data faster than reading from the database every time.
- Reduce database load: fewer database queries mean the database can serve more users.
- Improve scalability: applications can absorb sudden spikes in traffic.
Types of Caching in AWS
AWS provides caching mainly through Amazon ElastiCache. There are two popular caching engines:
- Redis (via ElastiCache for Redis)
- Memcached (via ElastiCache for Memcached)
1. Amazon ElastiCache for Redis
Redis is a key-value in-memory store. It is very fast and supports advanced features.
Key features:
- Persistence: Redis can save data to disk (snapshots and append-only files), so it isn’t lost if the server restarts.
- Replication: Supports read replicas to distribute read traffic.
- Data structures: Can store strings, hashes, lists, sets, sorted sets, etc.
- Pub/Sub: Can be used for messaging between applications.
- High availability: Supports Redis Cluster and automatic failover.
Use cases in IT:
- Session storage: Keep user session data for web apps.
- Leaderboards: Quickly access sorted data for gaming or analytics.
- Caching database queries: Store results of expensive queries to reduce database load.
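The leaderboard use case maps directly onto Redis sorted sets (`ZADD` to record a score, `ZREVRANGE` to read the top entries via a client such as redis-py). The sketch below is a pure-Python stand-in that mimics that behavior with a plain dict, so the idea can be run without a Redis server; the class and method names are illustrative, not a Redis API.

```python
class Leaderboard:
    """Pure-Python stand-in for a Redis sorted-set leaderboard.
    In production you would call zadd/zrevrange against an
    ElastiCache for Redis endpoint instead."""

    def __init__(self):
        self._scores = {}  # member -> score

    def add_score(self, member, score):
        """Like ZADD: set (or overwrite) the member's score."""
        self._scores[member] = score

    def top(self, n):
        """Like ZREVRANGE 0 n-1 WITHSCORES: highest scores first."""
        ranked = sorted(self._scores.items(), key=lambda kv: kv[1], reverse=True)
        return ranked[:n]

board = Leaderboard()
board.add_score("alice", 120)
board.add_score("bob", 95)
board.add_score("carol", 150)
print(board.top(2))  # [('carol', 150), ('alice', 120)]
```

Because Redis keeps the sorted set in memory and ordered, reading the top N is fast even with millions of members, which is why it is the go-to pattern for gaming leaderboards.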
2. Amazon ElastiCache for Memcached
Memcached is simpler than Redis. It is also an in-memory key-value store, but it is volatile: data is lost if a node restarts, and it offers no persistence or replication.
Key features:
- High-speed caching: Excellent for read-heavy workloads.
- Simple architecture: Easy to set up and scale horizontally.
- Scalability: Can add more nodes to increase cache capacity.
Use cases in IT:
- Object caching: Cache API responses or database query results.
- Temporary storage: Keep temporary data that can be regenerated if lost.
Caching Strategies
- Lazy Loading (On-Demand)
  - Data is cached only when it is requested.
  - If data isn’t in the cache, it’s retrieved from the database and stored in the cache.
  - Pros: efficient; only caches what’s needed.
  - Cons: the first request is slower (cache miss).
- Write-Through
  - Every write to the database also updates the cache.
  - Pros: the cache is always up to date.
  - Cons: slightly slower writes; more complex.
- Write-Around
  - Writes go directly to the database, bypassing the cache.
  - The cache is populated only when data is read (lazy loading).
  - Pros: avoids caching data that is rarely read.
  - Cons: initial reads may be slower (cache miss).
- Write-Back (Write-Behind)
  - Writes go to the cache first; the database is updated later, asynchronously.
  - Pros: fast writes; reduces database load.
  - Cons: risk of data loss if the cache fails before the database is updated.
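The two strategies the exam asks about most often, lazy loading and write-through, can be sketched in a few lines. Here a dict stands in for the database (RDS/DynamoDB) and another dict for ElastiCache; the function names are illustrative, not an AWS API.

```python
database = {"user:1": "Ada"}  # stand-in for RDS/DynamoDB
cache = {}                    # stand-in for ElastiCache
db_reads = 0                  # counts how often we hit the "database"

def get_lazy(key):
    """Lazy loading: check the cache first; on a miss, read the
    database and populate the cache for next time."""
    global db_reads
    if key in cache:
        return cache[key]      # cache hit
    db_reads += 1
    value = database[key]      # cache miss: hit the database
    cache[key] = value
    return value

def put_write_through(key, value):
    """Write-through: every write updates the database AND the cache,
    so reads never see stale data (at the cost of slower writes)."""
    database[key] = value
    cache[key] = value

get_lazy("user:1")   # miss -> 1 database read
get_lazy("user:1")   # hit  -> still only 1 database read
print(db_reads)      # 1
put_write_through("user:2", "Grace")
print(get_lazy("user:2"))  # 'Grace', served from cache, no extra DB read
```

Write-around is `put_write_through` without the cache update, and write-back would buffer the write in the cache and flush to the database asynchronously; the trade-offs listed above fall straight out of those variations.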
ElastiCache Deployment Considerations
When using ElastiCache, consider:
- Cluster mode: Redis can be clustered for scaling horizontally.
- Multi-AZ deployment: For high availability, enable automatic failover.
- Node type and size: Choose based on memory and throughput requirements.
- TTL (Time to Live): expire cached entries so stale data is not served indefinitely.
- Eviction policies: define what happens when the cache is full (for Redis, e.g. allkeys-lru, volatile-lru, noeviction; LRU = Least Recently Used).
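LRU eviction, the behavior behind Redis's allkeys-lru policy, is easy to model with Python's `collections.OrderedDict`: every access moves a key to the "recent" end, and a full cache drops the key at the "old" end. A minimal sketch (the class name is illustrative):

```python
from collections import OrderedDict

class LRUCache:
    """When the cache is full, evict the least recently used key,
    mirroring an allkeys-lru eviction policy."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def set(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used

lru = LRUCache(capacity=2)
lru.set("a", 1)
lru.set("b", 2)
lru.get("a")         # 'a' is now the most recently used key
lru.set("c", 3)      # cache is full: 'b' is evicted
print(lru.get("b"))  # None
print(lru.get("a"))  # 1
```

Combined with TTLs, this is how a fixed-size cache keeps hot data in memory while bounding its footprint.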
AWS Exam Tips
- Know the difference between Redis and Memcached:
  - Redis = advanced features, persistent, supports data structures.
  - Memcached = simple, volatile, fast, good for horizontal scaling.
- Understand caching strategies: Lazy loading, write-through, write-around, write-back. The exam may give a scenario and ask which strategy fits best.
- Performance impact: Caching is used to reduce database load and improve application speed.
- Integration with other AWS services: ElastiCache is often used with:
  - Amazon RDS or DynamoDB for database query caching.
  - API Gateway / Lambda for serverless apps needing fast data access.
- High availability and scaling: Know Multi-AZ, replication, and cluster mode for Redis.
Quick Comparison Table for Exam
| Feature | Redis | Memcached |
|---|---|---|
| Persistence | Yes | No |
| Advanced data structures | Yes | No |
| Scaling | Horizontal (cluster) | Horizontal |
| High availability | Multi-AZ & failover | No built-in failover |
| Use case | Session storage, leaderboards, pub/sub | Simple cache for queries |
✅ Summary:
Caching with ElastiCache improves performance by reducing database load and latency. Redis is feature-rich, persistent, and supports advanced use cases. Memcached is simpler, fast, and suitable for temporary caching. Choosing the right caching strategy and deployment design is critical for high-performing database solutions in AWS.
