Task Statement 3.3: Determine high-performing database solutions.
📘 AWS Certified Solutions Architect – Associate (SAA-C03)
1. What is Caching?
Caching means storing frequently accessed data in a fast storage layer so that future requests can be served quickly without repeatedly accessing the main database.
Key Idea:
- Cache = temporary, fast-access storage
- Reduces:
- Database load
- Response time
- Cost (fewer database reads)
2. Why Caching is Important
Benefits:
- Improves performance
- Data is retrieved faster from memory than from disk-based databases
- Reduces latency
- Especially important for user-facing applications
- Reduces backend load
- Fewer queries sent to databases
- Improves scalability
- System can handle more requests
- Cost optimization
- Less database usage = lower cost
3. Types of Caching in AWS
You should understand different caching layers:
A. Application-Level Caching
Caching is implemented inside the application logic.
Example:
- Store frequently accessed query results in memory
- Use libraries like Redis clients
Characteristics:
- Full control over caching logic
- Requires developer effort
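As a minimal sketch of application-level caching, Python's built-in `functools.lru_cache` memoizes results in process memory; the `fetch_user` function and its dataset below are hypothetical stand-ins for a real database query.

```python
from functools import lru_cache

# Hypothetical "slow" data source standing in for a database.
_USERS = {1: "alice", 2: "bob"}

@lru_cache(maxsize=128)  # keeps up to 128 results in process memory
def fetch_user(user_id: int) -> str:
    # In a real application this would be a database query.
    return _USERS[user_id]

fetch_user(1)                        # miss: computed and cached
fetch_user(1)                        # hit: served from memory
print(fetch_user.cache_info().hits)  # → 1
```

This is the "full control, developer effort" trade-off in miniature: the cache size, key (the function arguments), and eviction policy (LRU) are all chosen in code.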
B. Database Caching (In-Memory Caching)
AWS provides managed caching services:
1. Amazon ElastiCache
A fully managed in-memory cache service supporting:
- Redis
- Memcached
Features:
- Sub-millisecond latency
- Highly scalable
- Fully managed (patching, backups, monitoring)
2. Amazon DynamoDB Accelerator (DAX)
- Purpose-built cache for DynamoDB
- Provides fast reads (microseconds)
Features:
- Fully managed
- Minimal application changes required (DAX is API-compatible with DynamoDB)
- Write-through caching
C. Content Delivery Caching
Amazon CloudFront
- Caches content at edge locations
- Used for:
- Static files (images, JS, CSS)
- API responses
Benefits:
- Reduces latency globally
- Reduces load on origin servers
4. ElastiCache Deep Dive
A. Redis vs Memcached
| Feature | Redis | Memcached |
|---|---|---|
| Data Types | Advanced (strings, lists, sets) | Simple key-value |
| Persistence | Yes | No |
| Replication | Yes | No |
| Scaling | Vertical + Horizontal | Horizontal only |
| Use Case | Complex caching | Simple caching |
B. Redis Use Cases
- Session storage
- Leaderboards
- Real-time analytics
- Caching database queries
C. Memcached Use Cases
- Simple key-value caching
- Stateless applications
- High-speed read caching
5. Caching Strategies (Very Important for Exam)
A. Cache-Aside (Lazy Loading)
How it works:
- Application checks cache
- If data exists → return it
- If not:
- Fetch from database
- Store in cache
- Return data
Advantages:
- Only caches needed data
- Simple to implement
Disadvantages:
- First request is slow (cache miss)
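The steps above can be sketched in plain Python; the dictionaries stand in for a real cache and database, and the keys are hypothetical.

```python
# Hypothetical backing store and cache; in practice these would be
# a database and an ElastiCache cluster.
DATABASE = {"user:1": "alice", "user:2": "bob"}
cache: dict[str, str] = {}

def get(key: str) -> str:
    # 1. Check the cache first.
    if key in cache:
        return cache[key]        # cache hit
    # 2. Cache miss: fetch from the database...
    value = DATABASE[key]
    # 3. ...store it in the cache for next time...
    cache[key] = value
    # 4. ...and return it.
    return value

print(get("user:1"))  # miss: loaded from DATABASE, then cached
print(get("user:1"))  # hit: served from the cache
```

Note that only data that is actually requested ever enters the cache, which is why this strategy is called lazy loading.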
B. Write-Through
How it works:
- Data is written to the cache
- The write is propagated synchronously to the database
Advantages:
- Cache always up-to-date
Disadvantages:
- Writes are slower
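A minimal write-through sketch, again using dictionaries as hypothetical stand-ins for the cache and database:

```python
DATABASE: dict[str, str] = {}
cache: dict[str, str] = {}

def put(key: str, value: str) -> None:
    # Write-through: update the cache and the database in the same
    # operation, so both stay consistent (at the cost of slower writes).
    cache[key] = value
    DATABASE[key] = value

put("user:1", "alice")
print(cache["user:1"], DATABASE["user:1"])  # both hold "alice"
```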
C. Write-Behind (Write-Back)
How it works:
- Write to cache first
- Cache updates database later (async)
Advantages:
- High write performance
Disadvantages:
- Risk of data loss
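A write-behind sketch under the same hypothetical setup; here the deferred database write is modeled as an explicit `flush()` rather than a true background task.

```python
DATABASE: dict[str, str] = {}
cache: dict[str, str] = {}
pending: list[tuple[str, str]] = []  # writes not yet persisted

def put(key: str, value: str) -> None:
    # Write to the cache immediately; queue the database write.
    cache[key] = value
    pending.append((key, value))

def flush() -> None:
    # Persist queued writes later (in real systems, asynchronously).
    while pending:
        key, value = pending.pop(0)
        DATABASE[key] = value

put("user:1", "alice")
# If the cache node fails before flush() runs, the queued write is lost —
# this is the data-loss risk of write-behind.
flush()
```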
D. Read-Through
- Cache automatically fetches data from database
- Application interacts only with cache
6. Cache Invalidation (Critical Concept)
Caching problems often come from stale data.
Methods:
A. Time-to-Live (TTL)
- Cache expires after a fixed time
B. Manual Invalidation
- Application removes/updates cache when data changes
C. Event-Based Invalidation
- Cache updated when database changes
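TTL-based expiry can be sketched in a few lines of Python; the short 0.1-second TTL is chosen only to make expiry observable, and the keys are hypothetical.

```python
import time

# key -> (value, expiry timestamp)
cache: dict[str, tuple[str, float]] = {}
TTL_SECONDS = 0.1  # deliberately short so expiry is visible in the demo

def put(key: str, value: str) -> None:
    cache[key] = (value, time.monotonic() + TTL_SECONDS)

def get(key: str):
    entry = cache.get(key)
    if entry is None:
        return None
    value, expires_at = entry
    if time.monotonic() >= expires_at:
        del cache[key]   # expired: invalidate on read
        return None
    return value

put("user:1", "alice")
print(get("user:1"))     # "alice" while fresh
time.sleep(0.2)
print(get("user:1"))     # None after the TTL elapses
```

Managed caches implement the same idea natively; for example, Redis supports per-key expiry via the `EXPIRE` command.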
7. Choosing the Right Caching Solution
Use ElastiCache when:
- You need low-latency access to frequently used data
- You want to reduce database load
- You need advanced data structures (Redis)
Use DAX when:
- You are using DynamoDB
- You need faster read performance
- Minimal code changes required
Use CloudFront when:
- You want to cache:
- Static content
- API responses globally
8. When to Use Caching (Exam Focus)
Use caching when:
- Read-heavy workloads
- Frequent repeated queries
- Performance is critical
- Database is a bottleneck
9. When NOT to Use Caching
Avoid caching when:
- Data changes very frequently
- Strong consistency is required
- Data must always be real-time
10. High Availability and Scaling
ElastiCache Redis:
- Multi-AZ with replication
- Automatic failover
Memcached:
- No replication
- Must handle failures manually
CloudFront:
- Global edge locations
- Highly available by design
11. Security Considerations
- Use VPC for ElastiCache
- Enable encryption (in transit & at rest)
- Use IAM and security groups
- Restrict access to cache clusters
12. Monitoring and Optimization
Use:
- CloudWatch metrics:
- Cache hits
- Cache misses
- Latency
Important Metric:
- Cache Hit Ratio
- High = good performance
- Low = ineffective caching
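The hit ratio is simply hits divided by total lookups; a one-line sketch:

```python
def cache_hit_ratio(hits: int, misses: int) -> float:
    # Hit ratio = hits / (hits + misses); higher means the cache is
    # serving more requests and the database is doing less work.
    total = hits + misses
    return hits / total if total else 0.0

print(cache_hit_ratio(90, 10))  # → 0.9 (90% of lookups served from cache)
```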
13. Common Exam Scenarios
Scenario 1:
Application is slow due to repeated database queries
→ Use ElastiCache
Scenario 2:
DynamoDB performance needs improvement
→ Use DAX
Scenario 3:
Global users accessing static content
→ Use CloudFront
Scenario 4:
Need session storage for web app
→ Use Redis (ElastiCache)
14. Key Exam Tips
- ElastiCache = general caching (Redis/Memcached)
- DAX = only for DynamoDB
- CloudFront = edge caching
- TTL helps avoid stale data
- Cache-aside is most common
- Redis supports replication, Memcached does not
Final Summary
Caching is essential for designing high-performance, scalable, and cost-efficient architectures in AWS.
To pass the exam, remember:
- Different types of caching layers
- AWS services used for caching
- Caching strategies
- When to use or avoid caching
- Trade-offs (performance vs consistency)
