Imagine you’re a librarian. Every time someone asks for a book, you could walk to the massive warehouse (database) to find it. Or, you could keep the 100 most popular books on a cart right next to you (cache). When someone asks for a popular book, you grab it instantly. That’s caching.
Caching is storing frequently accessed data in fast storage (usually memory) to avoid slow operations like database queries or external API calls.
| Problem | How Caching Solves It |
|---|---|
| Slow Database Queries | Cache stores results, avoiding repeated queries |
| High Database Load | Serves repeated reads from cache, often cutting database requests by 90%+ |
| Expensive External APIs | Cache API responses, avoid rate limits |
| Repeated Computations | Cache expensive calculation results |
| Geographic Latency | Cache data closer to users (CDN) |
There are four main ways to integrate caching into your application. Each has different trade-offs.
Cache-Aside
The most common pattern. Your application manages the cache directly.
How it works: the application checks the cache first; on a miss it reads the database itself, then writes the result into the cache for next time.
When to use: the default choice for most applications, especially read-heavy workloads that can tolerate briefly stale data.
Trade-offs: maximum control, and a cache outage degrades to plain database reads; in exchange, every caller repeats the cache logic, and data can be stale until the entry expires or is invalidated.
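In code, the cache-aside read path is only a few lines. A minimal sketch in Python (the `db_fetch` callable and the key format here are illustrative, not from any particular library):

```python
def get_user(user_id, cache, db_fetch):
    """Cache-aside: the application owns both the cache check and the DB call."""
    key = f"user:{user_id}"

    # 1. Check the cache first
    user = cache.get(key)
    if user is not None:
        return user

    # 2. Miss: the application itself queries the database
    user = db_fetch(user_id)

    # 3. Populate the cache so the next read is a hit
    if user is not None:
        cache[key] = user
    return user
```

Note that the cache (here just a dict) never talks to the database itself; that responsibility stays in application code, which is exactly what distinguishes cache-aside from read-through.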
Read-Through
The cache acts as a proxy. Your application only talks to the cache; the cache handles database access.
How it works: on a miss, the cache itself loads the value from the database, stores it, and returns it to the caller.
When to use: read-heavy applications where you want caching logic kept out of application code.
Trade-offs: simpler application code and one place to change caching behavior, but you need a cache library or provider that supports loaders, and the first request for each key still pays the database round trip.
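One way to sketch read-through in Python (the `loader` callable is an assumption standing in for whatever loader a real read-through cache is configured with):

```python
class ReadThroughCache:
    """Read-through: callers only see get(); the cache calls the loader on a miss."""

    def __init__(self, loader):
        self.loader = loader  # e.g. a function that runs the DB query
        self._store = {}

    def get(self, key):
        if key not in self._store:
            # The cache, not the application, fetches from the source
            self._store[key] = self.loader(key)
        return self._store[key]
```

Compare this with cache-aside: the application code shrinks to a single `get()` call, because the miss handling moved inside the cache.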
Write-Through
Writes go to both cache and database as one operation, ensuring they stay in sync.
How it works: the write is applied to the database and the cache together, and is only acknowledged once both succeed.
When to use: critical data where reads must never see values the database does not have.
Trade-offs: strong consistency between cache and database, but every write pays database latency, and you may cache entries that are never read.
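A minimal write-through sketch (the `db_write` callable is an assumption; the key property shown is that the cache is only updated after the database write succeeds):

```python
class WriteThroughCache:
    """Write-through: a write is acknowledged only after the DB write succeeds."""

    def __init__(self, db_write):
        self.db_write = db_write  # persists (key, value); raises on failure
        self._store = {}

    def put(self, key, value):
        # Database first: if this raises, the cache never gets ahead of the DB
        self.db_write(key, value)
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)
```

Ordering is the whole point here: writing the database before the cache means a failed write leaves the cache consistent, at the cost of every `put()` waiting on the database.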
Write-Behind
Write to cache immediately; the database write happens later. Fastest writes, but risky.
How it works: writes are buffered in the cache and flushed to the database asynchronously, often batched or on a timer.
When to use: very high write volume where losing the most recently buffered writes in a crash is acceptable.
Trade-offs: the lowest write latency and fewer database round trips via batching, but a cache failure loses unflushed writes, and consistency is only eventual; this is the most complex pattern to operate.
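A toy sketch of the idea. To keep it deterministic, flushing is an explicit call here; a real write-behind cache drains the buffer from a background worker or timer:

```python
class WriteBehindCache:
    """Write-behind: writes land in memory instantly and are persisted later."""

    def __init__(self, db_write):
        self.db_write = db_write  # persists (key, value) to the database
        self._store = {}
        self._pending = []        # buffered writes awaiting persistence

    def put(self, key, value):
        # Acknowledged immediately; the database has NOT seen this write yet
        self._store[key] = value
        self._pending.append((key, value))

    def flush(self):
        # In production this runs on a timer or background thread; a crash
        # before flush loses the buffered writes (the risk described above)
        while self._pending:
            key, value = self._pending.pop(0)
            self.db_write(key, value)
```

The window between `put()` and `flush()` is exactly where the risk lives: the cache says the write succeeded while the database still has the old value.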
At the code level, these caching patterns map onto two familiar techniques: the decorator pattern and repository abstractions.
The decorator pattern is perfect for adding caching to existing repositories:
```python
from functools import wraps
from typing import Callable, Any
import time

class CacheDecorator:
    def __init__(self, cache: dict, ttl: int = 300):
        self.cache = cache
        self.ttl = ttl  # Time to live in seconds

    def __call__(self, func: Callable) -> Callable:
        @wraps(func)
        def wrapper(*args, **kwargs):
            # Create cache key from function args
            cache_key = f"{func.__name__}:{args}:{kwargs}"

            # Check cache (cache-aside pattern)
            if cache_key in self.cache:
                cached_data, timestamp = self.cache[cache_key]
                if time.time() - timestamp < self.ttl:
                    return cached_data

            # Cache miss - fetch from source
            result = func(*args, **kwargs)

            # Store in cache
            self.cache[cache_key] = (result, time.time())
            return result

        return wrapper

# Usage
cache = {}

@CacheDecorator(cache, ttl=300)
def get_user(user_id: int):
    # Simulate database query
    return {"id": user_id, "name": "John"}
```

The same decorator in Java:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class CacheDecorator<T, R> {
    private final Map<String, CacheEntry<R>> cache;
    private final long ttlMillis;

    public CacheDecorator(long ttlSeconds) {
        this.cache = new ConcurrentHashMap<>();
        this.ttlMillis = ttlSeconds * 1000;
    }

    public R apply(String key, Function<T, R> function, T input) {
        // Check cache (cache-aside pattern)
        CacheEntry<R> entry = cache.get(key);
        if (entry != null && !entry.isExpired(ttlMillis)) {
            return entry.value;
        }

        // Cache miss - fetch from source
        R result = function.apply(input);

        // Store in cache
        cache.put(key, new CacheEntry<>(result, System.currentTimeMillis()));
        return result;
    }

    private static class CacheEntry<R> {
        final R value;
        final long timestamp;

        CacheEntry(R value, long timestamp) {
            this.value = value;
            this.timestamp = timestamp;
        }

        // A static nested class cannot reach the outer instance's ttlMillis,
        // so the TTL is passed in as a parameter
        boolean isExpired(long ttlMillis) {
            return System.currentTimeMillis() - timestamp > ttlMillis;
        }
    }
}
```

A more complete example showing cache-aside in a repository:
```python
from abc import ABC, abstractmethod
from typing import Optional

class UserRepository(ABC):
    @abstractmethod
    def get_user(self, user_id: int) -> Optional[dict]:
        pass

class DatabaseUserRepository(UserRepository):
    def get_user(self, user_id: int) -> Optional[dict]:
        # Simulate database query
        return {"id": user_id, "name": "John"}

class CachedUserRepository(UserRepository):
    def __init__(self, db_repo: UserRepository, cache: dict):
        self.db_repo = db_repo
        self.cache = cache

    def get_user(self, user_id: int) -> Optional[dict]:
        # Cache-aside pattern
        cache_key = f"user:{user_id}"

        # Check cache first
        if cache_key in self.cache:
            return self.cache[cache_key]

        # Cache miss - fetch from DB
        user = self.db_repo.get_user(user_id)

        # Store in cache
        if user:
            self.cache[cache_key] = user

        return user
```

And the same repository in Java:

```java
import java.util.Optional;
import java.util.Map;

// Minimal User type so the example is self-contained (Java 16+ record)
record User(int id, String name) {}

public interface UserRepository {
    Optional<User> getUser(int userId);
}

class DatabaseUserRepository implements UserRepository {
    public Optional<User> getUser(int userId) {
        // Simulate database query
        return Optional.of(new User(userId, "John"));
    }
}

class CachedUserRepository implements UserRepository {
    private final UserRepository dbRepo;
    private final Map<String, User> cache;

    public CachedUserRepository(UserRepository dbRepo, Map<String, User> cache) {
        this.dbRepo = dbRepo;
        this.cache = cache;
    }

    public Optional<User> getUser(int userId) {
        // Cache-aside pattern
        String cacheKey = "user:" + userId;

        // Check cache first
        if (cache.containsKey(cacheKey)) {
            return Optional.of(cache.get(cacheKey));
        }

        // Cache miss - fetch from DB
        Optional<User> user = dbRepo.getUser(userId);

        // Store in cache
        user.ifPresent(u -> cache.put(cacheKey, u));

        return user;
    }
}
```

| Pattern | Read Latency | Write Latency | Consistency | Complexity | Use Case |
|---|---|---|---|---|---|
| Cache-Aside | Low (cache hit) | Low | Eventual | Medium | Most applications |
| Read-Through | Low (cache hit) | Low | Eventual | Low | Read-heavy apps |
| Write-Through | Low (cache hit) | High (waits for DB) | Strong | Medium | Critical data |
| Write-Behind | Low (cache hit) | Very Low | Eventual | High | High write volume |
🎯 Cache-Aside is King
Most applications use cache-aside. It’s flexible, understandable, and gives you control.
⚡ Speed Matters
In-memory cache lookups are often 100x or more faster than database queries. At scale, that difference is massive.
🔄 Consistency Trade-offs
Faster writes (write-behind) = weaker consistency. Stronger consistency (write-through) = slower writes.
🏗️ Decorator Pattern
Use decorator pattern in code to add caching transparently to existing repositories.