
Design a Cache Manager

Design a high-performance caching engine with pluggable eviction policies and TTL.

Design and implement an in-memory cache manager system that supports multiple eviction policies (LRU/LFU), time-to-live (TTL) expiration for cache entries, and thread-safe operations. The system should efficiently manage cache capacity, handle concurrent access, and provide flexible eviction strategies for different use cases.

In this problem, you’ll design a generic, extensible framework where rules like LRU or LFU can be swapped at runtime, and entries can expire independently based on TTL.


Design a production-grade, in-memory cache that efficiently manages its capacity using swappable eviction policies and handles automatic cleanup of expired data.

Functional Requirements:

  • Basic Ops: Support get(key), put(key, value, ttl), and delete(key).
  • Multiple Policies: Implement both Least Recently Used (LRU) and Least Frequently Used (LFU).
  • TTL Support: Support optional expiration times per entry.
  • Capacity Management: Automatically evict items when the configurable limit is reached.
  • Invalidation Events: Allow observers to listen for eviction or expiration events.
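The three basic operations plus per-entry TTL can be sketched in Python as a minimal illustration (a plain dict with no capacity limit, eviction policy, or locking yet; names like `SimpleCache` are illustrative, not a prescribed interface):

```python
import time

class SimpleCache:
    """Minimal sketch of get/put/delete with optional TTL.
    Capacity management and eviction are deliberately omitted here."""

    def __init__(self):
        self._store = {}  # key -> (value, absolute expiry deadline or None)

    def put(self, key, value, ttl=None):
        # Store an absolute deadline on a monotonic clock so wall-clock
        # adjustments cannot un-expire or prematurely expire entries.
        expiry = time.monotonic() + ttl if ttl is not None else None
        self._store[key] = (value, expiry)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expiry = entry
        if expiry is not None and time.monotonic() >= expiry:
            del self._store[key]  # lazy deletion of an expired entry
            return None
        return value

    def delete(self, key):
        self._store.pop(key, None)
```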

Non-Functional Requirements:

  • Constant Time: Core operations (get, put) should run in average-case $O(1)$ time.
  • Thread Safety: Support heavy concurrent access without data corruption or race conditions.
  • Scalability: The design should support multiple independent cache instances.
  • Memory Management: Ensure expired items don’t leak memory.

The system uses a CacheManager to coordinate between the Storage, EvictionStrategy, and CleanupService.

Diagram
classDiagram
    class CacheManager {
        -Map~K, CacheEntry~ store
        -EvictionStrategy strategy
        -List~Observer~ observers
        +get(key)
        +put(key, value, ttl)
    }
    
    class EvictionStrategy {
        <<interface>>
        +onAccess(key)
        +onUpdate(key)
        +evict() K
    }
    
    class LRUStrategy {
        -DoublyLinkedList list
        +evict() K
    }

    class CacheEntry {
        -V value
        -long expiryTime
        -long frequency
        +isExpired() bool
    }

    CacheManager --> EvictionStrategy
    CacheManager "1" *-- "many" CacheEntry
    EvictionStrategy <|-- LRUStrategy
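The CacheEntry in the diagram maps naturally onto a small data class. This Python sketch keeps the diagram's fields (value, expiry time, frequency) and uses a monotonic clock for the TTL deadline; the exact field types are an assumption:

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class CacheEntry:
    """One stored value plus the metadata the strategies need."""
    value: object
    expiry_time: Optional[float] = None  # absolute monotonic deadline; None = no TTL
    frequency: int = 0                   # access count, consulted by LFU

    def is_expired(self) -> bool:
        # An entry with no TTL never expires.
        return self.expiry_time is not None and time.monotonic() >= self.expiry_time
```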


LRU needs a Doubly Linked List to track access order, while LFU needs a frequency map or priority queue. Implementing both in the same class leads to a “God Object.”

Solution: Use the Strategy Pattern. The CacheManager doesn’t know how eviction happens; it just tells the strategy onAccess(key) or onUpdate(key). This allows you to plug in an LRUStrategy or LFUStrategy at initialization.
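A minimal Python sketch of this Strategy setup, assuming the method names from the diagram (`on_access`, `on_update`, `evict`). An `OrderedDict` stands in for the doubly linked list, and the LFU eviction here is a naive O(n) scan for brevity (a frequency-bucket list would make it O(1)):

```python
from abc import ABC, abstractmethod
from collections import OrderedDict

class EvictionStrategy(ABC):
    """The manager only reports events; the strategy picks the victim."""
    @abstractmethod
    def on_access(self, key): ...
    @abstractmethod
    def on_update(self, key): ...
    @abstractmethod
    def evict(self): ...

class LRUStrategy(EvictionStrategy):
    def __init__(self):
        self._order = OrderedDict()  # insertion order doubles as recency order

    def on_access(self, key):
        if key in self._order:
            self._order.move_to_end(key)  # most recently used goes to the back

    def on_update(self, key):
        self._order[key] = True
        self._order.move_to_end(key)

    def evict(self):
        key, _ = self._order.popitem(last=False)  # front = least recently used
        return key

class LFUStrategy(EvictionStrategy):
    def __init__(self):
        self._freq = {}  # key -> access count

    def on_access(self, key):
        if key in self._freq:
            self._freq[key] += 1

    def on_update(self, key):
        self._freq.setdefault(key, 0)

    def evict(self):
        key = min(self._freq, key=self._freq.get)  # O(n) scan, fine for a sketch
        del self._freq[key]
        return key
```

Because the manager only calls the interface, swapping `LRUStrategy()` for `LFUStrategy()` at construction time changes the eviction behavior without touching the manager's code.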

Checking for expiration only when an item is read (“Lazy Deletion”) keeps get results accurate, but entries that are never accessed again are never removed and quietly consume memory.

Solution: Hybrid Cleanup. Use Lazy Deletion for accuracy during get operations, but also implement a background Scheduled Executor that periodically sweeps the ExpiryMap to prune stale entries and free up memory.
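The background half of the hybrid approach can be sketched in Python with a daemon thread standing in for a scheduled executor; the store layout (`key -> (value, expiry)`) and the sweep interval are illustrative assumptions:

```python
import threading
import time

class CleanupService:
    """Periodically sweeps a shared store and prunes expired entries.

    `store` maps key -> (value, expiry_time or None); `lock` guards it
    against the cache's own readers and writers."""

    def __init__(self, store, lock, interval=1.0):
        self._store, self._lock, self._interval = store, lock, interval
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()

    def _run(self):
        # Event.wait doubles as the sleep and the shutdown signal.
        while not self._stop.wait(self._interval):
            now = time.monotonic()
            with self._lock:
                stale = [k for k, (_, exp) in self._store.items()
                         if exp is not None and exp <= now]
                for k in stale:
                    del self._store[k]
```

Lazy deletion still guards individual `get` calls; the sweeper only exists to reclaim memory from entries nobody asks for anymore.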

A global lock on the CacheManager will throttle performance in multi-core environments.

Solution: Use Striped Locking (similar to ConcurrentHashMap) or Read-Write Locks. Since cache reads usually vastly outnumber writes, a ReentrantReadWriteLock allows multiple threads to get simultaneously while ensuring exclusive access for put or evict.
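Python's standard library has no direct counterpart to Java's ReentrantReadWriteLock, but the read-write idea can be sketched with a condition variable. This illustrative version favors readers, so a constant stream of reads can starve writers; production code would add writer preference or use striping:

```python
import threading

class ReadWriteLock:
    """Many readers may hold the lock at once; a writer gets exclusive access
    only after every reader has released. Reader-preferring, for brevity."""

    def __init__(self):
        self._readers = 0
        self._cond = threading.Condition()

    def acquire_read(self):
        with self._cond:          # briefly take the mutex to bump the count
            self._readers += 1

    def release_read(self):
        with self._cond:
            self._readers -= 1
            if self._readers == 0:
                self._cond.notify_all()  # wake any waiting writer

    def acquire_write(self):
        self._cond.acquire()      # hold the underlying mutex exclusively
        while self._readers > 0:  # wait() releases it so readers can drain
            self._cond.wait()

    def release_write(self):
        self._cond.release()
```

Wrapping `get` in the read lock and `put`/`evict` in the write lock matches the read-heavy profile described above: reads proceed in parallel, and only mutations serialize.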


By solving this problem, you’ll master:

  • Strategy Pattern - Designing for algorithmic flexibility.
  • Concurrency Paradigms - Optimizing for read-heavy workloads.
  • Resource Management - Handling automatic memory reclamation.
  • Decoupled Architecture - Using Observers to broadcast system events.

Ready to see the full implementation? Open the interactive playground to access:

  • 🎯 Step-by-step guidance through the 8-step LLD approach
  • 📊 Interactive UML builder to visualize your design
  • 💻 Complete Code Solutions in Python, Java, C++, TypeScript, JavaScript, C#
  • 🤖 AI-powered review of your design and code

After mastering the Cache Manager, try these similar problems:

  • LRU Cache - A deep dive into the specific LRU structure.
  • Hash Map - Building the underlying data storage.
  • Rate Limiter - Using time-windows and counters for flow control.