
Design LRU Cache

Design and implement a Least Recently Used (LRU) Cache data structure that supports get and put operations in O(1) time complexity. The cache should maintain a fixed capacity and evict the least recently used item when the capacity is exceeded. The implementation should combine a HashMap for O(1) lookups and a Doubly Linked List for O(1) insertion and deletion, and it should remain correct when accessed by multiple threads.

This problem is a staple in interviews because it requires deep knowledge of how pointers and hash tables work together to manage memory and access patterns.


Design a cache that stores a limited number of items. When the capacity is reached, it should automatically remove the item that hasn’t been used for the longest time.

Functional Requirements:

  • Get Operation: Retrieve the value associated with a key in O(1) time. Mark the accessed item as most recently used. Return -1 if the key is not found.
  • Put Operation: Insert or update a key-value pair in O(1) time. Mark the item as most recently used.
  • Eviction Logic: When at capacity, evict the least recently used item (the tail of the list) before adding a new item; a short behavior trace after these requirements illustrates this.
  • Fixed Capacity: Maintain a fixed, positive capacity specified during initialization.
  • Order Tracking: Use a Doubly Linked List to maintain access order (Head = Most Recent, Tail = Least Recent).
  • Data Combination: Use a HashMap to map keys to Doubly Linked List nodes for O(1) access.

Non-Functional Requirements:

  • Time Complexity: Both get and put must execute in O(1) time complexity on average.
  • Thread Safety: Ensure the cache works correctly when accessed by multiple threads using appropriate synchronization.
  • Memory Efficiency: Manage memory efficiently by properly linking and unlinking nodes.
  • Modularity: Design the system to be extensible for different eviction policies or integration into larger frameworks.
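
To make these requirements concrete, here is a short behavior trace in Java. It is only a sketch: it borrows the JDK's LinkedHashMap in access-order mode as a stand-in cache (the class name LruBehaviorDemo and the capacity of 2 are illustrative choices), so you can observe the required hit, miss (-1), and eviction behavior before building the hand-rolled structure described below.

import java.util.LinkedHashMap;
import java.util.Map;

// Behavior sketch only: an access-ordered LinkedHashMap stands in for the
// cache so the required get/put/eviction semantics can be observed.
public class LruBehaviorDemo {
    public static void main(String[] args) {
        final int capacity = 2;
        Map<Integer, Integer> cache = new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                return size() > capacity; // evict once the fixed capacity is exceeded
            }
        };

        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.getOrDefault(1, -1)); // 1  -> key 1 is now most recently used
        cache.put(3, 3);                               // at capacity: evicts key 2, the LRU entry
        System.out.println(cache.getOrDefault(2, -1)); // -1 -> key 2 was evicted
        System.out.println(cache.getOrDefault(3, -1)); // 3
    }
}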

To achieve O(1) for both operations, you must combine a HashMap (for fast lookup) and a Doubly Linked List (for fast reordering).

Diagram
classDiagram
    class LRUCache {
        -int capacity
        -Map~int, Node~ map
        -Node head
        -Node tail
        +get(key) int
        +put(key, value) void
        -moveToHead(node)
        -removeNode(node)
    }
    
    class Node {
        -int key
        -int value
        -Node prev
        -Node next
    }

    LRUCache o-- Node

  • A HashMap gives O(1) lookup but has no concept of order.
  • A Doubly Linked List allows O(1) insertion and deletion if you already hold a reference to the node, and it maintains order.
  • Together, the HashMap points directly to the nodes in the list, allowing you to jump to any node, remove it, and move it to the head in constant time.

Managing the head and tail can be error-prone, especially with 0 or 1 items.

Solution: Use Dummy Head and Tail nodes. By initializing the list with two empty nodes connected to each other, you eliminate null checks and simplify the add and remove logic.
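
Putting the map, the list, and the sentinels together, here is a minimal single-threaded sketch, assuming int keys and values as in the class diagram above (one reasonable shape for the design, not the only one). The HashMap jumps straight to a node, the doubly linked list keeps recency order, and the dummy head and tail mean removeNode and addToHead never need null checks.

import java.util.HashMap;
import java.util.Map;

// Minimal single-threaded sketch of the HashMap + doubly linked list design.
// Head side = most recently used, tail side = least recently used.
public class LRUCache {
    private static final class Node {
        int key, value;
        Node prev, next;
        Node(int key, int value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<Integer, Node> map = new HashMap<>();
    private final Node head = new Node(0, 0); // dummy sentinel: never removed
    private final Node tail = new Node(0, 0); // dummy sentinel: never removed

    public LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        Node node = map.get(key);
        if (node == null) return -1;       // miss
        moveToHead(node);                  // refresh recency
        return node.value;
    }

    public void put(int key, int value) {
        Node node = map.get(key);
        if (node != null) {                // update existing entry
            node.value = value;
            moveToHead(node);
            return;
        }
        if (map.size() == capacity) {      // evict least recently used (node before tail)
            Node lru = tail.prev;
            removeNode(lru);
            map.remove(lru.key);
        }
        Node fresh = new Node(key, value);
        map.put(key, fresh);
        addToHead(fresh);
    }

    private void addToHead(Node node) {    // splice in right after the dummy head
        node.prev = head;
        node.next = head.next;
        head.next.prev = node;
        head.next = node;
    }

    private void removeNode(Node node) {   // unlink in O(1) given the node reference
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void moveToHead(Node node) {
        removeNode(node);
        addToHead(node);
    }
}

Run against the trace from the requirements section (capacity 2: put 1, put 2, get 1, put 3), this sketch evicts key 2, because get(1) refreshed key 1's recency.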

A plain HashMap is not thread-safe. If two threads call put at the same time, the pointers in the Linked List can become corrupted.

Solution: Use Synchronization. Wrap the get and put methods in a synchronized block or use a ReentrantLock. Alternatively, for higher performance, use a ConcurrentHashMap combined with fine-grained locking on the nodes, though a global lock is usually sufficient for this interview problem.
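
As a sketch of the coarse-grained option, the wrapper below serializes every operation with a single ReentrantLock. The class name ThreadSafeLRUCache is an illustrative assumption, and it delegates to an access-ordered LinkedHashMap purely to keep the example short; the same lock-around-everything pattern applies unchanged to the hand-rolled node-based version above.

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantLock;

// Sketch of coarse-grained locking: one ReentrantLock serializes get and put.
// The backing store is an access-ordered LinkedHashMap only to keep the
// example short; the lock pattern is what matters here.
public class ThreadSafeLRUCache {
    private final ReentrantLock lock = new ReentrantLock();
    private final Map<Integer, Integer> store;

    public ThreadSafeLRUCache(int capacity) {
        this.store = new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                return size() > capacity;
            }
        };
    }

    public int get(int key) {
        lock.lock();
        try {
            return store.getOrDefault(key, -1); // access refreshes recency under the lock
        } finally {
            lock.unlock();
        }
    }

    public void put(int key, int value) {
        lock.lock();
        try {
            store.put(key, value); // may trigger eviction of the eldest entry
        } finally {
            lock.unlock();
        }
    }
}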


By solving this problem, you’ll master:

  • Hybrid Data Structures - Combining primitives for advanced goals.
  • Pointer Management - Handling complex linked structures.
  • Efficiency Optimization - Stripping logic down to O(1).
  • Cache Eviction Policies - Understanding how real-world caches (like Redis or Memcached) work.

Ready to see the full implementation? Open the interactive playground to access:

  • 🎯 Step-by-step guidance through the 8-step LLD approach
  • 📊 Interactive UML builder to visualize your design
  • 💻 Complete Code Solutions in Python, Java, C++, TypeScript, JavaScript, C#
  • 🤖 AI-powered review of your design and code

After mastering LRU Cache, try these similar problems: