Imagine your cache is a parking lot with 100 spaces. When it’s full and a new car arrives, you have to decide: which car leaves?
That’s cache eviction - deciding what to remove when the cache is full.
The goal is to maximize the cache hit rate: keep the data you’ll access soon, evict the data you won’t need.
LRU (Least Recently Used) is the most popular policy. It evicts the item that hasn’t been accessed for the longest time.
Think of it like a stack of books on your desk. When you use a book, you put it on top. When your desk is full, you remove books from the bottom (least recently used).
Algorithm:
- On every access (get or put), move the item to the front of the recency order.
- On insert when the cache is full, evict the item at the back - the one untouched the longest.

When to use:
- Workloads with temporal locality, where recently accessed data is likely to be accessed again (sessions, hot product pages, feed items).

Trade-offs:
- Captures recency but not frequency: a one-off scan over many keys can flush genuinely hot items.
- Needs extra bookkeeping (an ordered map or linked list) compared to FIFO.
This is a classic interview problem. Here’s how to implement it efficiently:
```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        # OrderedDict maintains insertion order:
        # most recent at the end, least recent at the beginning
        self.cache = OrderedDict()

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        # Move to end (most recent)
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            # Update existing entry: refresh its recency
            self.cache.move_to_end(key)
        elif len(self.cache) >= self.capacity:
            # Cache is full: evict least recent (first item)
            self.cache.popitem(last=False)
        # New keys are appended at the end (most recent)
        self.cache[key] = value
```

The same cache in Java, using `LinkedHashMap`’s access order:

```java
import java.util.LinkedHashMap;
import java.util.Map;

class LRUCache {
    private final int capacity;
    private final LinkedHashMap<Integer, Integer> cache;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        // accessOrder=true makes the map reorder entries on access (LRU order)
        this.cache = new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                return size() > capacity;
            }
        };
    }

    public int get(int key) {
        return cache.getOrDefault(key, -1);
    }

    public void put(int key, int value) {
        cache.put(key, value); // LinkedHashMap maintains order automatically
    }
}
```

Explicit implementation (HashMap + doubly linked list for O(1) operations). This is what `OrderedDict` and `LinkedHashMap` do under the hood, and the version interviewers usually want you to build by hand:
```python
class Node:
    def __init__(self, key=0, value=0):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}  # key -> node
        # Dummy head and tail for easier operations
        self.head = Node()
        self.tail = Node()
        self.head.next = self.tail
        self.tail.prev = self.head

    def _add_node(self, node: Node):
        # Add right after head (most recent position)
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def _remove_node(self, node: Node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _move_to_head(self, node: Node):
        self._remove_node(node)
        self._add_node(node)

    def _pop_tail(self) -> Node:
        last = self.tail.prev
        self._remove_node(last)
        return last

    def get(self, key: int) -> int:
        node = self.cache.get(key)
        if node is None:
            return -1
        self._move_to_head(node)
        return node.value

    def put(self, key: int, value: int) -> None:
        node = self.cache.get(key)
        if node is None:
            if len(self.cache) >= self.capacity:
                # Evict least recently used (node before tail)
                lru = self._pop_tail()
                del self.cache[lru.key]
            node = Node(key, value)
            self.cache[key] = node
            # New node must be linked in with _add_node,
            # not _move_to_head (it is not in the list yet)
            self._add_node(node)
        else:
            node.value = value
            self._move_to_head(node)
```

And in Java:

```java
import java.util.HashMap;
import java.util.Map;

class Node {
    int key, value;
    Node prev, next;

    Node() {}
    Node(int key, int value) {
        this.key = key;
        this.value = value;
    }
}

class LRUCache {
    private final int capacity;
    private final Map<Integer, Node> cache;
    private final Node head, tail;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.cache = new HashMap<>();
        this.head = new Node();
        this.tail = new Node();
        head.next = tail;
        tail.prev = head;
    }

    private void addNode(Node node) {
        // Add right after head (most recent position)
        node.prev = head;
        node.next = head.next;
        head.next.prev = node;
        head.next = node;
    }

    private void removeNode(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void moveToHead(Node node) {
        removeNode(node);
        addNode(node);
    }

    private Node popTail() {
        Node last = tail.prev;
        removeNode(last);
        return last;
    }

    public int get(int key) {
        Node node = cache.get(key);
        if (node == null) return -1;
        moveToHead(node);
        return node.value;
    }

    public void put(int key, int value) {
        Node node = cache.get(key);
        if (node == null) {
            if (cache.size() >= capacity) {
                Node lru = popTail();
                cache.remove(lru.key);
            }
            node = new Node(key, value);
            cache.put(key, node);
            // New node must be linked with addNode, not moveToHead
            addNode(node);
        } else {
            node.value = value;
            moveToHead(node);
        }
    }
}
```

LFU (Least Frequently Used) is frequency-based eviction. It evicts the item with the lowest access count.
Think of it like a popularity contest. Items with more “votes” (accesses) stay longer. When cache is full, the least popular item leaves.
Algorithm:
- Track an access count per key; increment it on every get and put.
- On insert when the cache is full, evict the key with the lowest count (ties are usually broken by recency).

When to use:
- Stable popularity patterns, where some items are consistently hot over time (reference data, top-N content).

Trade-offs:
- More bookkeeping than LRU; naive implementations scan all keys on eviction.
- Old items can accumulate high counts and never leave unless counts decay over time.
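A minimal LFU cache can be sketched in Python. `LFUCache` here is illustrative, not a library class, and it scans on eviction (O(n)) rather than using the O(1) frequency-bucket structure production LFU implementations rely on:

```python
from collections import defaultdict

class LFUCache:
    """LFU sketch: evicts the key with the lowest access count."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}                  # key -> value
        self.counts = defaultdict(int)   # key -> access count

    def get(self, key):
        if key not in self.cache:
            return -1
        self.counts[key] += 1            # every read is a "vote"
        return self.cache[key]

    def put(self, key, value):
        if key not in self.cache and len(self.cache) >= self.capacity:
            # Evict the least frequently used key (linear scan)
            victim = min(self.cache, key=lambda k: self.counts[k])
            del self.cache[victim]
            del self.counts[victim]
        self.cache[key] = value
        self.counts[key] += 1            # writes count as accesses too
```

Note the cache-pollution trade-off in action: a key that was hot last week keeps its high count and stays put unless you add count decay.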
FIFO (First In, First Out) is simple queue-based eviction. It evicts the oldest item regardless of usage.
Like a queue at a store - first in, first out. When cache is full, the oldest item leaves, even if it was just accessed.
Algorithm:
- Append new keys to a queue on insert.
- On insert when the cache is full, evict the key at the front of the queue; accesses never change the order.

When to use:
- When simplicity matters more than hit rate, or when access patterns are roughly uniform so recency carries little signal.

Trade-offs:
- Trivial to implement, O(1) everywhere.
- Ignores usage entirely: it will evict your hottest key just because it was inserted first.
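A FIFO sketch is almost the `OrderedDict` LRU above with the reordering removed. `FIFOCache` is illustrative:

```python
from collections import OrderedDict

class FIFOCache:
    """FIFO sketch: eviction order is insertion order; access never reorders."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()  # insertion order doubles as the queue

    def get(self, key):
        # No move_to_end here: reads do not protect an entry
        return self.cache.get(key, -1)

    def put(self, key, value):
        if key not in self.cache and len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)  # evict the oldest insertion
        self.cache[key] = value  # updating an existing key keeps its position
```

The one-line difference from LRU (no reordering on `get`) is exactly why its hit rate is worse on recency-skewed workloads.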
TTL (Time To Live) is time-based expiration: items expire after a fixed time period.
Like milk with an expiration date. After a certain time, items are automatically removed, regardless of usage.
Algorithm:
- Store an expiry timestamp with each entry on insert.
- On read, drop the entry if its timestamp has passed (lazy expiry); optionally run a background sweep for entries that are never read again.

When to use:
- Data with natural staleness: DNS records, rate-limit windows, session tokens - anything where serving an old value is wrong.

Trade-offs:
- Bounds staleness, not memory: without a size cap, unexpired entries can grow without limit, so TTL is usually combined with LRU or LFU.
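A TTL sketch with lazy expiry, where expired entries are dropped on read. `TTLCache` and the choice of `time.monotonic()` as the clock are implementation choices for this sketch, not a fixed API:

```python
import time

class TTLCache:
    """TTL sketch: entries expire ttl_seconds after being written."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.cache = {}  # key -> (value, expiry_timestamp)

    def get(self, key):
        entry = self.cache.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self.cache[key]  # expired: removed lazily on read
            return None
        return value

    def put(self, key, value):
        self.cache[key] = (value, time.monotonic() + self.ttl)
```

Lazy expiry keeps writes O(1), but entries that are never read again linger in memory - the reason real systems pair TTL with a background sweep or a size-capped policy.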
| Policy | Complexity | Hit Rate | Use Case | Implementation |
|---|---|---|---|---|
| LRU | Medium | High | General purpose | HashMap + Doubly Linked List |
| LFU | High | High | Stable popularity | HashMap + Frequency tracking |
| FIFO | Low | Low | Simple cases | Queue |
| TTL | Low | Medium | Time-sensitive | Timestamp tracking |
At the code level, eviction policies are strategies you can swap:
```python
from abc import ABC, abstractmethod
from typing import Any

class EvictionStrategy(ABC):
    @abstractmethod
    def evict(self, cache: dict) -> Any:
        """Return the key to evict."""

class LRUStrategy(EvictionStrategy):
    def evict(self, cache: dict) -> Any:
        # First key in insertion order; the cache must re-insert
        # keys on access for this to be true LRU order
        return next(iter(cache))

class LFUStrategy(EvictionStrategy):
    def __init__(self):
        self.access_counts = {}  # maintained by the cache on each access

    def evict(self, cache: dict) -> Any:
        # Return the least frequently used key
        return min(self.access_counts.items(), key=lambda x: x[1])[0]

class Cache:
    def __init__(self, capacity: int, strategy: EvictionStrategy):
        self.capacity = capacity
        self.strategy = strategy
        self.cache = {}

    def evict_if_needed(self):
        if len(self.cache) >= self.capacity:
            key = self.strategy.evict(self.cache)
            del self.cache[key]
```

The same pattern in Java:

```java
import java.util.HashMap;
import java.util.Map;

interface EvictionStrategy<K> {
    K evict(Map<K, ?> cache);
}

class LRUStrategy<K> implements EvictionStrategy<K> {
    public K evict(Map<K, ?> cache) {
        // First key in iteration order; requires an access-ordered map
        // (e.g. LinkedHashMap with accessOrder=true) for true LRU
        return cache.keySet().iterator().next();
    }
}

class LFUStrategy<K> implements EvictionStrategy<K> {
    private final Map<K, Integer> accessCounts = new HashMap<>();

    public K evict(Map<K, ?> cache) {
        // Return the key with the minimum access count
        return accessCounts.entrySet().stream()
            .min(Map.Entry.comparingByValue())
            .map(Map.Entry::getKey)
            .orElse(null);
    }
}

class Cache<K, V> {
    private final int capacity;
    private final EvictionStrategy<K> strategy;
    private final Map<K, V> cache;

    public Cache(int capacity, EvictionStrategy<K> strategy) {
        this.capacity = capacity;
        this.strategy = strategy;
        this.cache = new HashMap<>();
    }

    private void evictIfNeeded() {
        if (cache.size() >= capacity) {
            K key = strategy.evict(cache);
            cache.remove(key);
        }
    }
}
```

Most production systems use combinations of these policies:
- LRU + TTL: entries expire by age, and LRU handles memory pressure (Redis’s `volatile-lru` works this way).
- LFU with decay: access counts age over time so yesterday’s hot keys can eventually leave (Redis’s `allkeys-lfu` decays its counters).
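One common combination, a capacity-limited LRU with per-entry TTL, can be sketched like this (the class name and API are illustrative):

```python
import time
from collections import OrderedDict

class LRUTTLCache:
    """Hybrid sketch: per-entry TTL, plus LRU eviction when full."""

    def __init__(self, capacity: int, ttl_seconds: float):
        self.capacity = capacity
        self.ttl = ttl_seconds
        self.cache = OrderedDict()  # key -> (value, expires_at), LRU order

    def get(self, key):
        entry = self.cache.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self.cache[key]  # stale entry: TTL wins over recency
            return None
        self.cache.move_to_end(key)  # refresh recency
        return value

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        elif len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)  # full: evict least recently used
        self.cache[key] = (value, time.monotonic() + self.ttl)
```

TTL bounds staleness, the capacity bound handles memory, and LRU decides who goes when the capacity bound bites.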
Cache warming: pre-populate the cache with likely-to-be-accessed data:
- Load known-hot keys at startup (top products, configuration, reference data).
- Replay recent access logs against the cold cache before routing traffic to it.
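A sketch of the warming step. `load_hot_keys` is a hypothetical callable standing in for whatever source of hot keys you have (access logs, a top-N query):

```python
def warm_cache(cache: dict, load_hot_keys):
    """Pre-load likely-hot entries before the cache serves traffic.

    load_hot_keys is a hypothetical callable returning {key: value}
    pairs, e.g. from yesterday's access logs.
    """
    for key, value in load_hot_keys().items():
        # setdefault: don't clobber entries written since warming began
        cache.setdefault(key, value)
```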
Track these metrics:
- Hit rate (hits / total lookups): the headline number; a drop often means the working set has outgrown the cache.
- Eviction rate: high churn suggests the cache is too small or the policy mismatches the access pattern.
- Latency, hit vs. miss: quantifies what the cache is actually buying you.
- Memory usage: especially important for TTL caches without a size cap.
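A minimal counter sketch for these metrics (class and method names are illustrative):

```python
class CacheMetrics:
    """Hit/miss/eviction counters for cache monitoring."""

    def __init__(self):
        self.hits = 0
        self.misses = 0
        self.evictions = 0

    def record_get(self, hit: bool):
        # Call once per lookup, from the cache's get path
        if hit:
            self.hits += 1
        else:
            self.misses += 1

    def record_eviction(self):
        self.evictions += 1

    @property
    def hit_rate(self) -> float:
        total = self.hits + self.misses
        return self.hits / total if total else 0.0
```

In production you would export these to your metrics system rather than keep them in-process, but the counters themselves are this simple.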
🎯 LRU is Default
Use LRU for most cases. It’s simple, effective, and handles temporal locality well.
📊 Frequency vs Recency
LRU = recency, LFU = frequency. Choose based on your access patterns.
⚡ O(1) Operations
Proper LRU implementation uses HashMap + Doubly Linked List for O(1) get/put.
🔄 Strategy Pattern
Make eviction policy swappable using strategy pattern in your code.