Cache Eviction Policies
When the cache is full, which items do you remove? The policy you choose significantly impacts the hit rate.
1. The Closet Analogy
- LRU: Donate the clothes you haven't worn for the longest time
- LFU: Donate clothes you wear least frequently
- FIFO: Donate the oldest clothes you bought
- Random: Close eyes and pick randomly
2. Eviction Simulator
3. Common Eviction Policies
LRU (Least Recently Used)
Remove item that hasn't been accessed for longest time.
How: Track last access time for each item. Evict oldest.
LFU (Least Frequently Used)
Remove item that has been accessed fewest times.
How: Track access count for each item. Evict lowest count.
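The count-tracking above can be sketched in a few lines (`SimpleLFUCache` is a hypothetical name; production LFU implementations use frequency buckets to make eviction O(1) rather than the O(n) scan shown here):

```python
from collections import Counter

class SimpleLFUCache:
    """Minimal LFU sketch: track an access count per key,
    evict the key with the lowest count when full."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}
        self.counts = Counter()

    def get(self, key):
        if key not in self.cache:
            return -1
        self.counts[key] += 1       # every read bumps the frequency
        return self.cache[key]

    def put(self, key, value):
        if key not in self.cache and len(self.cache) >= self.capacity:
            # Evict the least frequently used key (O(n) scan for clarity)
            victim = min(self.cache, key=lambda k: self.counts[k])
            del self.cache[victim]
            del self.counts[victim]
        self.cache[key] = value
        self.counts[key] += 1       # writes also count as accesses
```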
FIFO (First In, First Out)
Remove item that was added to cache longest ago.
How: Queue structure. Evict from front, add to back.
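The queue structure maps directly onto a `deque` (class name illustrative). Note that reads never reorder anything; only insertion time matters:

```python
from collections import deque

class FIFOCache:
    """FIFO sketch: the deque remembers insertion order;
    the oldest-inserted key is evicted first."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}
        self.order = deque()            # front = oldest insertion

    def get(self, key):
        return self.cache.get(key, -1)  # reads don't change order

    def put(self, key, value):
        if key not in self.cache:
            if len(self.cache) >= self.capacity:
                oldest = self.order.popleft()  # evict from the front
                del self.cache[oldest]
            self.order.append(key)             # add to the back
        self.cache[key] = value
```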
Random
Remove a randomly selected item.
How: Pick random key and evict it.
TTL (Time-To-Live)
Items expire after fixed time regardless of access.
How: Store expiration timestamp. Remove when expired.
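A minimal TTL sketch with lazy expiry: entries are only purged when they are next requested (a background sweeper is the usual complement; `TTLCache` and `ttl_seconds` are illustrative names):

```python
import time

class TTLCache:
    """TTL sketch: each entry stores an expiry timestamp;
    expired entries are dropped lazily on read."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.cache = {}  # key → (value, expires_at)

    def put(self, key, value):
        self.cache[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self.cache.get(key)
        if entry is None:
            return -1
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self.cache[key]  # expired: evict regardless of access history
            return -1
        return value
```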
4. LRU Implementation
LRU is the eviction policy most commonly asked about in interviews. Here's how it works:
Data Structure: Hash Map + Doubly Linked List
```python
class Node:
    """Doubly linked list node holding one cache entry."""
    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}                  # key → node
        self.head = Node()               # dummy head (most recent)
        self.tail = Node()               # dummy tail (least recent)
        self.head.next = self.tail
        self.tail.prev = self.head

    def get(self, key):
        if key in self.cache:
            node = self.cache[key]
            self._move_to_head(node)     # mark as recently used
            return node.value
        return -1

    def put(self, key, value):
        if key in self.cache:
            node = self.cache[key]
            node.value = value
            self._move_to_head(node)
        else:
            if len(self.cache) >= self.capacity:
                # Evict LRU (tail.prev)
                lru = self.tail.prev
                self._remove(lru)
                del self.cache[lru.key]
            new_node = Node(key, value)
            self.cache[key] = new_node
            self._add_to_head(new_node)

    def _remove(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_head(self, node):
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def _move_to_head(self, node):
        self._remove(node)
        self._add_to_head(node)
```

5. Comparison Table
| Policy | Best For | Avoid When |
|---|---|---|
| LRU | General purpose, web caches | Scan patterns (read once, never again) |
| LFU | Frequency-based access (CDN) | Access patterns change over time |
| FIFO | Simple caches, streaming data | Data has varying importance |
| TTL | Time-sensitive data, sessions | Access frequency matters more than age |
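As a practical aside to the implementation above: if the interviewer allows standard-library types, Python's `collections.OrderedDict` collapses the hash map + doubly linked list into one structure (it maintains a doubly linked list internally). A sketch with an illustrative class name:

```python
from collections import OrderedDict

class LRUCacheOD:
    """Same LRU semantics as the hand-rolled version:
    move_to_end marks a key as most recently used,
    popitem(last=False) pops the least recently used."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)          # mark as recently used
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict the LRU entry
```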
6. Interview Tips
- Use Hash Map + Doubly Linked List for O(1) operations
- Explain why both structures are needed
- Handle edge cases: capacity=0, duplicate puts
- Discuss thread safety if asked (locks, concurrent hash map)
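On the thread-safety tip, the simplest answer is coarse-grained locking: one lock serializing every operation. A sketch under that assumption (names illustrative; a more concurrent design would shard the lock or use a concurrent map):

```python
import threading
from collections import OrderedDict

class ThreadSafeLRU:
    """Coarse-grained locking sketch: one lock guards all cache state.
    Simple and correct, at the cost of serialized access."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()
        self.lock = threading.Lock()

    def get(self, key):
        with self.lock:
            if key not in self.cache:
                return -1
            self.cache.move_to_end(key)
            return self.cache[key]

    def put(self, key, value):
        with self.lock:
            if key in self.cache:
                self.cache.move_to_end(key)
            self.cache[key] = value
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False)
```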
7. Key Takeaways
8. Interview Follow-up Questions
Common follow-up questions interviewers ask
9. Test Your Understanding
5 questions
1. LRU cache implementation requires which data structures for O(1) operations?
2. Which eviction policy is best for data with stable popularity over time?
3. What is a key disadvantage of LFU eviction?
4. Redis uses which eviction approach?
5. In an LRU cache, when a key is accessed (get operation), what happens?