Recency vs Frequency: LRU, LFU, and Segmented Designs
LRU: Betting on Recency
LRU (Least Recently Used) evicts the item accessed longest ago, betting that recent access predicts near-term reuse. It adapts quickly to workload shifts: a few accesses immediately promote an item to the most-protected position. The standard implementation pairs a hashmap with a doubly linked list, consuming roughly 24-48 bytes/entry for pointers and bookkeeping. Critical vulnerability: a sequential scan touching a large keyset once evicts the entire working set, collapsing the hit rate (e.g., from 80% to near zero) for the duration of the scan.
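A minimal sketch of the hashmap-plus-linked-list design, using Python's OrderedDict (which wraps a hash table and a doubly linked list) so that get, put, and evict are all O(1); the class name and capacity parameter are illustrative, not from the source:

```python
from collections import OrderedDict

class LRUCache:
    """Hashmap + doubly linked list via OrderedDict: O(1) get/put/evict."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # insertion order == recency order

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # promote to most-recently-used position
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least-recently-used
```

Note how the scan vulnerability follows directly from put: every scanned key lands in the most-protected position, so a one-pass scan of capacity-many cold keys displaces the entire resident working set.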
LFU: Betting on Long Term Popularity
LFU (Least Frequently Used) evicts the item with the lowest access count, betting that long-term popularity predicts future access. It retains popular items through temporary pauses in access, making it robust for stable hot sets. Without counter decay (periodically reducing all counts), LFU adapts slowly: stale, once-popular items retain high counts indefinitely and resist eviction even after access patterns change. A common design uses 1-byte counters per entry with periodic halving to prevent saturation.
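A sketch of LFU with 1-byte counters and periodic halving; the decay interval (every N accesses) and the linear-scan eviction are simplifications for clarity, not from the source. Production designs replace the scan with a frequency-ordered structure:

```python
class LFUCache:
    """LFU with capped 1-byte counters and periodic halving (decay)."""

    def __init__(self, capacity, decay_interval=1000):
        self.capacity = capacity
        self.decay_interval = decay_interval  # accesses between halvings
        self.accesses = 0
        self.data = {}    # key -> value
        self.counts = {}  # key -> access count, capped at 255 (1 byte)

    def get(self, key):
        if key not in self.data:
            return None
        self._touch(key)
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            victim = min(self.counts, key=self.counts.get)  # lowest count
            del self.data[victim]
            del self.counts[victim]
        self.data[key] = value
        self._touch(key)

    def _touch(self, key):
        self.counts[key] = min(self.counts.get(key, 0) + 1, 255)  # saturate
        self.accesses += 1
        if self.accesses % self.decay_interval == 0:
            for k in self.counts:
                self.counts[k] //= 2  # halve all: stale popularity decays
```

Without the halving step in `_touch`, a key that was hot last hour keeps its high count forever and blocks newly hot keys from staying resident.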
Segmented LRU
SLRU (Segmented LRU, closely related to the 2Q algorithm) combines recency and frequency to resist scan pollution while still adapting to change. It maintains two LRU queues: probation (~20% of capacity) for newly admitted items, and protected (~80%) for items with a second hit. Items enter probation on first access; a second hit promotes them to protected. Eviction takes from probation first, so scans flow through the small probation segment without disturbing the protected working set.
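A sketch of the two-segment design under the 20/80 split described above; the demotion of the protected segment's LRU item back to probation (rather than outright eviction) is one common variant, assumed here rather than taken from the source:

```python
from collections import OrderedDict

class SLRUCache:
    """Two LRU segments: probation for first hits, protected for second hits."""

    def __init__(self, capacity, protected_frac=0.8):
        self.protected_cap = max(1, int(capacity * protected_frac))
        self.probation_cap = max(1, capacity - self.protected_cap)
        self.probation = OrderedDict()
        self.protected = OrderedDict()

    def get(self, key):
        if key in self.protected:
            self.protected.move_to_end(key)
            return self.protected[key]
        if key in self.probation:
            value = self.probation.pop(key)   # second hit: promote
            self.protected[key] = value
            if len(self.protected) > self.protected_cap:
                # demote protected LRU back to probation (variant choice)
                k, v = self.protected.popitem(last=False)
                self._admit(k, v)
            return value
        return None

    def put(self, key, value):
        if key in self.protected:
            self.protected[key] = value
            self.protected.move_to_end(key)
        else:
            self.probation.pop(key, None)
            self._admit(key, value)

    def _admit(self, key, value):
        self.probation[key] = value
        if len(self.probation) > self.probation_cap:
            self.probation.popitem(last=False)  # scans evict only here
```

Because a scan's once-touched keys never earn a second hit, they churn only the small probation queue while the protected segment stays intact.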
CLOCK Algorithm
CLOCK approximates LRU using per-entry reference bits and a circular buffer swept by a hand. Each entry carries a 1-bit flag set on access. On eviction, the hand sweeps clockwise: if the bit is 1, clear it and advance; if the bit is 0, evict that entry. This avoids the linked-list reordering of LRU, giving lower overhead and better concurrency. Tune the sweep rate relative to the insertion rate: if the hand moves too slowly, all bits stay set and the policy degenerates to FIFO (First In, First Out); too fast, and CPU is wasted scanning.
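A sketch of CLOCK with the sweep performed on demand at eviction time (one common formulation; background-hand variants differ). New entries are inserted with their reference bit set, another common convention assumed here:

```python
class ClockCache:
    """Circular buffer of slots, each with a reference bit; a hand sweeps
    on eviction, giving second chances to recently referenced entries."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = []   # each slot: [key, value, ref_bit]
        self.index = {}   # key -> slot position
        self.hand = 0

    def get(self, key):
        pos = self.index.get(key)
        if pos is None:
            return None
        self.slots[pos][2] = 1  # set reference bit; no list reordering
        return self.slots[pos][1]

    def put(self, key, value):
        if key in self.index:
            pos = self.index[key]
            self.slots[pos][1] = value
            self.slots[pos][2] = 1
            return
        if len(self.slots) < self.capacity:
            self.index[key] = len(self.slots)
            self.slots.append([key, value, 1])
            return
        # Sweep clockwise: clear set bits until an unset bit is found.
        while self.slots[self.hand][2] == 1:
            self.slots[self.hand][2] = 0
            self.hand = (self.hand + 1) % self.capacity
        old_key = self.slots[self.hand][0]
        del self.index[old_key]
        self.slots[self.hand] = [key, value, 1]
        self.index[key] = self.hand
        self.hand = (self.hand + 1) % self.capacity
```

The concurrency benefit is visible in `get`: a hit only writes one bit in place, where LRU must splice the entry to the head of a shared list.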