Caching Strategies Simulator - Learn Cache Eviction Policies & Write Strategies
Eviction Policy
LRU: Removes the item accessed longest ago
Cache Simulator
Tap to request data:
Green ring = in cache
Press 1-6 to request, R to reset
[Diagram: Your App → Cache (1-10ms) → Database (50-200ms), and the response path back]
Request Log
$ Click a data item to see how caching works...
Stats
Hits: 0 · Misses: 0 · Hit Rate: 0% · Total Time: 0ms
Cache (0/4)
Cache is empty
Why Use Caching?
Without Cache
Every request goes to the database (50-200ms). Slow!
With Cache
Repeated requests return instantly (1-10ms). 20x faster!
The eviction policy decides what to remove when cache is full.
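The speedup from caching can be illustrated with a tiny simulation. This is a minimal sketch, assuming the latencies quoted above (1-10ms per cache hit, 50-200ms per database call) and a simple FIFO eviction once the cache holds 4 items; the function and key names are illustrative, not part of the simulator.

```python
import random

def request(key, cache, capacity=4):
    """Return simulated latency (ms) for one request, caching the key on a miss."""
    if key in cache:
        return random.randint(1, 10)       # cache hit: 1-10ms
    latency = random.randint(50, 200)      # cache miss: database call, 50-200ms
    if len(cache) >= capacity:
        cache.pop(next(iter(cache)))       # FIFO eviction: drop oldest inserted key
    cache[key] = True
    return latency

workload = [1, 2, 1, 1, 3, 2, 1, 2] * 10   # skewed toward a few hot keys
no_cache = sum(random.randint(50, 200) for _ in workload)
cache = {}
with_cache = sum(request(k, cache) for k in workload)
print(no_cache, with_cache)  # the cached total is far lower
```

Because the workload repeatedly hits the same few keys, only the first request per key pays the database cost; every repeat is served at cache speed.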
Understanding Caching Strategies
Eviction Policies
LRU (Least Recently Used): Evicts items not accessed recently. Most common in practice.
LFU (Least Frequently Used): Evicts items accessed least often. Great for identifying hot data.
FIFO (First In, First Out): Simple queue-based approach, evicts oldest items first.
TTL (Time To Live): Evicts items based on expiration time. Common for sessions.
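The LRU policy the simulator defaults to can be sketched in a few lines of Python using `collections.OrderedDict`, whose insertion order doubles as a recency order. The class and capacity below are illustrative, not the simulator's actual implementation.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the item accessed longest ago."""

    def __init__(self, capacity: int = 4):
        self.capacity = capacity
        self.items = OrderedDict()  # insertion order tracks recency

    def get(self, key):
        if key not in self.items:
            return None                  # cache miss
        self.items.move_to_end(key)      # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the least recently used

cache = LRUCache(capacity=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # "a" is now most recently used
cache.put("c", 3)      # cache is full: evicts "b", not "a"
print(cache.get("b"))  # None -- "b" was evicted
print(cache.get("a"))  # 1   -- "a" survived because it was touched recently
```

Note that `get` also updates recency; that is what distinguishes LRU from plain FIFO, which evicts by insertion order alone.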
Write Strategies
Write-Through: Writes to cache and database simultaneously. Strong consistency but higher latency.
Write-Back: Writes to cache first, async to database. Better performance but risk of data loss.
Write-Around: Writes directly to database, bypasses cache. Reduces cache pollution.
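The three write strategies above can be contrasted with a toy cache/database pair. This is a sketch, assuming plain dicts stand in for the cache and the (slow) database; in a real write-back cache the flush would run asynchronously, not on demand.

```python
class Store:
    """Toy cache + database pair illustrating the three write strategies."""

    def __init__(self):
        self.cache = {}
        self.db = {}
        self.dirty = set()  # keys in cache not yet persisted (write-back)

    def write_through(self, key, value):
        # Cache and database updated together: strong consistency,
        # but every write pays the database's latency.
        self.cache[key] = value
        self.db[key] = value

    def write_back(self, key, value):
        # Cache updated now, database write deferred: fast, but the
        # value is lost if the cache dies before flush() runs.
        self.cache[key] = value
        self.dirty.add(key)

    def flush(self):
        # Persist dirty entries (synchronously here, for clarity).
        for key in self.dirty:
            self.db[key] = self.cache[key]
        self.dirty.clear()

    def write_around(self, key, value):
        # Database only: one-off writes don't pollute the cache.
        self.db[key] = value
        self.cache.pop(key, None)  # drop any stale cached copy

s = Store()
s.write_back("session", "abc")
print(s.db.get("session"))  # None -- not yet persisted
s.flush()
print(s.db.get("session"))  # abc -- now durable
```

The trade-off is visible in the write-back pair of prints: between the write and the flush, the database and cache disagree, which is exactly the consistency risk the section describes.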
💡 Key Concepts
- Hit Rate: Percentage of requests served from cache (higher is better)
- Cache Size: Balance between memory usage and hit rate
- Hot Data: Frequently accessed items that benefit most from caching
- Cache Invalidation: One of the hardest problems in computer science
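The hit rate the stats panel reports is simple to compute: hits divided by total requests. A one-liner, with the zero-request edge case the empty panel shows:

```python
def hit_rate(hits: int, misses: int) -> float:
    """Fraction of requests served from cache; 0.0 before any requests."""
    total = hits + misses
    return 0.0 if total == 0 else hits / total

print(f"{hit_rate(18, 2):.0%}")  # 90%
```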