February 16, 2026 · 7 min · 1313 words · Rob Washington
The fastest request is one you don’t make. Caching trades storage for speed, serving precomputed results instead of recalculating them. But caching done wrong is worse than no caching—stale data, inconsistencies, and debugging nightmares.
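To make the trade concrete, here is a minimal sketch of the idea in plain JavaScript: a TTL-bounded in-memory map that serves a precomputed value on a hit and recomputes only on a miss. The `TtlCache` class and `loadUser` loader are hypothetical names for illustration, not a real library.

```javascript
// Minimal in-memory TTL cache (illustrative sketch, not production code).
// A fresh hit returns the stored value; a miss or stale entry recomputes.
class TtlCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.entries = new Map(); // key -> { value, expiresAt }
  }

  get(key, compute) {
    const entry = this.entries.get(key);
    if (entry && entry.expiresAt > Date.now()) {
      return entry.value; // hit: no recomputation
    }
    const value = compute(key); // miss: do the expensive work once
    this.entries.set(key, { value, expiresAt: Date.now() + this.ttlMs });
    return value;
  }
}

const cache = new TtlCache(300_000); // 5-minute TTL
let dbCalls = 0;
const loadUser = (id) => { dbCalls++; return { id, name: 'Ada' }; };

cache.get('user:123', loadUser);
cache.get('user:123', loadUser); // second call served from cache
console.log(dbCalls); // 1
```

Everything that follows in this article is a refinement of this loop: when to invalidate the entry, what to do when many callers miss at once, and how to measure whether the cache is earning its keep.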
-- MySQL query cache (deprecated in 8.0, but concept applies)
SELECT SQL_CACHE * FROM users WHERE id = 123;

-- PostgreSQL: use materialized views for expensive aggregations
CREATE MATERIALIZED VIEW daily_stats AS
SELECT date, COUNT(*) AS total FROM orders GROUP BY date;

REFRESH MATERIALIZED VIEW daily_stats;
// Redis with a loader function (conceptual)
const cache = new ReadThroughCache({
  loader: async (key) => db.findById(key),
  ttl: 300,
});

// Application just calls get()
const user = await cache.get('user:123');
// When user is updated
async function updateUser(id, data) {
  await db.users.update(id, data);
  await cache.del(`user:${id}`); // Invalidate cache
}

// Or publish an invalidation event
await eventBus.publish('user.updated', { userId: id });
Pros: Cache always fresh after writes.
Cons: Must track all write paths.
// Cache with tags
await cache.set('user:123', userData, { tags: ['users', 'user:123'] });
await cache.set('user:123:posts', posts, { tags: ['users', 'user:123', 'posts'] });

// Invalidate all entries for a user
await cache.invalidateTag('user:123');
const version = await getSchemaVersion(); // or hash of dependencies
const key = `user:${id}:v${version}`;
// Old versions naturally expire, no explicit invalidation needed
async function getWithLock(key) {
  let data = await cache.get(key);
  if (data) return data;

  // Try to acquire lock
  const lockKey = `lock:${key}`;
  const acquired = await cache.setnx(lockKey, '1', 10); // 10s lock

  if (acquired) {
    // We got the lock, refresh cache
    data = await loadFromDatabase(key);
    await cache.set(key, data, TTL);
    await cache.del(lockKey);
  } else {
    // Someone else is refreshing, wait and retry
    await sleep(100);
    return getWithLock(key);
  }
  return data;
}
async function getWithEarlyRefresh(key) {
  const { data, ttl } = await cache.getWithTTL(key);
  if (data) {
    // Probabilistically refresh if close to expiration
    if (ttl < 60 && Math.random() < 0.1) { // 10% chance to refresh if <60s remaining
      refreshInBackground(key);
    }
    return data;
  }
  return loadAndCache(key);
}
Don’t let cache misses for non-existent data hammer your database:
async function getUser(id) {
  const cached = await cache.get(`user:${id}`);
  if (cached === 'NULL_MARKER') return null; // Cached non-existence
  if (cached) return JSON.parse(cached);

  const user = await db.users.findById(id);
  if (user) {
    await cache.setex(`user:${id}`, 300, JSON.stringify(user));
  } else {
    await cache.setex(`user:${id}`, 60, 'NULL_MARKER'); // Cache the miss
  }
  return user;
}
When cache is empty (deployment, cache flush), everything hits database:
Solution: Warm the cache before routing traffic:
async function warmCache() {
  const popularItems = await db.getPopularItems(1000);
  for (const item of popularItems) {
    await cache.set(`item:${item.id}`, item);
  }
}

// Call during startup, before accepting traffic
await warmCache();
server.listen(8080);
// Slow: serialize/deserialize every time
await cache.set(key, JSON.stringify(largeObject));
const data = JSON.parse(await cache.get(key));

// Faster: use binary serialization (msgpack, protobuf)
const msgpack = require('msgpack-lite');
await cache.set(key, msgpack.encode(largeObject));
const decoded = msgpack.decode(await cache.get(key));
Hit rate: % of requests served from cache (target: >90%)
Miss rate: % of cache misses
Latency: Cache response time (should be <10ms)
Memory usage: Are you approaching limits?
Eviction rate: Is cache too small?
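The hit rate falls straight out of the hit and miss counters; a small helper (the counter values here are made-up numbers for illustration) shows the arithmetic:

```javascript
// Compute hit rate from raw hit/miss counters.
function hitRate(hits, misses) {
  const total = hits + misses;
  return total === 0 ? 0 : hits / total;
}

// Example window: 9,400 hits and 600 misses
const rate = hitRate(9400, 600);
console.log((rate * 100).toFixed(1) + '%'); // "94.0%" — above the 90% target
```

Track this as a ratio over a sliding window rather than since process start, or a long-running server will mask a sudden drop in hit rate behind weeks of accumulated hits.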
// Instrument your cache
async function get(key) {
  const start = Date.now();
  const value = await cache.get(key);
  const duration = Date.now() - start;
  metrics.histogram('cache.latency', duration);
  metrics.increment(value ? 'cache.hit' : 'cache.miss');
  return value;
}