🚀 Caching — The Secret Weapon Behind Fast, Scalable Systems
🧐 What is Caching?
Caching means storing frequently accessed or expensive-to-compute data in a faster medium (in-memory, browser, CDN, Redis) so future requests are served instantly.
How it works:
- Check cache → ✔️ Data found → Cache Hit → return instantly
- ❌ Data not found → Cache Miss → fetch from DB/API → store in cache → return
Caching = trading RAM for speed.
🤔 Where Is Data Cached? (Cache Layers)
1. Application Cache (In your code)
- Memoization
- Function-level caching
- Example: caching expensive calculations
2. Server-Side / Distributed Cache
- Redis, Memcached
- For sharing cache across multiple backend instances
3. Database Cache
- Query caching
- Materialized views
- Example: PostgreSQL keeping hot data pages in its shared buffer cache
4. Browser Cache
- Uses HTTP headers
- localStorage, sessionStorage, IndexedDB
5. CDN Cache (Global Edge Servers)
- Cloudflare, CloudFront
- Caches images, CSS, JS, static HTML at edge
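The application-level memoization from layer 1 is the simplest of these to see in code. A minimal sketch in plain JavaScript (the `slowSquare` function is just an illustrative stand-in for an expensive computation):

```js
// Simple in-memory memoization: cache results of an expensive, pure function.
function memoize(fn) {
  const cache = new Map();
  return (arg) => {
    if (cache.has(arg)) return cache.get(arg); // cache hit
    const result = fn(arg);                    // cache miss: compute…
    cache.set(arg, result);                    // …and store for next time
    return result;
  };
}

let calls = 0;
const slowSquare = memoize((n) => { calls++; return n * n; });
slowSquare(4); // first call: computes
slowSquare(4); // second call: served from cache, fn not re-run
```

This only works for pure functions; anything depending on external state needs an invalidation strategy, which is covered below.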
🤨 Why Do We Use Caching? (The Real Benefits)
⚡️ Performance Boost
- Database query: ~100–300 ms
- Redis fetch: ~0.5–2 ms
- CDN fetch: ~20–50 ms
- Browser cache: ~0–1 ms
🧠 Scalability
A good cache hit rate can cut database load by 80–90% in read-heavy systems.
💰 Cost Savings
Lower DB reads, less compute, fewer servers.
🚨 Reliability & Fallback
If DB is slow or temporarily unavailable → cache can still serve data.
✨ Better User Experience
Faster pages → lower bounce rates → higher conversion.
😲 Real-World Examples
Twitter (X)
Trending topics are recalculated every few seconds and stored in Redis → millions of users see them instantly.
Profile + feed metadata cached → reduces reads on sharded DB.
YouTube
Video metadata cached on edge → instant loads worldwide.
😎 Caching Patterns (VERY Important for Interviews + Real Systems)
1. Cache-Aside (Lazy Loading)
Most common in backend systems (Redis + Node.js).
Flow:
Check cache → if miss → fetch DB → store → return.
2. Read-Through Cache
Application always queries cache.
Cache itself fetches from DB on miss.
3. Write-Through Cache
Write to cache → cache writes to DB.
(Data always fresh, but slower writes)
4. Write-Back (Write-Behind)
Write to cache → return immediately → cache writes to DB later.
(Fast writes, but riskier)
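The read and write patterns above are easier to compare in code. This is a minimal sketch, not production code: a Map stands in for the cache and a second Map plays the role of a hypothetical database:

```js
const cache = new Map();
const db = new Map([["user:1", { name: "Ada" }]]); // fake DB for illustration

// Cache-aside (lazy loading): the app checks the cache first
// and fills it only on a miss.
function readAside(key) {
  if (cache.has(key)) return cache.get(key); // hit
  const value = db.get(key);                 // miss → fetch from "DB"
  cache.set(key, value);                     // store for next time
  return value;
}

// Write-through: every write goes to the cache AND the DB synchronously,
// so reads are always fresh at the cost of slower writes.
function writeThrough(key, value) {
  cache.set(key, value);
  db.set(key, value);
}
```

Write-back would differ from `writeThrough` only in deferring the `db.set` call (e.g. to a background queue), which is why it is faster but riskier: a crash can lose writes that never reached the DB.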
😉 Cache Invalidation (One of the hardest problems)
This decides whether cache stays fresh or stale.
Invalidation Techniques:
- TTL (time-to-live): auto-expire entries after X seconds.
- Manual invalidation: delete the key on demand, e.g. redis.del('user:123').
- Event-based: on a DB update → publish an event → invalidate distributed caches.
- Versioning: add version numbers to keys, e.g. posts:v2:latest.
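The versioning technique can be sketched without any infrastructure. Here a Map stands in for Redis, and the key-building helper is hypothetical:

```js
// Key versioning: instead of deleting entries, bump a version number so
// every reader builds keys that simply skip the stale ones.
const cache = new Map();
let postsVersion = 1;

const postsKey = (name) => `posts:v${postsVersion}:${name}`;

cache.set(postsKey("latest"), ["hello world"]); // stored as posts:v1:latest

postsVersion++; // "invalidate": readers now ask for posts:v2:* keys,
                // so the old v1 entries are never read again
```

The stale v1 entries are not deleted, only orphaned; they linger until TTL or the eviction policy reclaims them, which is the trade-off of this technique.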
🔥 Common Issues & Solutions
1. Stale Data
Fix using TTL or events.
2. Cache Stampede / Thundering Herd
Millions of requests hit the same “cold” key at once → a burst of identical DB queries.
Fix with:
- Locking (Redis SETNX)
- Stale-while-revalidate
- Request coalescing
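Request coalescing, the last fix above, is easy to sketch in plain JavaScript. `fetchFromDb` here is a hypothetical stand-in for the real database call:

```js
// Request coalescing: concurrent misses on the same cold key share ONE
// in-flight fetch instead of each hitting the database.
const inFlight = new Map();
let dbCalls = 0;
const fetchFromDb = async (key) => { dbCalls++; return `value-of-${key}`; }; // fake DB

function coalescedGet(key) {
  if (inFlight.has(key)) return inFlight.get(key); // piggyback on the running fetch
  const promise = fetchFromDb(key).finally(() => inFlight.delete(key));
  inFlight.set(key, promise);
  return promise;
}
```

However many callers race on the same key, only the first triggers a DB query; the rest await the same promise. The same idea implemented across multiple servers is what the Redis SETNX lock provides.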
3. Memory Pressure
Fix via eviction policies:
- LRU (least recently used)
- LFU
- FIFO
4. Cold Start
Warm the cache with popular keys on boot.
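A minimal warming sketch, assuming a hypothetical list of hot keys and a fake loader in place of real DB queries:

```js
// Cache warming: on boot, preload the hottest keys so the first wave of
// users doesn't all miss at once. A Map stands in for Redis here.
const cache = new Map();
const hotKeys = ["home:feed", "trending"];          // hypothetical hot-key list
const loadFromDb = async (key) => `data-for-${key}`; // fake DB loader

async function warmCache() {
  await Promise.all(hotKeys.map(async (k) => cache.set(k, await loadFromDb(k))));
}
```

In practice the hot-key list usually comes from access logs or analytics rather than being hard-coded.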
🧐 How to Use Redis Caching in Backend (Node.js / Express Example)
This is the simplest and most widely used pattern:
```js
import express from "express";
import Redis from "ioredis";
import User from "./models/User.js";

const app = express();
const redis = new Redis();

app.get("/user/:id", async (req, res) => {
  const id = req.params.id;
  const key = `user:${id}`;

  // Check cache
  const cached = await redis.get(key);
  if (cached) {
    return res.json(JSON.parse(cached));
  }

  // Fetch from DB
  const user = await User.findById(id);
  if (!user) return res.status(404).json({ message: "User not found" });

  // Save to cache
  await redis.set(key, JSON.stringify(user), "EX", 300); // 5 min TTL
  return res.json(user);
});

app.put("/user/:id", async (req, res) => {
  const id = req.params.id;
  // Update DB logic here...

  // Invalidate cache
  await redis.del(`user:${id}`);
  res.json({ message: "Updated + Cache invalidated" });
});

app.listen(3000);
```
🧐 Frontend Caching (React / Next.js)
Use React Query / SWR for app-level caching.
Example with React Query:
```js
import { useQuery } from "@tanstack/react-query";
import axios from "axios";

const { data } = useQuery({
  queryKey: ["user", id],
  queryFn: () => axios.get(`/api/user/${id}`).then((res) => res.data),
  staleTime: 5 * 60 * 1000, // data stays fresh for 5 min
});
```
Browser-level caching via HTTP headers:
- Cache-Control: public, max-age=3600
- ETag
- Last-Modified
Storage API options:
- localStorage (persistent)
- sessionStorage (session only)
- IndexedDB (large data)
- Service Workers (PWA offline caching)
- Next.js built-in image caching
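Since localStorage has no built-in expiry, a common app-level trick is a small TTL wrapper around it. This sketch uses an in-memory stub with the same getItem/setItem shape, so it also runs outside the browser; in real code you would pass window.localStorage instead:

```js
// Minimal stand-in for the Web Storage API (getItem/setItem subset).
const storageStub = {
  data: {},
  getItem(k) { return this.data[k] ?? null; },
  setItem(k, v) { this.data[k] = v; },
};

// Store a value alongside an expiry timestamp.
function setWithTTL(storage, key, value, ttlMs) {
  storage.setItem(key, JSON.stringify({ value, expires: Date.now() + ttlMs }));
}

// Treat expired entries as cache misses.
function getWithTTL(storage, key) {
  const raw = storage.getItem(key);
  if (!raw) return null;
  const { value, expires } = JSON.parse(raw);
  return Date.now() < expires ? value : null; // expired → miss
}
```

Libraries like React Query handle this bookkeeping for you in memory; the manual wrapper is mostly useful when data must survive a page reload.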
🤨 CDN Caching
Used for:
- Static assets (JS, CSS, images)
- API responses (if allowed)
- Edge caching for SSR (Next.js + Cloudflare)
Cache-Control example for static assets:
Cache-Control: public, max-age=31536000, immutable
🔥 When Should You Use Caching?
Use caching when:
✔️ Same data requested repeatedly
✔️ DB/API becomes bottleneck
✔️ Read-heavy systems
✔️ Expensive computations
✔️ Static/semi-static data
✔️ Large traffic spikes
Avoid caching:
❌ Highly volatile data (stock prices, payments)
❌ Data requiring strong consistency
❌ User-sensitive content on shared caches
If you're following along with this Architecture Series and happened to miss the earlier parts, no worries — you can dive into them anytime. Each part builds your foundation step by step. Here are the previous topics we covered:
1️⃣ Pagination — Architecture Series: Part 1
2️⃣ Indexing — Architecture Series: Part 2
3️⃣ Virtualization — Architecture Series: Part 3