There are days in a developer's journey when a single change makes a huge impact. Today was one of those days for me. I implemented Redis caching in my Node.js project, and the results were nothing short of amazing.
Let me share the story, the process, and how you can do it too.
The Problem: Slow Responses and High Server Load
Like many backend developers, I faced a common problem:
- API responses were getting noticeably slower.
- The database was being queried repeatedly for the same data.
- Server costs were rising, since every request required fresh computation.
The app worked, but it wasn't scalable. I needed a smarter solution.
The Solution: Redis Caching
Redis is an in-memory data store that's perfect for caching. Instead of hitting the database every single time, I could store frequently accessed data in Redis and serve it directly.
Benefits I was aiming for:
- Speed: faster response times for repeated requests.
- Efficiency: reduced load on the database and server.
- Cost savings: less compute power, fewer resources needed.
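The strategy behind these benefits is usually called cache-aside: check the cache first, fall back to the real source on a miss, then populate the cache for next time. Here's a minimal sketch of the pattern (the helper name `cacheOrFetch` and the Map-backed stand-in cache are illustrative, not part of my project; in production the cache would be the Redis client):

```javascript
// Cache-aside in one function: the cache only needs async get/set,
// so a Map works for demonstration and Redis works in production.
async function cacheOrFetch(cache, key, ttlSeconds, fetchFn) {
  const cached = await cache.get(key);
  if (cached !== undefined && cached !== null) {
    return JSON.parse(cached); // hit: skip the expensive work
  }
  const fresh = await fetchFn(); // miss: do the real query/computation
  await cache.set(key, JSON.stringify(fresh), ttlSeconds);
  return fresh;
}

// Map-backed stand-in for Redis (TTL ignored here, illustration only)
const fakeCache = {
  store: new Map(),
  async get(key) { return this.store.get(key); },
  async set(key, value) { this.store.set(key, value); },
};

// First call runs the loader; the second is served from the cache
(async () => {
  let dbCalls = 0;
  const loadCountries = async () => { dbCalls++; return ["India", "Japan"]; };
  await cacheOrFetch(fakeCache, "countries", 3600, loadCountries);
  await cacheOrFetch(fakeCache, "countries", 3600, loadCountries);
  console.log(dbCalls); // the expensive loader ran only once
})();
```

The nice property is that the caller never cares whether the data came from the cache or the source; the hit/miss logic lives in one place.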
Step-by-Step: How I Implemented Redis in Node.js
Here's how I set it up:
1. Install Redis and a Client Library
First, I installed Redis locally (or you can use Docker, AWS ElastiCache, or Redis Cloud).
Then in Node.js:
npm install redis
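If you'd rather not install Redis system-wide, a Docker container is a quick way to get a local instance (this assumes Docker is installed; the container name is arbitrary):

```shell
# Start a Redis 7 container listening on the default port 6379
docker run -d --name local-redis -p 6379:6379 redis:7

# Quick sanity check: should print PONG
docker exec local-redis redis-cli ping
```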
2. Connect Redis to Node.js
In my project, I created a redis.js file:
const redis = require("redis");

// Defaults to localhost:6379; pass { url: "redis://..." } for a remote instance
const client = redis.createClient();

client.on("error", (err) => console.error("Redis Client Error", err));

// Connect once at startup; the same client is shared across the app
(async () => {
  await client.connect();
})();

module.exports = client;
3. Add Caching to an API Route
In my countries endpoint, I applied caching logic:
const client = require("./redis");

app.get("/countries", async (req, res) => {
  try {
    // Check cache first
    const cacheData = await client.get("countries");
    if (cacheData) {
      console.log("Serving from Redis cache");
      return res.json(JSON.parse(cacheData));
    }

    // Not in cache: fetch from the database
    const data = await db.query("SELECT * FROM countries");

    // Store in Redis with a 1-hour TTL
    await client.setEx("countries", 3600, JSON.stringify(data));

    console.log("Serving fresh from DB");
    res.json(data);
  } catch (err) {
    res.status(500).json({ error: "Something went wrong" });
  }
});
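One caveat with a pure TTL: writes to the countries table stay invisible for up to an hour. A common fix is to delete the key on every write path so the next read rebuilds the cache. A sketch (the `updateCountry` function and its db/cache parameters are hypothetical stand-ins for your own clients; `del` is a real node-redis v4 method):

```javascript
// Invalidate the cached list whenever the underlying data changes,
// so the next GET /countries rebuilds the cache immediately.
async function updateCountry(db, cache, id, name) {
  await db.query("UPDATE countries SET name = $1 WHERE id = $2", [name, id]);
  await cache.del("countries"); // next read falls through to the DB
}
```

Explicit invalidation plus a TTL as a safety net is a reasonable default: the TTL caps staleness even if a write path forgets to invalidate.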
The Result
After this implementation:
- First request: fetches data from the DB and stores it in Redis.
- Subsequent requests: served directly from Redis in milliseconds.
- Database load dropped significantly.
- My EC2 server's CPU usage fell, which also means lower costs.
In short:
- Faster response times
- Reduced server load
- Money saved on resources