Caching in NodeJS with Redis

Performance is critical in modern applications. Your application often has to make requests to an endpoint or server to get some data, and it may request the same dataset frequently.

Depending on the size of your dataset, along with other factors like database query speed and network latency, the speed at which your application can fetch data to display to users may degrade over time as the data grows.

This is where caching comes in handy and can dramatically improve your application’s speed. In this tutorial, we will look at how to implement Redis caching in a node application (API) to improve the speed at which we are able to serve data to client apps. Let’s dive in!

What is Caching

Before we dive into creating our own Redis cache, we first have to answer the question: what is caching?
In computing, a cache is a high-speed data storage layer that stores a subset of data, typically transient (existing for a short period of time) in nature, so that future requests for that data are served faster than is possible by accessing the data’s primary storage location. Caching allows you to efficiently reuse previously retrieved or computed data.

Here’s a scenario to help you think about how caching works. Imagine you’re watching your favorite sport (soccer/football for me), or the news, or a movie.

You’re also a big fan of potato chips, so you decide that every 15 minutes you’ll go to the kitchen to get some chips to eat.

You then notice that going to the kitchen every 15 minutes is time-consuming, not to mention that you miss a minute or two of whatever you’re watching.

So instead of repeatedly making trips to the kitchen, you decide to fill a large bowl with chips and have it right next to you while watching the TV.

Now you can get your chips much faster and you don’t need to go back to the kitchen unless your bowl is empty or you want a different kind of chips/snack. That bowl of chips is your cache.

In the world of IT, caching is similar. Data is stored somewhere (the bowl) where it can be accessed fast without having to go to the original source (the kitchen) unless the data needed is not inside the bowl.
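The bowl-and-kitchen analogy maps directly onto what is often called the cache-aside pattern. Here’s a minimal sketch in plain JavaScript (no Redis yet; `fetchFromSource` is a hypothetical stand-in for the slow original source):

```javascript
// A minimal cache-aside sketch: check the cache first,
// and fall back to the original source only on a miss.
const cache = new Map();

function fetchFromSource(key) {
  // Stand-in for a slow database or network call (the "kitchen").
  return `value-for-${key}`;
}

function getWithCache(key) {
  if (cache.has(key)) {
    // The "bowl": data is already within arm's reach.
    return { value: cache.get(key), fromCache: true };
  }
  const value = fetchFromSource(key); // trip to the "kitchen"
  cache.set(key, value);              // fill the "bowl" for next time
  return { value, fromCache: false };
}
```

The first lookup for a key goes to the source; every lookup after that is served straight from memory.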

What is Redis

Seeing that we will be using Redis to implement caching inside our node application, it makes sense to first discuss what Redis is. Redis is an in-memory key-value database. Yes, you read that right: Redis stores data in memory (RAM).
Reading and writing to RAM is orders of magnitude faster than reading from a disk drive, which makes Redis perfect for caching.

Benefits of caching

  • Redis uses RAM as its storage (as discussed above), which is much faster than disk storage, so reading from the cache is extremely fast. Being able to read data at higher rates significantly improves application performance.
  • Caching frequently requested data reduces the number of database queries needed to retrieve that data.
  • Following on from the previous benefit, if we are making fewer database queries, or even fewer network requests to fetch external resources, then our application will have lower latency.
  • Your application can scale better, since you can cache the data that is requested most frequently as more people use your application.

Node App

Now that we understand what caching is and have had an introduction to Redis, we will create a node application that uses caching through Redis.
Our application will be a simple e-commerce API server that allows users to fetch a list of products. Let’s start coding!

  • Create a folder named node-redis-cache (or whatever you like)
  • Open the folder in your text editor (I use VS Code)

We’ll be using a few npm packages in our app:

  • express – handle routing in our app
  • redis – use redis commands in node
  • axios – to make API calls

Open your terminal inside the project folder (node-redis-cache) and run the following command to install the needed packages:

npm install express redis axios

The command above installs the express, redis, and axios packages.

Create Server

Now let’s finally write some code. We’ll first create our express server. Create a file named index.js.

Add the following imports to index.js

const express = require('express'); 
const app = express();
const axios = require('axios');
const PORT = 9000;

const redis = require("redis");
const cacheClient = redis.createClient(); // redis client used to interact with redis database

app.listen(PORT, () => console.log(`listening on port ${PORT}`));

We’ve created our server and set it to listen on port 9000. We’ve also required the redis and axios packages, which we’ll use later on.


Now we’ll add a route that returns a list of products to the user. Add the following to index.js

app.get('/products', async (req, res) => {
  const { data } = await axios.get(''); // This is a real API ;)
  return res.send(data);
});

Here we’ve created a route handler for /products that will return a list of products. We are making a request to an external API to get these products.

Let’s assume that this external API also makes a database request to get this list of products. As you can see, when a user requests the list of available products, it may take a while for them to get a response.

API Speed (without cache)

Let’s test our endpoint using postman (or your favorite API testing tool). This will show us the speed performance of our application without caching.

Without caching implemented our API request takes 1540 milliseconds (or 1.54 seconds) to be processed. Now let’s add caching to our route handler.

Adding caching

Update the /products route handler to the following.

app.get('/products', async (req, res) => {
    const TIME_TO_LIVE = 1800; // 30 minutes, in seconds

    cacheClient.get("products", async (err, cachedProducts) => {
        if (cachedProducts) {
            // Cache hit: serve the stored copy without calling the external API
            return res.send(JSON.parse(cachedProducts));
        } else {
            // Cache miss: fetch from the external API, then cache the result
            const { data } = await axios.get('');
            cacheClient.setex("products", TIME_TO_LIVE, JSON.stringify(data));
            return res.send(data);
        }
    });
});

Here, we’ve changed how our /products route handler operates. When we get a request to fetch products, we first check whether that data is already available in the cache.

If the cached data is available then we return that to the user. If there’s no cached data available, we first make a call to the external API for the data. Then we cache the newly fetched data.

The setex() method allows us to set a Time To Live (TTL) for the cached data. This means that after the specified amount of time (in seconds) the cached data will be deleted. Finally, we return the data to the user.
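The TTL behavior can be sketched with a plain in-memory store (illustrative only; in the real setup, Redis handles expiry itself, and the function names below simply mirror the Redis commands):

```javascript
// Sketch of setex-style TTL semantics using timestamps.
const store = new Map();

function setex(key, ttlSeconds, value) {
  // Record when this entry should stop being served.
  store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
}

function get(key) {
  const entry = store.get(key);
  if (!entry) return null;
  if (Date.now() > entry.expiresAt) {
    store.delete(key); // lazily evict entries past their TTL
    return null;
  }
  return entry.value;
}
```

A fresh entry is returned normally; once its TTL has passed, the same lookup behaves as a cache miss.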

API Speed (with cache)

This will show us the speed performance of our application with caching implemented. Let’s make a call to the API endpoint.

Whoa, wait, that’s not much faster! Why is that? On the first request there’s no data in the cache, so we still have to call the external API, which takes some time. The fetched data is then cached and is available on subsequent requests. So, let’s make another request.

Now, look at that! Our request took only 2 milliseconds (or 0.002 seconds). That’s fast, very fast compared to when we don’t have cache implemented.
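This cold-versus-warm difference is easy to reproduce in isolation. Below is an illustrative sketch (not the server above): a 100 ms delay stands in for the external API call, and an in-memory Map stands in for Redis:

```javascript
// Simulate uncached vs cached response times with a slow fake source.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const cache = new Map();

async function fetchProducts() {
  await sleep(100); // stand-in for the slow external API / database call
  return [{ id: 1, name: 'widget' }];
}

async function getProducts() {
  if (cache.has('products')) return cache.get('products'); // cache hit
  const data = await fetchProducts();                       // cache miss
  cache.set('products', data);
  return data;
}

async function timed(fn) {
  const start = Date.now();
  await fn();
  return Date.now() - start;
}
```

The first call pays the full simulated latency; the second returns in roughly a millisecond, mirroring the 1540 ms vs 2 ms measurements above.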

As you can see, caching can significantly improve your application’s performance. There’s a lot more to Redis and to caching in general than we’ve covered here, and it’s well worth exploring further.

If you found this helpful leave a comment below and share it with devs who will find it useful. Until next time think, learn, create, repeat.
