In‑Memory Caching in Next.js: Simple Strategies with 1‑Hour TTL

Explore three ways to add a one‑hour in‑memory cache to Next.js—using node‑cache, a lightweight Map wrapper, or built‑in unstable_cache—and get deployment tips.

Why an In‑Memory Cache in Next.js?

When a page or API route needs data that is expensive to fetch (DB query, third‑party API, heavy computation), caching the result can dramatically improve response times and reduce load.

In a Next.js project you can keep cached data in the server’s memory and have it expire automatically after one hour.
Below are three practical approaches, each with its own trade‑offs.
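To make the payoff concrete, here is a toy measurement (illustrative only — slowOperation and the 200 ms delay are stand-ins for a real fetch):

```javascript
// Toy benchmark: the first call pays the full cost, the cached call is ~free.
const cache = new Map();

async function slowOperation() {
  await new Promise((r) => setTimeout(r, 200)); // stand-in for a slow fetch
  return { answer: 42 };
}

async function getData() {
  if (cache.has('k')) return cache.get('k'); // warm path: no delay at all
  const value = await slowOperation();
  cache.set('k', value);
  return value;
}
```

The first call takes the full 200 ms; every subsequent call within the TTL window returns in well under a millisecond.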


1️⃣ The Safe & Feature‑Rich Way – node-cache

When to use it

  • You prefer a battle‑tested library that handles TTL, cleanup, and statistics for you.
  • Your app runs in a long‑lived process (Docker, VPS, dedicated server) – the cache will stay warm between requests.

Installation

npm install node-cache   # or: yarn add node-cache

Create a Singleton Cache

// lib/cache.js
import NodeCache from 'node-cache';

// stdTTL: 3600 seconds = 1 hour
// checkperiod: how often expired keys are purged (10 minutes here)
const cache = new NodeCache({ stdTTL: 3600, checkperiod: 600 });

export default cache;
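One caveat: in development, hot reloading can re-evaluate lib/cache.js and silently replace the instance, emptying the cache on every edit. A small guard keeps one instance per process (a sketch — getCache and the __appCache global key are hypothetical names, not node-cache API):

```javascript
// Reuse a single cache instance per process, surviving dev hot reloads
// by stashing it on globalThis under an app-specific key.
function getCache(factory) {
  if (!globalThis.__appCache) {
    globalThis.__appCache = factory(); // first call creates the instance
  }
  return globalThis.__appCache;        // later calls reuse it
}
```

You would then export getCache(() => new NodeCache({ stdTTL: 3600, checkperiod: 600 })) instead of the bare instance.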

Using the Cache in an API Route (or Server Component)

// app/api/example/route.js
import cache from '@/lib/cache';

export async function GET() {
  const key = 'expensive-data';
  const cached = cache.get(key);

  // node-cache returns undefined on a miss; an explicit check also
  // lets legitimately falsy values be cached
  if (cached !== undefined) {
    console.log('✅ Served from cache');
    return Response.json(cached);
  }

  console.log('🔄 Fetching fresh data');
  const freshData = {
    data: 'Result of a slow operation',
    timestamp: Date.now(),
  };

  // Stored with the default 1‑hour TTL
  cache.set(key, freshData);

  return Response.json(freshData);
}

Pros

  • Zero‑boilerplate TTL handling.
  • Automatic expiration & cleanup.
  • Optional stats (cache.getStats()) for monitoring.

Cons

  • Adds a tiny runtime dependency.
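The get/set pattern in the route above can be factored into a reusable helper. This is a sketch of a hypothetical withCache function; it only assumes a node-cache-style interface (get returns undefined on a miss), so any compatible cache object works:

```javascript
// Wrap an async producer so repeated calls hit the cache first.
// `cache` needs node-cache-style get/set; `key` identifies the entry.
function withCache(cache, key, producer) {
  return async () => {
    const hit = cache.get(key);
    if (hit !== undefined) return hit;  // cache hit: skip the slow path
    const value = await producer();
    cache.set(key, value);              // stored with the cache's default TTL
    return value;
  };
}
```

In the route you could then write const getData = withCache(cache, 'expensive-data', fetchFreshData) and simply return Response.json(await getData()).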

2️⃣ Zero‑Dependency Native Wrapper (Map + Manual TTL)

When to use it

  • You want to avoid external packages.
  • Your caching needs are simple (single key or a handful of keys).

Cache Utility

// lib/simpleCache.js
const globalForCache = globalThis;           // keep cache across HMR in dev
const ONE_HOUR = 60 * 60 * 1000;             // ms

class SimpleCache {
  constructor() {
    if (!globalForCache.simpleCache) {
      globalForCache.simpleCache = new Map();
    }
    this.cache = globalForCache.simpleCache;
  }

  get(key) {
    const entry = this.cache.get(key);
    if (!entry) return null;

    // Expired?
    if (Date.now() > entry.expiry) {
      this.cache.delete(key);
      return null;
    }
    return entry.value;
  }

  set(key, value, ttl = ONE_HOUR) {
    const expiry = Date.now() + ttl;
    this.cache.set(key, { value, expiry });
  }

  // Optional: clear everything manually
  flushAll() {
    this.cache.clear();
  }
}

export const simpleCache = new SimpleCache();

Using It in an API Route

// app/api/example/route.js
import { simpleCache } from '@/lib/simpleCache';

export async function GET() {
  const key = 'my-data';
  let data = simpleCache.get(key);

  if (!data) {
    // Simulate an expensive fetch
    data = await fetch('https://api.example.com/slow')
                .then(r => r.json());

    simpleCache.set(key, data);   // expires in 1 hour by default
  }

  return Response.json(data);
}

Pros

  • No external libraries.
  • Full control over expiration strategy.

Cons

  • Expired entries are only removed lazily when get touches them; without a periodic sweep (node-cache’s checkperiod pattern, omitted above for brevity) stale entries linger in memory.
  • Lacks advanced features like stats or bulk operations.
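The missing cleanup can be added with a periodic sweep that mirrors node-cache’s checkperiod. A sketch — sweepExpired and startSweep are hypothetical helpers operating on the same Map of { value, expiry } entries:

```javascript
// Delete every entry whose expiry timestamp has already passed.
function sweepExpired(cache, now = Date.now()) {
  for (const [key, entry] of cache) {
    if (now > entry.expiry) cache.delete(key);
  }
}

// Run the sweep every `intervalMs` (10 minutes here, like checkperiod: 600).
function startSweep(cache, intervalMs = 10 * 60 * 1000) {
  const timer = setInterval(() => sweepExpired(cache), intervalMs);
  timer.unref?.(); // don't keep the Node process alive just for the sweeper
  return timer;
}
```

Calling startSweep(globalThis.simpleCache) once at module load keeps memory bounded even for keys that are never read again.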

3️⃣ The “Next.js‑Native” Approach – unstable_cache

When to use it

  • You are on Next.js 13+ (App Router) and want to leverage the framework’s built‑in caching layer.
  • The data source is a function you can wrap (e.g., Prisma query, fetch).

Example

import { unstable_cache } from 'next/cache';

// The wrapped data‑fetcher
const getExpensiveData = unstable_cache(
  async () => {
    // Replace with any async operation (DB, external API, etc.)
    return await fetch('https://api.example.com/heavy')
               .then(r => r.json());
  },
  ['expensive-data-key'],   // cache‑key(s) – change to invalidate manually
  {
    revalidate: 3600,       // 1 hour in seconds
  }
);

export async function GET() {
  const data = await getExpensiveData();
  return Response.json(data);
}

How it works

  • unstable_cache stores the resolved value in a server‑side cache.
  • The revalidate option tells Next.js to treat the entry as stale after the given number of seconds, after which a fresh call will repopulate it.
  • Under the hood Next.js may persist the cache to the filesystem or an external store depending on deployment, but during a warm server process it behaves like an in‑memory cache.
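That time-based staleness can be modeled in a few lines of plain JavaScript — a simplified mental model only, not Next.js’s actual implementation (getWithRevalidate and the storedAt field are invented for illustration):

```javascript
// An entry counts as stale once `revalidate` seconds have passed
// since it was stored — the same rule unstable_cache applies.
function isStale(entry, revalidateSeconds, now = Date.now()) {
  return (now - entry.storedAt) / 1000 > revalidateSeconds;
}

async function getWithRevalidate(store, key, fetcher, revalidateSeconds) {
  const entry = store.get(key);
  if (entry && !isStale(entry, revalidateSeconds)) return entry.value;
  const value = await fetcher();                    // repopulate on miss/stale
  store.set(key, { value, storedAt: Date.now() });
  return value;
}
```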

Pros

  • No extra code or third‑party packages.
  • Integrated with Next.js’s rendering pipeline, so duplicate calls with the same key are deduped within a single request.

Cons

  • Still experimental – the unstable_ prefix signals the API may change between releases.
  • Behavior can differ between Vercel’s edge runtime and a traditional Node server.

📦 Serverless vs. Long‑Running Processes – The Big Caveat

Vercel / Netlify / Edge
  • In‑memory cache lifetime: reset each time a function “cold‑starts”; may disappear after a few minutes of inactivity.
  • Data sharing between instances: no – each instance has its own isolated memory.

Docker / VPS / Dedicated server
  • In‑memory cache lifetime: persists as long as the process stays alive (typically days or weeks).
  • Data sharing between instances: yes – all requests hit the same process, so the cache is shared.

If you plan to deploy on a serverless platform, an in‑memory cache can still be useful for per‑instance hot paths, but you should not rely on it for global state. For guaranteed cross‑instance caching, consider an external store (Redis, Memcached, or a CDN edge cache).


✅ Quick Decision Checklist

  • Minimal effort, robust TTL & cleanup → node-cache
  • Zero dependencies, simple use case → native Map wrapper
  • Already on the App Router & want a framework‑native solution → unstable_cache
  • Deploying on serverless & need a cross‑instance cache → external store (Redis, etc.) – not covered here

🚀 Wrap‑Up

Implementing an in‑memory cache that expires after one hour in a Next.js app is straightforward once you understand the runtime environment. By picking the right strategy—whether a battle‑tested library, a lightweight custom wrapper, or the built‑in unstable_cache—you can drastically cut latency for heavy data fetches while keeping your codebase clean and maintainable.

Feel free to adapt the snippets above to fit your own data sources, cache keys, and TTL requirements. Happy caching!
