
Next.js 16 Cache Components: The Feature Nobody Understands Yet
April 10, 2026
Next.js 16 ships a new caching model called Cache Components. It replaces unstable_cache, removes automatic fetch caching, and gives you a single directive to control what gets cached and for how long. If your mental model is still "Next.js caches everything by default," that changed.
This post covers what Cache Components actually do, how to use the three new APIs, and how to migrate existing code without breaking your app.
What Cache Components Change
Before Next.js 16, the framework cached aggressively by default. Every fetch call was cached unless you opted out. unstable_cache wrapped functions in a caching layer with manual key arrays. The behavior surprised people constantly.
Cache Components flip the default. With cacheComponents: true in your config, nothing is cached unless you say so. Data fetching happens at request time. You opt in to caching per-component, per-function, or per-page using the "use cache" directive.
This is the single biggest mental model shift:
- Old: everything cached, opt out with no-store
- New: nothing cached, opt in with "use cache"
- Result: no more surprise stale data in development
To enable it, add one line to next.config.ts:
```ts
const nextConfig = {
  cacheComponents: true,
};

export default nextConfig;
```
That flag tells Next.js to treat all data fetching as dynamic by default. Only code marked with "use cache" gets included in the static shell at build time.
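To see the flip in practice, here is a minimal sketch of two server components under cacheComponents: true. The helpers fetchDeals and fetchHero are hypothetical placeholders for your own data layer.

```tsx
// With cacheComponents enabled, this runs on every request:
// no directive means no caching.
async function LiveDeals() {
  const deals = await fetchDeals(); // hypothetical data helper
  return <DealList deals={deals} />;
}

// Adding the directive opts this component into the static shell.
async function HeroBanner() {
  "use cache";
  const hero = await fetchHero(); // hypothetical data helper
  return <Banner {...hero} />;
}
```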
The "use cache" Directive
The directive works like "use server" or "use client". Drop it at the top of a function, component, or file, and the compiler handles the rest.
Three levels of granularity:
Page-level caching. Cache an entire route.
```tsx
"use cache";

export default async function BlogIndex() {
  const posts = await fetchPosts();
  return <PostList posts={posts} />;
}
```
Component-level caching. Cache one piece of a page while the rest stays dynamic.
```tsx
async function RecentPosts() {
  "use cache";
  const posts = await db.posts.findMany({ take: 5 });
  return (
    <ul>
      {posts.map(p => <li key={p.id}>{p.title}</li>)}
    </ul>
  );
}
```
Function-level caching. Cache a data-fetching function and reuse the result across components.
```ts
async function getCategories() {
  "use cache";
  return db.categories.findMany();
}
```
The compiler analyzes each function's closure and arguments to generate a cache key automatically. You never write key arrays by hand. That was the most error-prone part of unstable_cache, and it is gone.
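Conceptually, the generated key behaves like a memoizer keyed on the serialized arguments: each distinct argument list gets its own cache entry. This plain-TypeScript sketch illustrates the idea only; it is not the actual compiler output.

```typescript
// Illustrative sketch: caching keyed on serialized arguments,
// roughly what the compiler derives for you. Not Next.js internals.
function cacheByArgs<A extends unknown[], R>(fn: (...args: A) => R) {
  const store = new Map<string, R>();
  return (...args: A): R => {
    const key = JSON.stringify(args); // one entry per distinct argument list
    if (!store.has(key)) {
      store.set(key, fn(...args));
    }
    return store.get(key)!;
  };
}

let calls = 0;
const slugify = cacheByArgs((title: string) => {
  calls += 1;
  return title.toLowerCase().replace(/\s+/g, "-");
});

slugify("Cache Components"); // computes and stores
slugify("Cache Components"); // served from the map, fn not called again
console.log(calls); // 1
```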
Quick Reference: Where to Place "use cache"

| Placement | Scope | Typical use |
| --- | --- | --- |
| Top of a page file | The entire route | Fully cacheable pages like a blog index |
| Inside a component | That component's subtree | One cached section on an otherwise dynamic page |
| Inside a data function | The function's return value | Shared data reused across components |
Controlling TTL with cacheLife
The "use cache" directive caches content, but for how long? That is where cacheLife comes in. Import it from next/cache and call it inside any cached function to set the revalidation window.
Next.js ships with built-in profiles:
```ts
import { cacheLife } from "next/cache";

async function getProducts() {
  "use cache";
  cacheLife("hours");
  return db.products.findMany();
}
```
The built-in profiles and their rough TTLs:
- "seconds": revalidates every few seconds
- "minutes": revalidates every few minutes
- "hours": revalidates hourly
- "days": revalidates daily
- "weeks": revalidates weekly
- "max": caches as long as possible
You can define custom profiles in next.config.ts when the built-ins do not match your needs:
```ts
const nextConfig = {
  cacheComponents: true,
  cacheLife: {
    catalog: {
      stale: 3600,
      revalidate: 900,
      expire: 86400,
    },
  },
};
```
Then reference it by name: cacheLife("catalog"). Three numbers control the behavior: stale is how long a cached value can be served without checking for freshness, revalidate is how often the entry refreshes in the background, and expire is the hard upper limit before the entry is evicted and must be regenerated.
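Assuming the catalog profile defined above, a cached function opts into it by name. The getCatalog function and db client here are placeholders:

```ts
import { cacheLife } from "next/cache";

async function getCatalog() {
  "use cache";
  cacheLife("catalog"); // custom profile from next.config.ts
  return db.products.findMany({ where: { active: true } });
}
```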
Tag-Based Invalidation with cacheTag
Revalidating on a timer works for most content. But when a user publishes a blog post or updates a product, you want the cache to clear immediately. That is what cacheTag does.
```ts
import { cacheTag, cacheLife } from "next/cache";

async function getPost(slug: string) {
  "use cache";
  cacheTag(`post-${slug}`);
  cacheLife("days");
  return db.posts.findUnique({ where: { slug } });
}
```
To invalidate, call revalidateTag in a Server Action or API route:
```ts
"use server";

import { revalidateTag } from "next/cache";

export async function publishPost(slug: string) {
  await db.posts.update({ where: { slug }, data: { published: true } });
  revalidateTag(`post-${slug}`);
}
```
Tags work across every cached function that shares the same tag string. One revalidateTag call can flush a post's content, its metadata component, and the blog index all at once.
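Conceptually, tag invalidation maintains a reverse index from each tag to the cache entries carrying it. This plain-TypeScript sketch shows the idea; it is an illustration, not Next.js internals.

```typescript
// Illustrative sketch of tag-based invalidation (not Next.js internals):
// a reverse index maps each tag to the cache keys it can flush.
const cache = new Map<string, unknown>();
const tagIndex = new Map<string, Set<string>>();

function setWithTags(key: string, value: unknown, tags: string[]) {
  cache.set(key, value);
  for (const tag of tags) {
    if (!tagIndex.has(tag)) tagIndex.set(tag, new Set());
    tagIndex.get(tag)!.add(key);
  }
}

function revalidateTagSketch(tag: string) {
  for (const key of tagIndex.get(tag) ?? []) {
    cache.delete(key); // every entry sharing the tag is flushed together
  }
}

setWithTags("post:hello", { title: "Hello" }, ["post-hello", "posts"]);
setWithTags("blog-index", ["hello"], ["posts"]);

revalidateTagSketch("posts"); // one call flushes both entries
console.log(cache.size); // 0
```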

Migrating from unstable_cache
If you are upgrading from Next.js 15, you probably have unstable_cache calls scattered through your codebase. The migration is mechanical.
Before (Next.js 15):
```ts
import { unstable_cache } from "next/cache";

const getCachedUser = unstable_cache(
  async (id: string) => db.users.findUnique({ where: { id } }),
  ["user"],
  { revalidate: 3600, tags: ["users"] }
);
```
After (Next.js 16):
```ts
import { cacheLife, cacheTag } from "next/cache";

async function getCachedUser(id: string) {
  "use cache";
  cacheLife("hours");
  cacheTag("users");
  return db.users.findUnique({ where: { id } });
}
```
The key differences:
- No wrapper function. The directive goes inside the function body.
- No manual key arrays. The compiler generates keys from the function arguments and closure.
- cacheLife replaces the revalidate number. Use a profile name or define a custom one.
- cacheTag replaces the tags array. Call it as a function instead of passing an options object.
Next.js ships a codemod that handles the unstable_ prefix removal automatically:
```bash
npx @next/codemod upgrade
```
That command renames unstable_cacheTag to cacheTag and unstable_cacheLife to cacheLife across your project. You still need to restructure the wrapping pattern by hand, but the rename is free.
How This Connects to Partial Prerendering
Cache Components and Partial Prerendering (PPR) work together. PPR splits a single route into a static shell and dynamic holes.
The mental model:
- At build time, Next.js renders your component tree
- Components with "use cache" become part of the static shell
- Components wrapped in <Suspense> become dynamic holes that stream at request time

The static shell ships instantly from the edge. Dynamic content fills in as it resolves.
A practical example mixing cached and dynamic content on one page:
```tsx
import { Suspense } from "react";

export default async function ProductPage({
  params,
}: {
  params: Promise<{ id: string }>;
}) {
  const { id } = await params; // params is a Promise in Next.js 15+
  return (
    <>
      <ProductDetails id={id} /> {/* cached */}
      <Suspense fallback={<CartSkeleton />}>
        <UserCart /> {/* dynamic, personalized */}
      </Suspense>
    </>
  );
}
```
ProductDetails uses "use cache" and gets baked into the shell. UserCart reads cookies and renders at request time inside the Suspense boundary. The user sees the product instantly while their cart streams in.
This replaces the old pattern of choosing between generateStaticParams for static pages and force-dynamic for everything else. You get both on the same route.
When to Use Each API
Picking the right tool depends on your invalidation needs:
- Time-based content (product catalog, blog index): "use cache" + cacheLife("hours")
- Event-driven content (user publishes, admin updates): "use cache" + cacheTag("entity-id") + revalidateTag in a Server Action
- Fully dynamic content (shopping cart, user dashboard): skip "use cache", wrap in <Suspense>
- Static forever (legal pages, changelogs): "use cache" + cacheLife("max")
The Next.js caching docs go deeper on edge cases. For teams migrating large codebases, start with cacheComponents: true and add "use cache" to your slowest pages first. Measure with Core Web Vitals before and after.
If you are building with React and AI tools, the caching model matters because AI code generators default to React patterns and will start producing "use cache" code soon. Understanding the mental model now saves you from debugging generated code later.
For teams on Windows, the Next.js 16.1 dev server with Turbopack starts up to 87% faster than 16.0. If your dev environment still runs on Webpack, the upgrade pays for itself in iteration speed alone. The pace of framework changes is not slowing down.