How to Speed Up Your Payload CMS Site With unstable_cache
A practical guide to boosting performance in Next.js 15 by caching Payload CMS queries with unstable_cache and automated cache invalidation.

The Challenge I Faced
Working with Payload CMS in a Next.js project, I ran into a performance bottleneck. Every time someone visited the website, I was hitting the database multiple times to fetch the same content: service pages, project galleries, testimonials, and page data. These repeated database calls were slowing down the site.
The problem became noticeable when I had multiple components on a single page each making their own database requests. A homepage might trigger 10 or more separate database queries just to load all the content blocks. While this works fine during development with a local database, it becomes a performance issue in production.
Many developers working with headless CMS solutions face this challenge. You want the flexibility of dynamic content management, but you also need the performance that comes with cached data. The question becomes: how do you cache your CMS queries effectively without serving stale content to your users?
My Approach
I decided to implement a caching strategy using Next.js's unstable_cache function. This approach allows me to cache database queries on the server side while maintaining control over when and how that cache gets invalidated.
My solution involved three key components:
First, I moved all data-fetching functions into dedicated files and wrapped each one with unstable_cache. This gave me server-side caching that persists across requests and serves all users from the same cached data.
Second, I created a systematic approach to cache keys and tags. Every cached function gets a predictable cache key, and related functions share cache tags that allow me to invalidate groups of cached data together.
Third, I connected cache invalidation directly to Payload CMS's lifecycle hooks. Whenever content gets updated in the CMS, the cache automatically clears the relevant cached data, ensuring users always see fresh content.
This approach works well because it improves performance while requiring minimal changes to the existing codebase. I get fast page loads from cached data, but I never serve outdated content to users.
Understanding Cache Keys, Tags, and Key Parts
Before diving into the implementation, it's important to understand how Next.js caching works with these three concepts.
Cache keys are unique identifiers for each cached response. When you call a cached function with specific parameters, Next.js generates a cache key based on those parameters and stores the result. The next time the same function gets called with the same parameters, Next.js returns the cached result instead of executing the function again.
Key parts are additional strings you can provide to make cache keys more specific. While Next.js automatically generates cache keys from function parameters, key parts let you add extra identification. This is useful when you want to ensure certain cached data doesn't accidentally conflict with other cached data.
Cache tags are labels you assign to cached data. Multiple cached responses can share the same tag, and you can invalidate all cached data with a specific tag at once. This is powerful for content management scenarios where updating one piece of content might affect multiple pages or components.
For example, when you update a service in your CMS, you might want to invalidate the cache for the services list, that specific service's detail page, and any pages that display related services. Cache tags make this kind of coordinated cache invalidation straightforward.
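To make these concepts concrete, here is a minimal sketch of how they map onto the unstable_cache API. The function and tag names here are illustrative only, not part of the implementation that follows.
// Minimal sketch: key parts and tags around unstable_cache (illustrative names)
import { unstable_cache, revalidateTag } from 'next/cache'

// Stand-in for any expensive database query
async function queryFeaturedServices() {
  return [{ id: 1, title: 'Example service' }]
}

export const getFeaturedServices = unstable_cache(
  queryFeaturedServices,
  ['featured-services'], // key parts: extra strings mixed into the cache key
  { tags: ['services'] } // tags: labels shared across related cached queries
)

// Calling this from a server action or route handler clears every cached
// entry labelled 'services', regardless of which cache key it was stored under
export async function onServicesChanged() {
  revalidateTag('services')
}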
Cache Invalidation with Payload CMS Hooks
Payload CMS provides lifecycle hooks that fire when content changes. The afterChange hook runs every time a document gets created or updated, and the afterDelete hook runs when a document is removed. These are the perfect places to trigger cache invalidation.
The benefit of this approach is that cache invalidation happens automatically. Content editors can update pages, services, or testimonials through the CMS interface, and the website automatically serves fresh content without any manual cache clearing.
This automatic cache invalidation is crucial for maintaining content accuracy. Without it, you'd need to manually clear caches every time content changes, or you'd risk serving outdated information to your users. By connecting cache invalidation to CMS hooks, you get the performance benefits of caching with the content freshness of direct database queries.
Step-by-Step Implementation
Let's walk through implementing this caching strategy in your own project.
Step 1: Create Cache Key Utilities
First, create a separate file for your cache key functions. This file should not have the "use server" directive since these are utility functions, not server actions.
// src/lib/payload/cache-keys.ts

// Collection-level cache keys
export const SERVICES_TAG = () => "services";
export const TESTIMONIALS_TAG = () => "testimonials";
export const MEDIA_TAG = () => "media";

// Item-specific cache keys
export const SERVICE_BY_ID_TAG = (serviceId: number) => `service-by-id-${serviceId}`;
export const TESTIMONIAL_BY_ID_TAG = (testimonialId: number) => `testimonial-by-id-${testimonialId}`;

// Page-specific cache keys
export const PAGE_BY_SLUG_TAG = (slug?: string[]) => {
  const slugStr = slug && slug.length > 0 ? slug.join('-') : 'home';
  return `page-${slugStr}`;
};
These functions generate consistent cache keys throughout your application. The collection-level tags let you invalidate all services or testimonials at once, while the ID-specific tags let you target individual items.
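For example, these helpers produce predictable strings (values derived directly from the functions above):
// Illustrative output of the cache key helpers
import { SERVICES_TAG, SERVICE_BY_ID_TAG, PAGE_BY_SLUG_TAG } from './cache-keys'

console.log(SERVICES_TAG())                       // "services"
console.log(SERVICE_BY_ID_TAG(42))                // "service-by-id-42"
console.log(PAGE_BY_SLUG_TAG(['about', 'team']))  // "page-about-team"
console.log(PAGE_BY_SLUG_TAG())                   // "page-home"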
Step 2: Wrap Your Data Functions with unstable_cache
Next, update your data-fetching functions to use caching. Import the cache utilities and wrap each function with unstable_cache.
// src/lib/payload/index.ts
"use server"

import { unstable_cache } from 'next/cache'
import { SERVICES_TAG, SERVICE_BY_ID_TAG } from './cache-keys'
// getPayloadClient is the project's helper for obtaining a Payload instance;
// adjust this import path to match where yours lives
import { getPayloadClient } from './get-payload-client'

export const getServices = unstable_cache(
  async (serviceIds: number[]) => {
    const payload = await getPayloadClient()
    const services = await payload.find({
      collection: 'services',
      where: { id: { in: serviceIds } },
    })
    return services.docs
  },
  [SERVICES_TAG()],
  {
    tags: [SERVICES_TAG()],
    revalidate: false
  }
)

export const getService = unstable_cache(
  async (serviceId: number) => {
    const payload = await getPayloadClient()
    const result = await payload.findByID({
      collection: 'services',
      id: serviceId,
    })
    return result
  },
  [SERVICE_BY_ID_TAG(0)], // static key part; the serviceId argument is folded into the cache key automatically
  {
    tags: [SERVICES_TAG()],
    revalidate: false
  }
)
The first parameter is your original async function. The second parameter is an array of key parts that help identify this cached data. The third parameter includes tags for invalidation and revalidation settings.
Setting revalidate: false means the cache will persist until manually invalidated. This is perfect for CMS content that only changes when editors make updates.
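Consuming the cached functions from a server component looks exactly like calling the original functions; the component name and import alias below are illustrative:
// Example usage in a server component (component name and alias are illustrative)
import { getServices } from '@/lib/payload'

export default async function ServicesSection({ serviceIds }: { serviceIds: number[] }) {
  // The first request executes the query; later requests are served from the cache
  const services = await getServices(serviceIds)

  return (
    <ul>
      {services.map((service) => (
        <li key={service.id}>{service.title}</li>
      ))}
    </ul>
  )
}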
Step 3: Create Cache Invalidation Hooks
Now create hooks that will clear your cache when content changes. These hooks use Payload's afterChange and afterDelete lifecycle events.
// src/collections/hooks/revalidateServicesCache.ts
import { revalidateTag } from 'next/cache'
import { CollectionAfterChangeHook, CollectionAfterDeleteHook } from 'payload'
import { SERVICES_TAG, SERVICE_BY_ID_TAG } from '../../lib/payload/cache-keys'

export const revalidateServicesCache: CollectionAfterChangeHook = async ({
  doc,
  previousDoc,
  operation
}) => {
  try {
    // Always revalidate the general services tag
    revalidateTag(SERVICES_TAG())

    // Revalidate the specific service cache
    if (doc.id) {
      revalidateTag(SERVICE_BY_ID_TAG(doc.id))
    }

    // On updates where the slug changed, clear the previous document's tag as well
    // (swap in a slug-based tag here if you also cache services by slug)
    if (operation === 'update' && previousDoc?.slug && previousDoc.slug !== doc.slug) {
      revalidateTag(SERVICE_BY_ID_TAG(previousDoc.id))
    }

    console.log(`Revalidated services cache for: ${doc.title}`)
  } catch (error) {
    console.error('Error revalidating services cache:', error)
  }
}
This hook runs after any service gets created or updated. It clears both the general services cache and the specific service cache, ensuring that any page displaying services will show the updated content.
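Deletions are handled by a companion afterDelete hook in the same file, which Step 4 imports as revalidateServicesCacheDelete. A minimal sketch, reusing the same tags:
// Same file: companion hook for deletions
export const revalidateServicesCacheDelete: CollectionAfterDeleteHook = async ({ doc }) => {
  try {
    revalidateTag(SERVICES_TAG())
    if (doc?.id) {
      revalidateTag(SERVICE_BY_ID_TAG(doc.id))
    }
    console.log(`Revalidated services cache after deleting: ${doc?.title}`)
  } catch (error) {
    console.error('Error revalidating services cache after delete:', error)
  }
}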
Step 4: Connect Hooks to Your Collections
Finally, add these hooks to your Payload collections. Import the hook functions and add them to the hooks configuration.
// src/collections/Services.ts
import type { CollectionConfig } from 'payload';
import { revalidateServicesCache, revalidateServicesCacheDelete } from './hooks/revalidateServicesCache';

export const Services: CollectionConfig = {
  slug: 'services',
  // ... other configuration
  hooks: {
    afterChange: [revalidateServicesCache],
    afterDelete: [revalidateServicesCacheDelete],
  },
  // ... fields and other settings
};
Now whenever someone updates a service through the Payload admin interface, the cache automatically clears and fresh data gets served to your website visitors.
Step 5: Handle Related Content
For more complex scenarios, you might need to invalidate cache for related content. For example, when a service changes, you might also want to clear cache for pages that display that service.
export const revalidateServicesCache: CollectionAfterChangeHook = async ({
  doc,
  operation
}) => {
  try {
    // Revalidate services
    revalidateTag(SERVICES_TAG())
    revalidateTag(SERVICE_BY_ID_TAG(doc.id))

    // Also revalidate service pages that might display this service
    revalidateTag('service-pages')

    // If this service has a dedicated page, revalidate that too
    if (doc.slug) {
      revalidateTag(`service-page-${doc.slug}`)
    }
  } catch (error) {
    console.error('Error revalidating services cache:', error)
  }
}
This ensures that changes cascade properly through your content hierarchy.
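For the cascade to work, the page-level query has to carry the same tags. Here is one way that could look; the getServicePage function, the 'pages' collection, and the tag names are assumptions about how you might structure that query, not code from the project above:
// Sketch: a page-level query tagged so the hook above can invalidate it
export const getServicePage = async (slug: string) =>
  unstable_cache(
    async () => {
      const payload = await getPayloadClient()
      const result = await payload.find({
        collection: 'pages',
        where: { slug: { equals: slug } },
        limit: 1,
      })
      return result.docs[0] ?? null
    },
    ['service-page', slug], // slug in the key parts keeps each page's cache entry separate
    {
      // Both the shared tag and the slug-specific tag, so the hook can clear either level
      tags: ['service-pages', `service-page-${slug}`],
      revalidate: false
    }
  )()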
Results and Benefits
Implementing this caching strategy transformed my website performance. Page load times dropped because most database queries now serve from cache. At the same time, content editors can update information through the CMS and see changes reflected immediately on the website.
The system is also maintainable because all caching logic is centralized and follows consistent patterns. Adding caching to a new collection involves creating cache key functions, wrapping the data functions with unstable_cache, and adding invalidation hooks.
This approach scales well. As your content grows, the performance benefits become more pronounced, while the cache invalidation ensures that users always see accurate information.
Conclusion
Caching Payload CMS queries with unstable_cache provides a good balance between performance and content freshness. The implementation requires some initial setup, but the results are worth the effort.
This pattern works well for any Next.js project using a headless CMS. The key is connecting your cache invalidation directly to your content management lifecycle, ensuring that performance optimizations never come at the cost of content accuracy.
Consider implementing a similar approach in your own projects. Start with your most frequently accessed content, like pages and navigation data, then expand the caching to other collections as you see the benefits. Your users will appreciate the faster load times, and your content editors will appreciate that their updates appear immediately without any additional steps.
Thanks, Matija