How to Update Schema in Production with Payload CMS Without Losing Data
A safe four-step approach to database schema changes using additive migrations and job queues

Changing your database schema in production without losing data or causing downtime is one of those challenges that keeps developers up at night. Last month, I needed to completely restructure how product variants were stored in a live e-commerce system built with Payload CMS. The existing approach was too rigid, but with thousands of products and active customers, I couldn't afford to lose a single data point or take the system offline.
After successfully implementing a four-step migration process that preserved every bit of data while transforming the underlying structure, I'm sharing the exact methodology. This approach works for any significant schema change in Payload CMS where you need to maintain data integrity and zero downtime.
By the end of this guide, you'll understand how to safely evolve any production schema using additive changes, job queues, and systematic verification.
The Schema Challenge: A Real Example
To demonstrate the methodology, let me walk you through the specific scenario I faced. My Payload setup had hardcoded product variant fields that looked like this:
```typescript
// File: src/collections/ProductVariants.ts - The old way
{
  name: 'color',
  type: 'text',
  required: false,
  admin: {
    description: 'Color variant (e.g., Siva boja)',
  },
},
{
  name: 'size',
  type: 'text',
  required: false,
  admin: {
    description: 'Size variant (e.g., 10×10, 20x30x6)',
  },
}
```
This worked fine for basic products, but I kept running into the same issues:
- Adding new variant types required schema changes and code updates
- Each product type needed different hardcoded fields
- Validation logic was scattered and inflexible
- UI components had to be updated for every new variant type
The breaking point came when one client wanted to sell building materials that needed material, thickness, and finish variants, while another wanted clothing with size, color, and fabric. My hardcoded approach was crumbling. The challenge isn't just adding new capabilities; it's evolving the schema in production without losing data.
The Target Schema: Dynamic Structure
The core principle of safe schema evolution is designing your target structure before you begin. I wanted to achieve something like Shopify's variant system:
- Products define what variant types they support
- Individual variants provide values for those types
- The UI adapts automatically to each product's variant structure
- Adding new variant types doesn't require code changes
The target architecture looked like this:
```typescript
// File: Target structure for collections

// Products collection - defines what variant types this product supports
variantOptionTypes: [
  { name: "color", label: "Barva" },
  { name: "size", label: "Velikost" }
]

// ProductVariants collection - provides values for those types
variantOptions: [
  { name: "color", value: "Siva boja" },
  { name: "size", value: "10×10" }
]
```
This structural transformation demonstrates the key challenge: how do you safely migrate from a rigid schema to a flexible one in production?
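Sketched as plain TypeScript types (hypothetical names; these are not Payload's generated types), the target relationship between the two collections looks like this:

```typescript
// Illustrative types only -- the real shapes are generated by Payload.
interface VariantOptionType {
  name: string  // machine name stored on the variant, e.g. "color"
  label: string // label shown in the admin UI, e.g. "Barva"
}

interface VariantOption {
  name: string  // must match a VariantOptionType.name on the parent product
  value: string // the concrete value, e.g. "Siva boja"
}

interface Product {
  title: string
  hasVariants: boolean
  variantOptionTypes: VariantOptionType[]
}

interface ProductVariant {
  product: string // parent product id
  variantOptions: VariantOption[]
}

const tiles: Product = {
  title: 'Tiles',
  hasVariants: true,
  variantOptionTypes: [
    { name: 'color', label: 'Barva' },
    { name: 'size', label: 'Velikost' },
  ],
}

const greyTile: ProductVariant = {
  product: 'tiles-1',
  variantOptions: [
    { name: 'color', value: 'Siva boja' },
    { name: 'size', value: '10×10' },
  ],
}

// Every option name on the variant must be declared by its parent product.
const declared = new Set(tiles.variantOptionTypes.map((t) => t.name))
const valid = greyTile.variantOptions.every((o) => declared.has(o.name))
console.log(valid) // true
```

This two-level split is what lets the UI render variant inputs from variantOptionTypes without knowing the option names in advance.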
The Migration Strategy: Four-Step Safety Net
Safe schema evolution in production requires a methodical approach that provides safety at every step. The strategy I developed follows four phases:
- Add - Introduce new fields alongside existing ones
- Backfill - Populate new structure using Payload jobs
- Verify - Confirm data integrity before proceeding
- Remove - Clean up legacy fields only after verification
This additive approach ensures zero data loss, zero downtime, and rollback capability at each step. Let me show you exactly how it works using the product variant transformation.
Phase 1: Schema Addition (Expanding the Foundation)
Add new fields alongside old ones so the app stays functional while we prepare the backfill.
The first step was adding the new fields to both collections without removing the old ones. This way, the existing system continues working while we prepare the new structure.
Adding Product-Level Variant Types
```typescript
// File: src/collections/Products.ts
{
  name: 'variantOptionTypes',
  type: 'array',
  admin: {
    description: 'Define what variant option types this product supports',
    condition: (data) => data.hasVariants === true,
    position: 'sidebar',
  },
  fields: [
    {
      name: 'name',
      type: 'text',
      required: true,
      admin: {
        placeholder: 'e.g., color, size, material',
        description: 'Database field name (lowercase, no spaces)',
      },
    },
    {
      name: 'label',
      type: 'text',
      required: true,
      admin: {
        placeholder: 'e.g., Barva, Velikost, Material',
        description: 'User-friendly label shown in UI',
      },
    },
  ],
}
```
This new field structure allows products to define their own variant types dynamically. The variantOptionTypes array stores the available option names and labels, while the conditional display keeps the UI clean for products that don't need variants. Crucially, we're not removing the old color and size fields yet - the system continues working normally while we prepare the new structure.
Adding Dynamic Variant Options
```typescript
// File: src/collections/ProductVariants.ts
{
  name: 'variantOptions',
  type: 'array',
  admin: {
    description: 'Variant option values - must match parent product variant types',
  },
  fields: [
    {
      name: 'name',
      type: 'text',
      required: true,
      admin: {
        description: 'Must match a variant option type defined in parent product',
      },
    },
    {
      name: 'value',
      type: 'text',
      required: true,
      admin: {
        description: 'The specific value for this variant (e.g., "Siva boja", "10×10")',
      },
    },
  ],
}
```
After adding these fields, I ran the Payload migration:
```shell
pnpm payload migrate:create
pnpm payload migrate
```
The variantOptions array replaces hardcoded fields with a flexible structure where each option has a name and a value, mirroring the structure defined in the parent product's variantOptionTypes. Running the migrations creates the new database columns while preserving all existing data in the legacy color and size columns.
At this point, both old and new structures exist simultaneously - the application continues functioning normally while the new fields are ready for population.
Phase 2: Data Backfill via Payload Job Queue
Run a single job that reads the old fields and writes the new ones, using full Payload context and validation.
Here's where it gets interesting. Instead of writing raw SQL migrations, I used Payload's job queue system. This approach has several advantages:
✅ Full Payload API access - hooks, validation, and logging all work
✅ Error handling - failed jobs can be retried
✅ Monitoring - built-in logging and progress tracking
✅ Manual triggering - run when you're ready, not automatically
Creating the Migration Handler
```typescript
// File: src/queues/handlers/migrateVariantOptionsHandler.ts
const migrateVariantOptionsHandler = async ({ req }: { req: any }) => {
  req.payload.logger.info(`=== VARIANT OPTIONS MIGRATION STARTED ===`)

  const stats = {
    productsProcessed: 0,
    variantsProcessed: 0,
    errors: [] as string[],
  }

  try {
    // Step 1: Get all products that have variants
    const productsWithVariants = await req.payload.find({
      collection: 'products',
      where: { hasVariants: { equals: true } },
      limit: 1000,
    })

    // Step 2: Process each product
    for (const product of productsWithVariants.docs) {
      try {
        // Get variants for this product
        const variants = await req.payload.find({
          collection: 'product-variants',
          where: { product: { equals: product.id } },
          limit: 1000,
        })

        // Determine what variant option types this product needs
        const hasColors = variants.docs.some((v) => v.color && v.color.trim())
        const hasSizes = variants.docs.some((v) => v.size && v.size.trim())

        const variantOptionTypes = []
        if (hasColors) variantOptionTypes.push({ name: 'color', label: 'Barva' })
        if (hasSizes) variantOptionTypes.push({ name: 'size', label: 'Velikost' })

        // Update product with variant option types
        if (variantOptionTypes.length > 0) {
          await req.payload.update({
            collection: 'products',
            id: product.id,
            data: { variantOptionTypes },
          })
        }

        // Step 3: Update each variant
        for (const variant of variants.docs) {
          const variantOptions = []
          if (variant.color && variant.color.trim()) {
            variantOptions.push({ name: 'color', value: variant.color })
          }
          if (variant.size && variant.size.trim()) {
            variantOptions.push({ name: 'size', value: variant.size })
          }

          if (variantOptions.length > 0) {
            await req.payload.update({
              collection: 'product-variants',
              id: variant.id,
              data: { variantOptions },
            })
            stats.variantsProcessed++
          }
        }

        stats.productsProcessed++
      } catch (error) {
        stats.errors.push(`Failed to process product ${product.title}: ${error.message}`)
      }
    }

    return {
      output: {
        success: true,
        message: 'Migration completed successfully',
        stats,
      },
    }
  } catch (error) {
    return {
      output: {
        success: false,
        message: `Migration failed: ${error.message}`,
        stats,
      },
    }
  }
}

export default migrateVariantOptionsHandler
```
The backfill job analyzes existing data to determine what variant types each product needs, then creates the appropriate structure. Products with only colors get [{ name: 'color', label: 'Barva' }], while products with both get both types.
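The type-detection step boils down to a small pure function, which makes it easy to unit-test before running the job against production data. A sketch (hypothetical function name; the labels are hardcoded exactly as in the job):

```typescript
type LegacyVariant = { color?: string | null; size?: string | null }
type OptionType = { name: string; label: string }

// Mirrors the job's detection logic: a product needs an option type
// only if at least one of its variants has a non-empty legacy value.
function deriveVariantOptionTypes(variants: LegacyVariant[]): OptionType[] {
  const types: OptionType[] = []
  if (variants.some((v) => v.color?.trim())) {
    types.push({ name: 'color', label: 'Barva' })
  }
  if (variants.some((v) => v.size?.trim())) {
    types.push({ name: 'size', label: 'Velikost' })
  }
  return types
}

console.log(deriveVariantOptionTypes([{ color: 'Siva boja' }, { size: '10×10' }]))
// → [{ name: "color", label: "Barva" }, { name: "size", label: "Velikost" }]

console.log(deriveVariantOptionTypes([{ color: '  ' }]))
// → [] (whitespace-only values are ignored, matching the trim() checks)
```

Keeping this logic pure means you can verify edge cases (empty strings, whitespace, missing fields) without touching the database.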
Registering the Migration Job
```typescript
// File: payload.config.ts
import migrateVariantOptionsHandler from '@/queues/handlers/migrateVariantOptionsHandler'

export default buildConfig({
  jobs: {
    tasks: [
      // ... other tasks
      {
        slug: 'migrateVariantOptions',
        label: 'Migrate Variant Options from color/size to flexible structure',
        inputSchema: [],
        handler: migrateVariantOptionsHandler,
      },
    ],
  },
})
```
Creating an Admin UI Trigger
To make the migration easy to run, I created a component that appears in the Products admin:
```tsx
// File: src/components/admin/MigrateVariantOptions.tsx
'use client'
import { Button } from '@payloadcms/ui'
import React, { useActionState } from 'react'

// triggerVariantOptionsMigration (a server action that queues the job)
// and initialState are defined elsewhere in the project.
export default function MigrateVariantOptions() {
  const [migrationState, migrationAction] = useActionState(
    triggerVariantOptionsMigration,
    initialState,
  )

  return (
    <div className="p-4 border rounded-lg bg-yellow-50 mb-4">
      <h3 className="text-lg font-semibold mb-2">⚠️ Migration Required</h3>
      <p className="text-sm mb-4">
        Migrate existing color/size variant data to the new flexible structure.
        <strong> Run this once after adding the new schema fields.</strong>
      </p>
      <form action={migrationAction}>
        <Button type="submit">Migrate Variant Options</Button>
      </form>
      {migrationState.message && (
        <div
          className={`mt-4 p-3 rounded ${
            migrationState.status === 'success'
              ? 'bg-green-100 text-green-800'
              : 'bg-red-100 text-red-800'
          }`}
        >
          {migrationState.message}
        </div>
      )}
    </div>
  )
}
```
The admin component provides a simple interface for triggering the migration job with visual feedback about the progress. Registering the job in the Payload config makes it available through the job queue system, while the admin component placement ensures it's visible when managing products.
```typescript
// File: src/collections/Products.ts
admin: {
  components: {
    beforeList: ['/components/admin/MigrateVariantOptions'],
  },
}
```
This setup leverages Payload's job queue system for reliable data processing. The job runs with full Payload context, ensuring all hooks and validation rules are applied during the migration. Once completed, the new variantOptions and variantOptionTypes fields contain the transformed data while the original fields remain unchanged.
Phase 3: Manual Verification (Trust, but Verify)
Confirm the backfill is correct by spot-checking the new arrays alongside old values.
Before proceeding, I needed to verify the migration worked correctly. I used SQL queries to check the database directly:
```sql
-- Check products have variantOptionTypes
SELECT title, "variantOptionTypes" FROM products WHERE "hasVariants" = true;

-- Check variants have variantOptions
SELECT "variantSku", "variantOptions", color, size FROM "product-variants" LIMIT 10;
```
The results showed that:
- Products now had variantOptionTypes arrays like [{"name":"color","label":"Barva"},{"name":"size","label":"Velikost"}]
- Variants had variantOptions arrays like [{"name":"color","value":"Siva boja"},{"name":"size","value":"10×10"}]
- Old color and size values were preserved alongside the new structure
⚠️ Common issue I encountered: Some variants referenced deleted products, causing "Product not found" errors. The migration continued successfully despite these warnings, as the error handling isolated each product's processing.
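The spot-checks can also be automated. Here's a sketch of a pure consistency check (hypothetical function and field names, mirroring the SQL above) that flags any variant whose new array disagrees with its legacy columns:

```typescript
type TransitionVariant = {
  variantSku: string
  color?: string | null
  size?: string | null
  variantOptions?: { name: string; value: string }[]
}

// Returns the SKUs of variants whose variantOptions do not reproduce
// the legacy color/size values. An empty result means the backfill is safe.
function findMismatches(variants: TransitionVariant[]): string[] {
  return variants
    .filter((v) => {
      const byName = new Map(
        (v.variantOptions ?? []).map((o) => [o.name, o.value] as [string, string]),
      )
      // A legacy value only needs a match if it is non-empty.
      const colorOk = !v.color?.trim() || byName.get('color') === v.color
      const sizeOk = !v.size?.trim() || byName.get('size') === v.size
      return !(colorOk && sizeOk)
    })
    .map((v) => v.variantSku)
}

const report = findMismatches([
  {
    variantSku: 'OK-1',
    color: 'Siva boja',
    variantOptions: [{ name: 'color', value: 'Siva boja' }],
  },
  { variantSku: 'BAD-1', color: 'Siva boja', variantOptions: [] },
])
console.log(report) // ["BAD-1"]
```

Running a check like this over the full dataset (fetched via the Payload API or a SQL export) gives you a concrete go/no-go signal before Phase 4.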
Phase 4: Schema Cleanup (Removing the Old)
Remove legacy fields only after verifying the new structure is correct.
Once I verified the data migration worked, it was time to remove the old fields. This required another database migration:
```typescript
// File: src/collections/ProductVariants.ts - Remove old fields
// Delete these field definitions:
{
  name: 'color',
  type: 'text',
  // ... removed
},
{
  name: 'size',
  type: 'text',
  // ... removed
}
```
I also updated the admin columns to remove references to the old fields:
```typescript
admin: {
  defaultColumns: ['displayName', 'variantSku', 'inStock'], // removed 'color', 'size'
}
```
After running pnpm payload migrate:create and pnpm payload migrate, the old columns were dropped from the database.
Phase 5: Logic Updates (Making It Work)
Update server-side logic to read from the new fields; aim for minimal surface changes.
With the schema updated, I needed to update the application logic to use the new dynamic structure.
Updating Display Name Generation
I replaced the hardcoded field access with dynamic logic that reads from variantOptions:

```typescript
// New way - dynamic from variantOptions
// ("parts" is the array of name segments built earlier in the same hook)
if (data.variantOptions && Array.isArray(data.variantOptions)) {
  data.variantOptions.forEach((option) => {
    if (option.value && option.value.trim()) {
      parts.push(option.value)
    }
  })
}
```
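Extracted into a standalone, testable function, the same logic looks like this (the function name, separator, and fallback here are assumptions for illustration, not the hook's exact code):

```typescript
type VariantOption = { name: string; value: string }

// Joins non-empty option values into a display name.
// Hypothetical helper: separator and fallback are assumed, not from the hook.
function buildDisplayName(
  baseTitle: string,
  variantOptions?: VariantOption[],
): string {
  const parts = (variantOptions ?? [])
    .map((o) => o.value?.trim())
    .filter((v): v is string => Boolean(v))
  return parts.length > 0 ? `${baseTitle} - ${parts.join(' / ')}` : baseTitle
}

console.log(
  buildDisplayName('Tile', [
    { name: 'color', value: 'Siva boja' },
    { name: 'size', value: '10×10' },
  ]),
) // "Tile - Siva boja / 10×10"

console.log(buildDisplayName('Tile')) // "Tile" (no options, falls back to title)
```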
Adding Validation
I added validation to ensure variant options match their parent product's defined types:
```typescript
// Validate variant options against parent product's variant option types
const allowedOptionNames = product.variantOptionTypes?.map((t) => t.name) || []

for (const option of data.variantOptions) {
  if (!allowedOptionNames.includes(option.name)) {
    throw new Error(
      `Invalid variant option "${option.name}". Allowed options: ${allowedOptionNames.join(', ')}`,
    )
  }
}
```
Utility functions now read from variantOptions instead of the direct fields; their signatures stay the same, so the UI continues working.
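For example, a lookup helper of this shape (hypothetical name; the real utilities in the project differ) keeps old call sites intact while reading the new array:

```typescript
type VariantOption = { name: string; value: string }
type Variant = { variantOptions?: VariantOption[] }

// Before the migration this would have been: (variant) => variant.color
// Same call sites, new implementation underneath.
function getOptionValue(variant: Variant, optionName: string): string | undefined {
  return variant.variantOptions?.find((o) => o.name === optionName)?.value
}

const v: Variant = {
  variantOptions: [
    { name: 'color', value: 'Siva boja' },
    { name: 'size', value: '10×10' },
  ],
}

console.log(getOptionValue(v, 'color')) // "Siva boja"
console.log(getOptionValue(v, 'fabric')) // undefined (option not present)
```

Because consumers only see the return value, swapping the implementation is invisible to the UI layer.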
The Results: A Flexible, Future-Proof System
After completing all phases, I had successfully transformed the rigid, hardcoded variant system into a flexible, dynamic one:
Before vs After
Before:
```typescript
// Fixed schema
color: "Siva boja"
size: "10×10"

// New product type = schema changes + code updates
```
After:
```typescript
// Product defines structure
variantOptionTypes: [
  { name: "material", label: "Material" },
  { name: "thickness", label: "Debljina" },
  { name: "finish", label: "Završnica" }
]

// Variant provides values
variantOptions: [
  { name: "material", value: "Čelik" },
  { name: "thickness", value: "5mm" },
  { name: "finish", value: "Polirano" }
]
```
Real-World Benefits
- Zero data loss - every existing variant preserved during transformation
- Zero downtime - system remained functional throughout migration
- Future-proof - new product types require no code changes
Key Takeaways: The 4-Step Migration Pattern
This process demonstrated a reliable pattern for production schema evolution:
Add → Backfill Job → Verify → Remove
- Add new fields alongside old ones (system stays functional)
- Run Payload job to backfill new structure from legacy data
- Verify migration with manual SQL checks before proceeding
- Remove legacy fields only after confirming new structure works
This additive approach provides rollback capability and confidence at each step.
Wrapping Up: Schema Evolution Without Fear
Transforming production schemas doesn't have to be a white-knuckle experience. The four-step migration pattern I've shown you - Add, Backfill, Verify, Remove - provides a safety net that eliminates the risk of data loss while maintaining zero downtime.
In this guide, we took a rigid product variant system and transformed it into a flexible, dynamic structure that can handle any product type without code changes. More importantly, we did it safely using Payload's job queue system and additive schema changes.
You now have a proven methodology for evolving any Payload CMS schema in production. Whether you're restructuring user profiles, transforming content relationships, or redesigning data models, this pattern gives you the confidence to make significant changes without breaking your live system.
The key insight is using your framework's tools - Payload jobs, migrations, and validation - rather than fighting against them with raw SQL. Your future self will thank you for taking the methodical approach.
Let me know in the comments if you have questions about implementing this pattern in your own projects, and subscribe for more practical development guides.
Thanks, Matija