Build Payload CMS Custom Array Fields: Spreadsheet UI
Step-by-step guide to build spreadsheet-style Payload CMS array fields with CSV import/export and debounced inputs

Introduction
I spent three hours one Friday afternoon wrestling with Payload CMS's form system, trying to build a product comparison table for an e-commerce site. The feature required managing 24 specifications across 7 product variants—168 data points that users needed to edit and compare. Payload's default nested array UI showed each variant in a collapsed accordion, making it impossible to see more than one specification at a time. Comparing values across products meant clicking through endless accordions and memorizing data.
After struggling with persistence bugs, performance issues, and Payload's underdocumented form hooks, I finally cracked it. The solution was a custom table-based field component that replaces Payload's array UI with a spreadsheet-like interface, complete with CSV import/export—all client-side, no server hooks or blob storage needed.
This guide shows you exactly how to build production-grade custom array field components in Payload CMS. You'll learn the patterns that work, the gotchas to avoid, and how to deeply integrate with Payload's form system for features like validation, dirty state tracking, and auto-save.
The Problem: Payload's Array UI Doesn't Work for Tabular Data
Let's say you're building a product specification comparison feature. You have:
- Specification rows: Array of spec names (e.g., "Pixel Pitch", "Brightness", "Weight")
- Product variants: Array of products, each with values for every specification
Here's what the data structure looks like:
{
rows: [
{ id: "1", name: "Pixel Pitch" },
{ id: "2", name: "Brightness" },
{ id: "3", name: "Weight" }
],
variants: [
{
id: "v1",
title: "Model FS-5",
sku: "FS5",
variantData: [
{ id: "d1", value: "5mm" },
{ id: "d2", value: "5000 nits" },
{ id: "d3", value: "28kg" }
]
},
// More variants...
]
}
Payload's default UI shows this as:
- An accordion for each variant
- Inside each accordion, another nested accordion for variantData
- You have to click through multiple levels to see a single value
Specific UX problems:
- Can't see all data at once - Only one accordion is expanded at a time
- Can't compare across variants - Values are buried in different accordions
- No visual relationship - The connection between spec rows and variant values is invisible
- Manual entry is painful - Entering 168 data points one-by-one in nested accordions is tedious
What we need instead: A horizontal table where specifications are columns, variants are rows, and every cell is editable inline. Think Google Sheets, not nested forms.
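The target data shape can be captured in a few TypeScript interfaces. These type names are illustrative (Payload generates its own types from your config); the point is the invariant that each variant's variantData stays the same length as rows:

```typescript
// Illustrative types for the comparison grid (hypothetical names,
// not Payload-generated types).
interface SpecRow {
  id: string
  name: string // e.g. "Pixel Pitch"
}

interface VariantDatum {
  id: string
  value: string // value for the spec row at the same index
}

interface Variant {
  id: string
  title: string
  sku: string
  variantData: VariantDatum[] // must stay the same length as rows
}

interface ComparisonBlock {
  rows: SpecRow[]
  variants: Variant[]
}

// A well-formed block keeps rows and each variantData in lockstep.
const isConsistent = (block: ComparisonBlock): boolean =>
  block.variants.every((v) => v.variantData.length === block.rows.length)
```

Keeping that invariant true is exactly what the auto-sync mechanism later in this guide is for.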
Understanding Payload's Custom Field Components
Before we build anything, you need to understand how Payload lets you replace its default UI.
The admin.components.Field Pattern
In any Payload field definition, you can specify a custom React component to render instead of the default UI:
{
name: 'myArrayField',
type: 'array',
admin: {
components: {
Field: '/src/components/payload/custom/MyCustomField'
}
},
fields: [
// Field schema still required for data structure
]
}
Important: The fields array is still required. It defines the data schema. Your custom component only controls the UI.
Payload's Form Hooks: The Foundation
Custom field components use three main hooks from @payloadcms/ui:
1. useField()
Binds a single input to Payload's form state. This is how you get validation, dirty state tracking, and persistence.
import { useField } from '@payloadcms/ui'
const MyInput = ({ path }: { path: string }) => {
const { value, setValue } = useField<string>({ path })
return (
<input
value={value || ''}
onChange={(e) => setValue(e.target.value)}
/>
)
}
Critical: Every input in your custom component must use useField() if you want changes to save. This is the most common mistake.
2. useForm()
Provides actions for manipulating arrays and accessing global form state.
import { useForm } from '@payloadcms/ui'
const MyComponent = () => {
const { addFieldRow, removeFieldRow } = useForm()
const handleAdd = () => {
addFieldRow({
path: 'myArray',
schemaPath: 'myArray'
})
}
}
3. useFormFields()
Reads values from other fields in the form. Use this when your component depends on sibling fields.
import { useFormFields } from '@payloadcms/ui'
const MyComponent = () => {
// Good: Only re-renders when this specific field changes
const rows = useFormFields(([fields]) => fields['layout.0.rows'])
// Bad: Re-renders on EVERY form change
const allFields = useFormFields(([fields]) => fields)
}
Performance tip: Always use a selector function to avoid re-rendering on every keystroke in the entire form.
TypeScript Types
Your component should use the ArrayFieldClientComponent type:
import type { ArrayFieldClientComponent } from 'payload'
const MyCustomField: ArrayFieldClientComponent = (props) => {
const { field, path } = props
// Your component logic
}
export default MyCustomField
Building the Simple Version: RowsField
Let's start by building a single-column table for managing specification names. This establishes the core patterns we'll reuse for more complex tables.
Component Setup
Create src/components/payload/custom/RowsField.tsx:
'use client'
import React, { useEffect, useMemo, useState, useCallback, useRef } from 'react'
import type { ArrayFieldClientComponent } from 'payload'
import { useFormFields, useForm, useField } from '@payloadcms/ui'
import { Button } from '@payloadcms/ui'
import { FieldLabel } from '@payloadcms/ui/fields/FieldLabel'
const RowsField: ArrayFieldClientComponent = (props) => {
const { field, path } = props
return (
<div className="field-type array">
<FieldLabel
label={field.label || 'Rows'}
required={field.required}
/>
{/* Table UI will go here */}
</div>
)
}
export default RowsField
Data Binding: Reading the Array
Payload stores array data in two formats depending on context:
- Array format: variants = [{ id: '1', title: 'Foo' }]
- Flattened format: variants.0.id, variants.0.title, etc.
Your component needs to handle both:
const RowsField: ArrayFieldClientComponent = (props) => {
const { field, path } = props
const { setValue } = useField({ path })
// Get all form fields
const allFields = useFormFields(([fields]) => fields)
// Get the field value (could be array or number representing length)
const rowsField = allFields[path]
const rowsValue = rowsField?.value
// Reconstruct rows from either format
const rows = useMemo(() => {
if (Array.isArray(rowsValue)) {
return rowsValue // Already an array
}
// Flattened format: reconstruct from individual fields
const rowsLength = typeof rowsValue === 'number' ? rowsValue : 0
const reconstructed = []
for (let i = 0; i < rowsLength; i++) {
const idField = allFields[`${path}.${i}.id`]
const nameField = allFields[`${path}.${i}.name`]
if (idField || nameField) {
reconstructed.push({
id: String(idField?.value || ''),
name: String(nameField?.value || ''),
})
}
}
return reconstructed
}, [rowsValue, allFields, path])
return <div>{/* Use rows array here */}</div>
}
Why this is necessary: Payload's internal form state may represent your array differently based on whether it's being edited, validated, or saved. This dual-mode reading ensures your component always gets the data.
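The dual-mode read can be factored out of the component into a pure helper, which makes it easy to unit-test in isolation. This is a sketch under the assumption that form-state entries look like `{ value: ... }` keyed by dotted path, as the component above relies on; the helper name is hypothetical:

```typescript
// Minimal shape of Payload's form-state entries, as used above.
type FormFields = Record<string, { value?: unknown } | undefined>

interface SpecRow {
  id: string
  name: string
}

// Reads a rows array that may be stored either as a real array or as a
// length number plus flattened `${path}.${i}.${key}` entries.
function readRows(allFields: FormFields, path: string): SpecRow[] {
  const value = allFields[path]?.value
  if (Array.isArray(value)) return value as SpecRow[]

  const length = typeof value === 'number' ? value : 0
  const rows: SpecRow[] = []
  for (let i = 0; i < length; i++) {
    const id = allFields[`${path}.${i}.id`]?.value
    const name = allFields[`${path}.${i}.name`]?.value
    if (id !== undefined || name !== undefined) {
      rows.push({ id: String(id ?? ''), name: String(name ?? '') })
    }
  }
  return rows
}
```

Inside the component, the useMemo body then reduces to a single `readRows(allFields, path)` call.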
Add/Remove Operations
Use addFieldRow() and removeFieldRow() from useForm():
// Helper to generate unique IDs
function generateId(): string {
const timestamp = Math.floor(Date.now() / 1000).toString(16).padStart(8, '0')
const random = Array.from({ length: 16 }, () =>
Math.floor(Math.random() * 16).toString(16)
).join('')
return timestamp + random
}
const RowsField: ArrayFieldClientComponent = (props) => {
const { field, path } = props
const { addFieldRow, removeFieldRow } = useForm()
const handleAddRow = () => {
addFieldRow({
path,
schemaPath: path // Important: tells Payload to use schema defaults
})
}
const handleRemoveRow = (rowIndex: number) => {
removeFieldRow({ path, rowIndex })
}
return (
<div>
<Button onClick={handleAddRow}>Add Row</Button>
{rows.map((row, index) => (
<div key={row.id}>
{/* Row UI */}
<Button onClick={() => handleRemoveRow(index)}>Remove</Button>
</div>
))}
</div>
)
}
Critical: Pass schemaPath to addFieldRow(). This tells Payload to initialize the new row using your field schema, including default values and nested structures.
Inline Editing with TextCell
Here's where it gets interesting. You need a sub-component for each editable cell that uses useField():
// TextCell: A single editable input bound to Payload's form
const TextCell = ({
path,
placeholder
}: {
path: string
placeholder?: string
}) => {
const { value, setValue } = useField<string>({ path })
const [localValue, setLocalValue] = useState(value || '')
// Sync local state when external value changes (e.g., initial load)
useEffect(() => {
if (value !== undefined && value !== localValue) {
setLocalValue(value)
}
}, [value])
const handleChange = (e: React.ChangeEvent<HTMLInputElement>) => {
const newVal = e.target.value
setLocalValue(newVal) // Instant local feedback
setValue(newVal) // Update Payload form state
}
return (
<input
type="text"
value={localValue}
onChange={handleChange}
placeholder={placeholder}
style={{
width: '100%',
padding: '8px',
border: '1px solid var(--theme-elevation-200)',
borderRadius: '3px',
fontSize: '13px',
}}
/>
)
}
Why we need local state: Direct binding to value and setValue works, but we'll add debouncing next (see Performance section) which requires local state for instant UI feedback.
Usage in the table:
{rows.map((row, rowIndex) => (
<tr key={row.id}>
<td>
<TextCell
path={`${path}.${rowIndex}.name`}
placeholder="Enter specification name"
/>
</td>
</tr>
))}
The path pattern: ${path}.${rowIndex}.name constructs the full field path like layout.0.rows.2.name. Payload uses this to track each cell independently.
The Debouncing Problem (Preview)
If you test the component above, you'll notice typing feels sluggish with large tables. Every keystroke triggers a setValue() call, which can trigger validation or auto-save. We'll fix this in the Performance section with a debouncing hook.
Adding CSV Import/Export
Now let's add bulk operations. We'll implement three features:
- Download Template - Headers only, for users to fill in
- Export CSV - Current data as CSV
- Import CSV - Upload and parse CSV files
All client-side, no server needed.
CSV Helper Functions
First, we need utilities for escaping and parsing CSV data (RFC 4180 compliant):
// Escape CSV values (handles commas, quotes, newlines)
const escapeCSV = (str: string) => {
if (str === null || str === undefined) return ''
const stringValue = String(str)
if (stringValue.includes(',') || stringValue.includes('"') || stringValue.includes('\n')) {
return `"${stringValue.replace(/"/g, '""')}"`
}
return stringValue
}
// Parse CSV line respecting quoted values
const parseCSVLine = (line: string): string[] => {
const result = []
let startValueIndex = 0
let inQuotes = false
for (let i = 0; i < line.length; i++) {
if (line[i] === '"') {
inQuotes = !inQuotes
} else if (line[i] === ',' && !inQuotes) {
let value = line.substring(startValueIndex, i).trim()
// Remove surrounding quotes and unescape double quotes
if (value.startsWith('"') && value.endsWith('"')) {
value = value.slice(1, -1).replace(/""/g, '"')
}
result.push(value)
startValueIndex = i + 1
}
}
// Push last value
let value = line.substring(startValueIndex).trim()
if (value.startsWith('"') && value.endsWith('"')) {
value = value.slice(1, -1).replace(/""/g, '"')
}
result.push(value)
return result
}
Why custom parsing? JavaScript's String.split(',') doesn't handle quoted values containing commas. These functions correctly parse "Value, with comma",OtherValue.
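A quick round-trip check shows the tricky cases are handled. The helpers are reproduced here (with a slightly wider parameter type) so the snippet runs standalone:

```typescript
// Copies of the helpers above, so this sketch is self-contained.
const escapeCSV = (str: string | null | undefined): string => {
  if (str === null || str === undefined) return ''
  const s = String(str)
  return s.includes(',') || s.includes('"') || s.includes('\n')
    ? `"${s.replace(/"/g, '""')}"`
    : s
}

const parseCSVLine = (line: string): string[] => {
  const result: string[] = []
  let start = 0
  let inQuotes = false
  const unquote = (v: string) =>
    v.startsWith('"') && v.endsWith('"') ? v.slice(1, -1).replace(/""/g, '"') : v
  for (let i = 0; i < line.length; i++) {
    if (line[i] === '"') inQuotes = !inQuotes
    else if (line[i] === ',' && !inQuotes) {
      result.push(unquote(line.substring(start, i).trim()))
      start = i + 1
    }
  }
  result.push(unquote(line.substring(start).trim()))
  return result
}

// Round trip: values containing commas and quotes survive escape → parse.
const original = ['Pixel Pitch, fine', 'He said "hi"', 'plain']
const line = original.map(escapeCSV).join(',')
const parsed = parseCSVLine(line)
```

One caveat: because the import code splits the file on newlines before parsing each line, quoted values containing embedded newlines won't survive a round trip. For spec tables that's rarely an issue, but it's worth knowing the limit of this parser.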
CSV Export
Generate CSV content and download it using a Blob:
const RowsField: ArrayFieldClientComponent = (props) => {
// ... existing code ...
const generateCSV = (includeData: boolean) => {
const csvRows = ['Specification Name']
if (includeData && rows.length > 0) {
rows.forEach(row => {
csvRows.push(escapeCSV(row.name || ''))
})
}
return csvRows.join('\n')
}
const downloadCSV = (content: string, filename: string) => {
const blob = new Blob([content], { type: 'text/csv;charset=utf-8;' })
const link = document.createElement('a')
if (link.download !== undefined) {
const url = URL.createObjectURL(blob)
link.setAttribute('href', url)
link.setAttribute('download', filename)
link.style.visibility = 'hidden'
document.body.appendChild(link)
link.click()
document.body.removeChild(link)
}
}
const handleDownloadTemplate = (e: React.MouseEvent) => {
e.preventDefault()
const csv = generateCSV(false) // Headers only
downloadCSV(csv, 'specifications_template.csv')
}
const handleExportCSV = (e: React.MouseEvent) => {
e.preventDefault()
const csv = generateCSV(true) // With data
downloadCSV(csv, 'specifications_export.csv')
}
return (
<div>
<Button onClick={handleDownloadTemplate}>Download Template</Button>
<Button onClick={handleExportCSV}>Export CSV</Button>
</div>
)
}
The download pattern: Create a temporary Blob URL, attach it to a hidden anchor element, trigger click, then clean up. This works across all modern browsers.
CSV Import
Read uploaded files with FileReader and parse them:
const RowsField: ArrayFieldClientComponent = (props) => {
const { path } = props
const fileInputRef = useRef<HTMLInputElement>(null)
const { setValue } = useField({ path })
const handleImportClick = (e: React.MouseEvent) => {
e.preventDefault()
fileInputRef.current?.click()
}
const handleFileChange = (e: React.ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0]
if (!file) return
const reader = new FileReader()
reader.onload = (event) => {
const text = event.target?.result as string
if (!text) return
const lines = text.split(/\r?\n/).filter(line => line.trim() !== '')
if (lines.length < 2) return // Need at least headers and one row
// Skip header row, process data rows
const newRows = []
for (let i = 1; i < lines.length; i++) {
const values = parseCSVLine(lines[i])
if (values.length === 0) continue
newRows.push({
id: generateId(),
name: values[0]
})
}
// Atomic update: append to existing rows
const updatedRows = [...rows, ...newRows]
setValue(updatedRows)
// Reset file input
if (fileInputRef.current) fileInputRef.current.value = ''
}
reader.readAsText(file)
}
return (
<div>
<Button onClick={handleImportClick}>Import CSV</Button>
<input
type="file"
ref={fileInputRef}
style={{ display: 'none' }}
accept=".csv"
onChange={handleFileChange}
/>
</div>
)
}
Why setValue() instead of multiple addFieldRow() calls?
I initially tried calling addFieldRow() for each CSV row with a data parameter. It created empty rows—the data didn't populate. The issue: when using custom components, Payload's internal row creation doesn't reliably hydrate nested data, especially when child inputs use useField() hooks that haven't mounted yet.
The solution: Use setValue() to atomically update the entire array. This bypasses Payload's row creation reducer and directly sets the value, which then flows down to all the useField() hooks in your TextCell components.
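The atomic-append step can be isolated into a pure function, which keeps the FileReader callback thin and makes the merge logic testable. This is a hypothetical refactor (with generateId from earlier reproduced so the sketch is standalone); it also skips blank lines, a small guard the inline version doesn't have:

```typescript
// generateId from earlier in this guide, copied for a standalone sketch.
function generateId(): string {
  const timestamp = Math.floor(Date.now() / 1000).toString(16).padStart(8, '0')
  const random = Array.from({ length: 16 }, () =>
    Math.floor(Math.random() * 16).toString(16)
  ).join('')
  return timestamp + random
}

interface SpecRow {
  id: string
  name: string
}

// Builds the next value to pass to setValue(): the existing rows plus one
// new row per parsed CSV data line (first cell = spec name).
function appendImportedRows(existing: SpecRow[], dataLines: string[][]): SpecRow[] {
  const imported = dataLines
    .filter((cells) => cells.length > 0 && cells[0] !== '')
    .map((cells) => ({ id: generateId(), name: cells[0] }))
  return [...existing, ...imported]
}
```

The handler then becomes `setValue(appendImportedRows(rows, lines.slice(1).map(parseCSVLine)))`.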
Advanced: Multi-Column Tables (VariantsField)
Now for the complex part: a table where columns are dynamically generated from a sibling field.
The Nested Data Challenge
Our variants field has this structure:
{
variants: [
{
id: "v1",
title: "Model FS-5",
sku: "FS5",
variantData: [
{ id: "d1", value: "5mm" }, // Corresponds to rows[0]
{ id: "d2", value: "5000 nits" }, // Corresponds to rows[1]
{ id: "d3", value: "28kg" } // Corresponds to rows[2]
]
}
]
}
The challenge: variantData must have the same length as rows, and we need to render columns based on rows content.
Reading Sibling Fields
Use useFormFields() to read the rows field:
const VariantsField: ArrayFieldClientComponent = (props) => {
const { path } = props
const allFields = useFormFields(([fields]) => fields)
// Construct the path to the sibling 'rows' field
const basePath = path.split('.').slice(0, -1).join('.')
const rowsPath = basePath ? `${basePath}.rows` : 'rows'
// Get rows length
const rowsLengthField = allFields[rowsPath]
const rowsLength = typeof rowsLengthField?.value === 'number'
? rowsLengthField.value
: 0
// Reconstruct rows array
const rows = useMemo(() => {
const reconstructed = []
for (let i = 0; i < rowsLength; i++) {
const idField = allFields[`${rowsPath}.${i}.id`]
const nameField = allFields[`${rowsPath}.${i}.name`]
if (idField || nameField) {
reconstructed.push({
id: String(idField?.value || ''),
name: String(nameField?.value || ''),
})
}
}
return reconstructed
}, [rowsLength, allFields, rowsPath])
// Build column definitions
const specColumns = useMemo(() => {
return rows.map((row) => ({
id: row.id,
name: row.name || 'Untitled',
}))
}, [rows])
return <div>{/* Use specColumns for table headers */}</div>
}
Path construction: If your variants field is at layout.0.variants, the rows field is at layout.0.rows. We split the path, remove the last segment, and append rows.
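That path arithmetic is easy to get wrong at the top level (where there is no parent segment), so it's worth pulling into a small helper. A sketch, with a hypothetical function name:

```typescript
// Derives a sibling field's path from this field's path:
// drop the last segment, then append the sibling name.
function siblingPath(path: string, sibling: string): string {
  const base = path.split('.').slice(0, -1).join('.')
  return base ? `${base}.${sibling}` : sibling
}
```

The `base ? ... : sibling` branch covers fields defined at the collection root, where `path` has no dots and the sibling path is just the sibling's name.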
Auto-Sync Mechanism
When specification rows change, we need to update all variants' variantData arrays to match:
const VariantsField: ArrayFieldClientComponent = (props) => {
const { path } = props
const { addFieldRow, removeFieldRow } = useForm()
// ... existing code to get rows and variants ...
// Auto-sync: When rows.length changes, update all variants
useEffect(() => {
if (variants.length === 0) return
variants.forEach((variant, vIndex) => {
const currentData = variant.variantData || []
const diff = rows.length - currentData.length
const variantDataPath = `${path}.${vIndex}.variantData`
if (diff > 0) {
// Add missing rows
for (let i = 0; i < diff; i++) {
addFieldRow({
path: variantDataPath,
schemaPath: variantDataPath,
})
}
} else if (diff < 0) {
// Remove extra rows from the end
for (let i = 0; i < Math.abs(diff); i++) {
removeFieldRow({
path: variantDataPath,
rowIndex: currentData.length - 1 - i
})
}
}
})
}, [rows.length, variants.length, path, addFieldRow, removeFieldRow])
}
Why this works: We watch rows.length, and whenever it changes, we surgically add or remove items from each variant's variantData array. This keeps them synchronized.
Important: We only watch the length dependency, not the entire rows array. If we watched the full array, this effect would fire on every row name change, causing unnecessary operations.
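The add/remove arithmetic inside that effect can be expressed as a pure function, which makes the edge cases (growing, shrinking, already in sync) easy to verify outside React. A hypothetical refactor, not the component code itself:

```typescript
// One planned operation against Payload's addFieldRow/removeFieldRow.
interface SyncOp {
  op: 'add' | 'remove'
  path: string
  rowIndex?: number // only meaningful for 'remove'
}

// Computes the operations needed to bring one variant's variantData
// length in line with the number of spec rows. Removals come off the
// end, highest index first, so earlier indices stay valid.
function syncOps(variantDataPath: string, currentLength: number, targetLength: number): SyncOp[] {
  const ops: SyncOp[] = []
  const diff = targetLength - currentLength
  if (diff > 0) {
    for (let i = 0; i < diff; i++) {
      ops.push({ op: 'add', path: variantDataPath })
    }
  } else {
    for (let i = 0; i < -diff; i++) {
      ops.push({ op: 'remove', path: variantDataPath, rowIndex: currentLength - 1 - i })
    }
  }
  return ops
}
```

The effect would then just map each op to the matching addFieldRow/removeFieldRow call.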
Rendering Nested Data
Map over variants (rows) and specs (columns):
<table>
<thead>
<tr>
<th>Variant Title</th>
<th>SKU</th>
{specColumns.map((col) => (
<th key={col.id}>{col.name}</th>
))}
<th>Actions</th>
</tr>
</thead>
<tbody>
{variants.map((variant, variantIndex) => (
<tr key={variant.id}>
{/* Title */}
<td>
<TextCell
path={`${path}.${variantIndex}.title`}
placeholder="Enter title"
/>
</td>
{/* SKU */}
<td>
<TextCell
path={`${path}.${variantIndex}.sku`}
placeholder="Enter SKU"
/>
</td>
{/* Spec values */}
{specColumns.map((col, specIndex) => (
<td key={col.id}>
<TextCell
path={`${path}.${variantIndex}.variantData.${specIndex}.value`}
placeholder="—"
/>
</td>
))}
{/* Remove button */}
<td>
<Button onClick={() => handleRemoveVariant(variantIndex)}>
Remove
</Button>
</td>
</tr>
))}
</tbody>
</table>
The nested path: ${path}.${variantIndex}.variantData.${specIndex}.value constructs paths like layout.0.variants.2.variantData.5.value. Each TextCell uses useField() with this path, binding it to Payload's form.
CSV Import for Multi-Column Tables
The pattern is similar, but we need header mapping:
const handleFileChange = (e: React.ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0]
if (!file) return
const reader = new FileReader()
reader.onload = (event) => {
const text = event.target?.result as string
if (!text) return
const lines = text.split(/\r?\n/).filter(line => line.trim() !== '')
if (lines.length < 2) return
const headers = parseCSVLine(lines[0]).map(h => h.trim())
// Map header names to indices
const titleIndex = headers.findIndex(h => h.toLowerCase().includes('title'))
const skuIndex = headers.findIndex(h => h.toLowerCase() === 'sku')
// Map spec names to CSV column indices
const specIndices = specColumns.map(col => ({
colId: col.id,
colName: col.name,
csvIndex: headers.findIndex(h => h === col.name)
}))
// Clone existing variants
const updatedVariants = [...variants]
// Process data lines
for (let i = 1; i < lines.length; i++) {
const values = parseCSVLine(lines[i])
if (values.length === 0) continue
const newTitle = titleIndex >= 0 ? values[titleIndex] : ''
const newSku = skuIndex >= 0 ? values[skuIndex] : ''
const variantData = specColumns.map((col, index) => {
const mapped = specIndices[index]
const val = mapped.csvIndex >= 0 && mapped.csvIndex < values.length
? values[mapped.csvIndex]
: ''
return {
id: generateId(),
value: val
}
})
updatedVariants.push({
id: generateId(),
title: newTitle,
sku: newSku,
variantData
})
}
// Atomic update
setValue(updatedVariants)
// Reset file input
if (fileInputRef.current) fileInputRef.current.value = ''
}
reader.readAsText(file)
}
Smart mapping: We find CSV columns by name (case-insensitive for "title", exact match for spec names). This allows users to reorder columns in their CSV.
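The header-to-column mapping can also live in a small pure function: given the CSV header row and the spec column names, return each spec's index in the CSV (or -1 if that column is missing, in which case the cell is left blank). A hedged sketch with a hypothetical name:

```typescript
// Maps spec column names to their index in the CSV header row.
// Exact match after trimming; -1 when a column is absent from the CSV.
function mapSpecColumns(headers: string[], specNames: string[]): number[] {
  const trimmed = headers.map((h) => h.trim())
  return specNames.map((name) => trimmed.indexOf(name))
}
```

Because lookup is by name rather than position, users can reorder or omit columns in their spreadsheet without breaking the import.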
Performance Optimizations
With large tables (100+ cells), performance becomes critical. Here's how to keep the UI responsive.
The Debouncing Pattern
Every keystroke in a TextCell calls setValue(), which can trigger:
- Form validation
- Dirty state updates
- Auto-save network requests
With 168 cells, this creates massive overhead.
Solution: Debounce the setValue() calls with a custom hook:
function useDebouncedCallback<T extends (...args: any[]) => any>(
callback: T,
delay: number
) {
const timeoutRef = useRef<NodeJS.Timeout | null>(null)
return useCallback((...args: Parameters<T>) => {
if (timeoutRef.current) {
clearTimeout(timeoutRef.current)
}
timeoutRef.current = setTimeout(() => {
callback(...args)
}, delay)
}, [callback, delay])
}
Usage in TextCell:
const TextCell = ({ path, placeholder }: { path: string; placeholder?: string }) => {
const { value, setValue } = useField<string>({ path })
const [localValue, setLocalValue] = useState(value || '')
// Sync local value if external value changes
useEffect(() => {
if (value !== undefined && value !== localValue) {
setLocalValue(value)
}
}, [value])
// Debounce the setValue call
const debouncedSetValue = useDebouncedCallback((val: string) => {
setValue(val)
}, 300) // 300ms delay
const handleChange = (e: React.ChangeEvent<HTMLInputElement>) => {
const newVal = e.target.value
setLocalValue(newVal) // Instant local feedback
debouncedSetValue(newVal) // Debounced form update
}
return (
<input
type="text"
value={localValue}
onChange={handleChange}
placeholder={placeholder}
/>
)
}
How it works:
- User types "A" → localValue updates instantly (no lag)
- Debounce timer starts (300ms)
- User types "B" → localValue updates, timer resets
- User types "C" → localValue updates, timer resets
- User stops typing for 300ms → setValue() finally fires once with "ABC"
Result: Typing feels instant, but Payload only processes one update per word instead of per keystroke.
Why 300ms? It's long enough to batch keystrokes but short enough that users don't notice a delay when clicking away.
Memoization
Prevent unnecessary recalculations with useMemo:
const rows = useMemo(() => {
// Reconstruct rows from form fields
// Only recalculates when dependencies change
}, [rowsValue, allFields, path])
const specColumns = useMemo(() => {
return rows.map((row) => ({
id: row.id,
name: row.name || 'Untitled',
}))
}, [rows])
Without memoization: Every render (caused by any form change) would rebuild these arrays, causing the table to re-render even when data hasn't changed.
With memoization: Arrays are only rebuilt when their dependencies change, preventing unnecessary renders.
Sticky Table Headers and Columns
For large tables, use CSS position: sticky to keep headers and key columns visible:
<th style={{
position: 'sticky',
left: 0,
background: 'var(--theme-elevation-100)',
zIndex: 4, // Higher than regular cells
borderRight: '1px solid var(--theme-elevation-200)'
}}>
Variant Title
</th>
<th style={{
position: 'sticky',
top: 0,
background: 'var(--theme-elevation-100)',
zIndex: 3
}}>
Specification Name
</th>
<td style={{
position: 'sticky',
left: 0,
background: 'var(--theme-elevation-0)', // Match table body background
zIndex: 2
}}>
<TextCell path={`${path}.${variantIndex}.title`} />
</td>
Key points:
- Left sticky: Title and SKU columns stick when scrolling horizontally
- Top sticky: Headers stick when scrolling vertically
- Right sticky: Actions column sticks on the right
- z-index management: Headers > sticky columns > regular cells
- Background colors: Prevent transparency issues during scroll
Registering Your Custom Component
After building the component, you need to tell Payload to use it.
Block/Collection Configuration
In your block or collection config file:
import type { Block } from 'payload'
export const SpecificationComparisonBlock: Block = {
slug: 'specification_comparison_b',
fields: [
{
name: 'rows',
type: 'array',
label: 'Specification Rows',
admin: {
components: {
Field: '/src/components/payload/custom/RowsField',
}
},
fields: [
{
name: 'name',
type: 'text',
label: 'Specification Name',
required: true,
},
],
},
{
name: 'variants',
type: 'array',
label: 'Variants to Compare',
admin: {
components: {
Field: '/src/components/payload/custom/VariantsField',
}
},
fields: [
{
name: 'title',
type: 'text',
label: 'Variant Title',
required: true,
},
{
name: 'sku',
type: 'text',
label: 'SKU',
},
{
name: 'variantData',
type: 'array',
label: 'Specification Values',
fields: [
{
name: 'value',
type: 'text',
label: 'Value',
},
],
},
],
},
],
}
Important notes:
- Import path: Use /src/components/payload/custom/ComponentName (no .tsx extension)
- Keep fields array: The fields array defines your data schema and is required
- Your component only controls UI: The schema still validates and structures data
Generate Import Map
Payload needs to build a module map for your custom components. Run:
pnpm payload generate:importmap
This creates src/app/(payload)/admin/importMap.js with references to your components.
When to regenerate:
- After adding new custom components
- After changing component file paths
- After git pull if teammates added components
Troubleshooting: If your component doesn't appear in the admin UI, regenerate the import map and restart your dev server.
Common Issues and Solutions
Here are the problems you'll likely encounter and how to fix them.
1. Changes Not Persisting
Symptom: You type in an input, but when you save or refresh, the data is gone.
Cause: Your input isn't using useField(), so Payload doesn't know about the changes.
Solution: Every input must use useField():
// ❌ Wrong: Direct state, not bound to Payload
const [value, setValue] = useState('')
<input value={value} onChange={(e) => setValue(e.target.value)} />
// ✅ Correct: Bound to Payload form state
const { value, setValue } = useField({ path: 'myField' })
<input value={value} onChange={(e) => setValue(e.target.value)} />
2. Input Lag While Typing
Symptom: There's a noticeable delay between pressing keys and seeing characters appear.
Cause: Direct setValue() calls trigger heavyweight form operations on every keystroke.
Solution: Implement debouncing as shown in the Performance section. Use local state for immediate feedback and debounce the setValue() call.
3. CSV Import Creates Empty Rows
Symptom: Importing CSV adds the right number of rows, but cells are empty.
Cause: Using addFieldRow({ path, data: {...} }) doesn't reliably populate nested data when child inputs use useField() hooks.
Solution: Use setValue() for bulk operations:
// ❌ Wrong: addFieldRow with data
newRows.forEach(row => {
addFieldRow({ path, data: row })
})
// ✅ Correct: Atomic update with setValue
const updatedRows = [...existingRows, ...newRows]
setValue(updatedRows)
4. Can't Read Sibling Field Values
Symptom: Your component needs data from another field, but it's undefined.
Cause: Incorrect path construction or not using useFormFields().
Solution: Use useFormFields() with correct path:
const allFields = useFormFields(([fields]) => fields)
// If your field is at 'layout.0.variants'
// And you need 'layout.0.rows'
const basePath = path.split('.').slice(0, -1).join('.') // 'layout.0'
const rowsPath = basePath ? `${basePath}.rows` : 'rows'
const rowsField = allFields[rowsPath]
5. Nested Arrays Out of Sync
Symptom: You add a specification row, but variant data doesn't update with a new column.
Cause: No synchronization between the two arrays.
Solution: Implement an auto-sync useEffect as shown in the VariantsField section:
useEffect(() => {
// When rows.length changes, update all variants' variantData
variants.forEach((variant, index) => {
const diff = rows.length - variant.variantData.length
// Add or remove rows to match
})
}, [rows.length]) // Only watch length, not full array
Conclusion
Payload's default array UI works great for simple lists, but it breaks down for tabular data. Custom table-based field components solve this by giving you full control over the UI while deeply integrating with Payload's form system.
What we covered:
- The problem: Nested accordions don't work for comparing data across rows
- The foundation: useField(), useForm(), and useFormFields() hooks
- Simple tables: Building RowsField with add/remove/edit operations
- CSV operations: Client-side import/export with robust parsing
- Complex tables: VariantsField with dynamic columns and auto-sync
- Performance: Debouncing inputs and memoizing derived data
- Registration: Connecting custom components to Payload's admin
Key learnings:
- useField() for every input = Persistence + validation + dirty state tracking
- Debouncing = Performance with auto-save forms
- setValue() for bulk updates = Reliable when addFieldRow() fails with nested data
- Auto-sync effects = Maintaining relationships between fields
- Memoization = Preventing unnecessary re-renders
You now have the patterns to build production-grade custom field components for any tabular data in Payload CMS. Whether it's product specifications, pricing tables, feature comparisons, or scheduling grids—the principles are the same.
If you run into issues or have questions, feel free to open an issue in your project repo. Good luck building!
Thanks, Matija


