Receive Email Attachments in Your App — Ultimate Guide
Implement Brevo inbound parse, secure webhooks, and a shared ingest core in Next.js + PayloadCMS to normalize email…

I was building a document-processing app where users could upload files from the UI, and everything worked well through a clean ingest pipeline. Then the practical request came in: users wanted to send invoices and documents directly by email, and have those attachments processed exactly like normal uploads. This guide shows the full implementation path for that flow in a Next.js + PayloadCMS app, using Brevo inbound parsing as the email entry point.
By the end, you will have a working architecture where:
- users send attachments to a dedicated email address,
- Brevo parses the email and calls your webhook,
- your backend fetches the attachment binaries,
- and your app pushes those files through the same ingestion and OCR pipeline as UI uploads.
The use case and the problem we are solving
Let’s define the exact use case first.
Your app already has a UI uploader in src/components/upload/FileUploadZone.tsx. That component sends files to POST /api/ingest. The ingest route creates:
- an ingestion-job,
- a document,
- a document-file,
- a pipeline-run,
- and then queues OCR.
This is already solid. The issue is that email does not naturally arrive as multipart/form-data to your existing UI route. It comes from an external SMTP world, through mail infrastructure, with different trust and parsing requirements.
So the real problem is not “how do I parse an email?” The real problem is:
- how to securely accept inbound email from a provider,
- how to normalize email attachments into your internal file ingest model,
- and how to guarantee email files follow the same business pipeline as UI files.
That third point is critical. If you build a second ingestion path with different behavior, you will create drift and hard-to-debug production inconsistencies.
The technical model that makes this possible
To receive email attachments in a web app, you need an inbound email provider. In this implementation we use Brevo Inbound Parse:
- You configure a receiving subdomain like reply.yourdomain.com.
- DNS MX records route incoming email to Brevo.
- Brevo parses each incoming email and sends structured JSON to your webhook endpoint.
- The JSON includes attachment metadata and DownloadTokens.
- Your app downloads attachment bytes with those tokens and feeds files to your ingest core.
The important architectural decision is this:
You do not call your current POST /api/ingest directly from Brevo.
Your current ingest route relies on an authenticated browser session. Brevo won’t have that session cookie. Instead, create a new webhook route and extract common ingest logic into a shared service used by both:
- POST /api/ingest (UI path)
- POST /api/webhooks/email-ingest (email path)
Final architecture before implementation
Here is the target shape:
User Email Client -> SMTP -> Brevo Inbound Parse -> POST /api/webhooks/email-ingest -> validate auth and tenant -> fetch attachment bytes via DownloadToken -> call shared ingest service -> create ingestion-job, document, document-file, pipeline-run -> upload to B2 -> queue publishOcr
And your UI flow remains:
FileUploadZone.tsx -> POST /api/ingest -> shared ingest service -> same pipeline
This means both channels converge into one ingest core.
Step 1: Configure Brevo inbound parsing
Start in Brevo and your DNS provider.
Set a dedicated receiving subdomain, for example reply.yourdomain.com, then configure MX records:
- reply.yourdomain.com, priority 10 -> inbound1.sendinblue.com.
- reply.yourdomain.com, priority 20 -> inbound2.sendinblue.com.
Create a Brevo inbound webhook:
- type: inbound
- events: ["inboundEmailProcessed"]
- url: https://api.yourapp.com/api/webhooks/email-ingest
- domain: reply.yourdomain.com
Enable secure webhook calls with a bearer token, and make your backend validate it. This is your first security gate.
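The bearer check is worth doing with a constant-time comparison rather than a plain string equality. As a sketch (the helper name `isValidWebhookBearer` is hypothetical, and it assumes Brevo sends the shared secret as "Authorization: Bearer <token>"):

```typescript
import crypto from "node:crypto";

// Hypothetical helper: constant-time bearer comparison for the webhook route.
// Assumes the secret arrives as "Authorization: Bearer <token>".
function isValidWebhookBearer(
  authHeader: string | null,
  expected: string | undefined,
): boolean {
  if (!authHeader || !expected) return false;
  const presented = authHeader.startsWith("Bearer ") ? authHeader.slice(7) : "";
  const a = Buffer.from(presented);
  const b = Buffer.from(expected);
  // timingSafeEqual throws on length mismatch, so guard the lengths first.
  return a.length === b.length && crypto.timingSafeEqual(a, b);
}
```

You could call this in the webhook route in place of a direct `===` comparison; `timingSafeEqual` avoids leaking how much of the token prefix matched.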
Step 2: Extract shared ingest core from /api/ingest
Create a reusable service that accepts normalized files and context.
// File: src/lib/ingest/ingest-core.ts
import crypto from "crypto";
import { v4 as uuidv4 } from "uuid";
import { getPayloadClient } from "@/payload/db";
import { uploadToB2 } from "@/lib/backblaze";
type IngestSource = "pwa" | "api" | "desktop" | "email";
export interface NormalizedIngestFile {
fileName: string;
mimeType: string;
bytes: Buffer;
}
export interface IngestCoreInput {
tenantId: number;
source: IngestSource;
files: NormalizedIngestFile[];
note?: string;
emailMeta?: {
provider: "brevo";
messageId: string;
from?: string;
to?: string[];
subject?: string;
};
}
const B2_BUCKET_ID = process.env.B2_BUCKET_ID || "";
export async function ingestFilesCore(input: IngestCoreInput) {
const payload = await getPayloadClient();
const ingestionJob = await payload.create({
collection: "ingestion-jobs",
overrideAccess: true,
data: {
source: input.source,
note: input.note,
status: "processing",
fileCount: input.files.length,
tenant: input.tenantId,
},
});
const processed: Array<{
documentId: number;
pipelineRunId: number;
originalFilename: string;
}> = [];
const errors: Array<{ filename: string; error: string }> = [];
for (const f of input.files) {
try {
const date = new Date().toISOString().split("T")[0];
const fileId = uuidv4();
const sanitized = f.fileName.replace(/[^a-zA-Z0-9.-]/g, "_");
const objectKey = `uploads/${date}/${fileId}/${sanitized}`;
const sha256 = crypto.createHash("sha256").update(f.bytes).digest("hex");
await uploadToB2(f.bytes, objectKey, f.mimeType);
const doc = await payload.create({
collection: "documents",
overrideAccess: true,
data: {
title: f.fileName,
ingestionJob: ingestionJob.id,
status: "ingested",
tenant: input.tenantId,
},
});
await payload.create({
collection: "document-files",
overrideAccess: true,
data: {
document: doc.id,
storageProvider: "backblaze",
bucket: B2_BUCKET_ID,
objectKey,
originalFilename: f.fileName,
mimeType: f.mimeType,
fileSize: f.bytes.length,
sha256,
state: "intake",
tenant: input.tenantId,
},
});
const run = await payload.create({
collection: "pipeline-runs",
overrideAccess: true,
data: {
document: doc.id,
status: "queued",
currentStep: "ocr",
startedAt: new Date().toISOString(),
tenant: input.tenantId,
},
});
await payload.update({
collection: "documents",
id: doc.id,
overrideAccess: true,
data: { status: "ocr_pending" },
});
await payload.jobs.queue({
task: "publishOcr",
input: {
documentId: doc.id,
pipelineRunId: run.id,
objectKey,
mimeType: f.mimeType,
},
});
processed.push({
documentId: doc.id,
pipelineRunId: run.id,
originalFilename: f.fileName,
});
} catch (e) {
errors.push({ filename: f.fileName, error: String(e) });
}
}
const finalStatus =
errors.length === input.files.length
? "failed"
: errors.length > 0
? "needs_review"
: "complete";
await payload.update({
collection: "ingestion-jobs",
id: ingestionJob.id,
overrideAccess: true,
data: { status: finalStatus },
});
return { ingestionJobId: ingestionJob.id, processed, errors };
}
This service is the foundation. It gives you one place where all ingest side effects happen. From here on, every channel should call this function instead of duplicating logic.
Step 3: Make the existing /api/ingest route a thin wrapper
Now wire your UI route to the shared core.
// File: src/app/api/ingest/route.ts
import { NextRequest, NextResponse } from "next/server";
import { headers } from "next/headers";
import type { User } from "@payload-types";
import { getPayloadClient } from "@/payload/db";
import { resolveActiveTenantId } from "@/payload/utilities/tenant-selection";
import { getTenantFromCookie } from "@/payload/utilities/getTenantFromCookie";
import { ingestFilesCore } from "@/lib/ingest/ingest-core";
export async function POST(req: NextRequest) {
const payload = await getPayloadClient();
const headersList = await headers();
const { user } = await payload.auth({ headers: headersList });
if (!user) return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
const authenticatedUser = user as User;
const requestedTenantId = getTenantFromCookie(headersList, "number");
const tenantId = resolveActiveTenantId(authenticatedUser, requestedTenantId);
if (!tenantId) return NextResponse.json({ error: "No tenant" }, { status: 403 });
const formData = await req.formData();
const files = (formData.getAll("files") as File[]) || [];
if (!files.length) return NextResponse.json({ error: "No files provided" }, { status: 400 });
const normalized = await Promise.all(
files.map(async (f) => ({
fileName: f.name,
mimeType: f.type,
bytes: Buffer.from(await f.arrayBuffer()),
})),
);
const result = await ingestFilesCore({
tenantId,
source: "pwa",
files: normalized,
});
return NextResponse.json({ success: true, ...result });
}
This preserves your current UX while removing logic duplication.
Step 4: Add an email webhook endpoint
This route receives the Brevo inbound payload, validates the request, normalizes attachments, and calls the same ingest core.
// File: src/app/api/webhooks/email-ingest/route.ts
import { NextRequest, NextResponse } from "next/server";
import { ingestFilesCore, type NormalizedIngestFile } from "@/lib/ingest/ingest-core";
import { getPayloadClient } from "@/payload/db";
type BrevoAttachment = {
Name: string;
ContentType: string;
ContentLength: number;
DownloadToken: string;
};
type BrevoItem = {
MessageId: string;
Subject?: string;
From?: { Address?: string };
To?: Array<{ Address?: string }>;
Recipients?: string[];
Attachments?: BrevoAttachment[];
};
type BrevoPayload = { items?: BrevoItem[] };
function unauthorized() {
return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
}
function resolveTenantTokenFromRecipient(recipients: string[]): string | null {
// Example: t-abc123@reply.yourdomain.com -> token "abc123"
for (const r of recipients) {
const local = r.split("@")[0] || "";
if (local.startsWith("t-")) return local.slice(2);
}
return null;
}
async function downloadBrevoAttachment(downloadToken: string): Promise<Buffer> {
const apiKey = process.env.BREVO_API_KEY;
if (!apiKey) throw new Error("BREVO_API_KEY missing");
const response = await fetch(
`https://api.brevo.com/v3/inbound/attachments/${encodeURIComponent(downloadToken)}`,
{
method: "GET",
headers: { "api-key": apiKey },
},
);
if (!response.ok) {
throw new Error(`Brevo attachment download failed: ${response.status}`);
}
const bytes = Buffer.from(await response.arrayBuffer());
return bytes;
}
export async function POST(req: NextRequest) {
if (process.env.EMAIL_INGEST_ENABLED !== "true") {
return NextResponse.json({ error: "Disabled" }, { status: 404 });
}
const expectedBearer = process.env.BREVO_WEBHOOK_BEARER_TOKEN;
const authHeader = req.headers.get("authorization");
if (!expectedBearer || authHeader !== `Bearer ${expectedBearer}`) {
return unauthorized();
}
const payload = (await req.json()) as BrevoPayload;
const item = payload.items?.[0];
if (!item) return NextResponse.json({ error: "No inbound item" }, { status: 400 });
const recipients = [
...(item.Recipients || []),
...((item.To || []).map((m) => m.Address || "").filter(Boolean)),
];
const tenantToken = resolveTenantTokenFromRecipient(recipients);
if (!tenantToken) return NextResponse.json({ error: "Cannot resolve tenant" }, { status: 422 });
const cms = await getPayloadClient();
const tenantLookup = await cms.find({
collection: "tenants",
where: { inboundEmailToken: { equals: tenantToken } },
limit: 1,
});
const tenant = tenantLookup.docs[0];
if (!tenant) return NextResponse.json({ error: "Unknown tenant token" }, { status: 422 });
// Idempotency check by (tenant, MessageId)
const dupe = await cms.find({
collection: "ingestion-jobs",
where: {
and: [
{ tenant: { equals: tenant.id } },
{ externalMessageId: { equals: item.MessageId || "" } },
],
},
limit: 1,
});
if (dupe.totalDocs > 0) return NextResponse.json({ success: true, duplicated: true });
const maxAttachments = Number(process.env.EMAIL_INGEST_MAX_ATTACHMENTS || "20");
const maxTotalBytes = Number(process.env.EMAIL_INGEST_MAX_TOTAL_BYTES || `${35 * 1024 * 1024}`);
const allowedTypes = new Set(
(process.env.EMAIL_INGEST_ALLOWED_MIME_TYPES ||
"application/pdf,image/png,image/jpeg,image/jpg")
.split(",")
.map((s) => s.trim())
.filter(Boolean),
);
const attachments = item.Attachments || [];
if (attachments.length === 0) {
return NextResponse.json({ error: "No attachments" }, { status: 422 });
}
if (attachments.length > maxAttachments) {
return NextResponse.json({ error: "Too many attachments" }, { status: 422 });
}
const estimatedTotal = attachments.reduce((sum, a) => sum + (a.ContentLength || 0), 0);
if (estimatedTotal > maxTotalBytes) {
return NextResponse.json({ error: "Attachments too large" }, { status: 422 });
}
const normalized: NormalizedIngestFile[] = [];
for (const a of attachments) {
if (!allowedTypes.has(a.ContentType)) continue;
const bytes = await downloadBrevoAttachment(a.DownloadToken);
normalized.push({
fileName: a.Name,
mimeType: a.ContentType,
bytes,
});
}
if (!normalized.length) {
return NextResponse.json({ error: "No allowed attachments found" }, { status: 422 });
}
const result = await ingestFilesCore({
tenantId: tenant.id,
source: "email",
files: normalized,
emailMeta: {
provider: "brevo",
messageId: item.MessageId || "",
from: item.From?.Address,
to: recipients,
subject: item.Subject,
},
});
return NextResponse.json({ success: true, ...result });
}
This route is doing three jobs: trust boundary enforcement, payload normalization, and orchestration into the shared ingest core.
Step 5: Add tenant token field for recipient mapping
You need a stable mapping between inbound recipient address and tenant.
Add a token field on tenant records and issue addresses like t-<token>@reply.yourdomain.com.
// File: src/payload/collections/tenants.ts
import type { CollectionConfig } from "payload";
export const Tenants: CollectionConfig = {
slug: "tenants",
fields: [
// ...existing fields
{
name: "inboundEmailToken",
type: "text",
unique: true,
index: true,
admin: {
description:
"Token used in inbound email address. Example: t-<token>@reply.yourdomain.com",
},
},
],
};
This keeps tenant identity out of obvious addresses and gives you rotation control if a token leaks.
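A minimal sketch of token issuance, assuming a hex token from Node's crypto module (the helper names are hypothetical):

```typescript
import crypto from "node:crypto";

// Generates a token for use in t-<token>@reply.yourdomain.com.
// 12 random bytes -> 24 lowercase hex characters: safe in an email local
// part and long enough that addresses cannot be guessed.
function generateInboundEmailToken(): string {
  return crypto.randomBytes(12).toString("hex");
}

// Builds the full inbound address, matching the t-<token> convention above.
function inboundAddressFor(token: string, domain = "reply.yourdomain.com"): string {
  return `t-${token}@${domain}`;
}
```

Rotation then means generating a new token, saving it on the tenant record, and communicating the new address; the old address stops resolving on the next webhook lookup.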
Step 6: Extend ingestion metadata for idempotency and traceability
To deduplicate by email message and make troubleshooting easy, store external message metadata on ingestion jobs.
// File: src/payload/collections/ingestion-jobs.ts
import type { CollectionConfig } from "payload";
export const IngestionJobs: CollectionConfig = {
slug: "ingestion-jobs",
fields: [
// ...existing fields
{
name: "externalProvider",
type: "select",
options: ["brevo", "postmark", "sendgrid", "mailgun"],
required: false,
},
{
name: "externalMessageId",
type: "text",
index: true,
required: false,
},
{
name: "emailFrom",
type: "text",
required: false,
},
{
name: "emailTo",
type: "array",
fields: [{ name: "address", type: "text" }],
required: false,
},
],
};
Then in your ingest core, when emailMeta exists, persist these values in the created ingestion job. That gives you deterministic dedupe and operational visibility.
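One way to wire that up is a small pure function that folds the optional metadata into the ingestion-job create data. This is a sketch: the helper name `withEmailMeta` is hypothetical, the field names match the collection config above, and `EmailMeta` mirrors the `emailMeta` shape from `IngestCoreInput`.

```typescript
// Mirrors the emailMeta shape from IngestCoreInput in the ingest core.
type EmailMeta = {
  provider: "brevo";
  messageId: string;
  from?: string;
  to?: string[];
  subject?: string;
};

// Folds optional email metadata into the ingestion-job create data.
// Returns the base data untouched when no emailMeta is present.
function withEmailMeta<T extends Record<string, unknown>>(
  base: T,
  emailMeta?: EmailMeta,
) {
  if (!emailMeta) return base;
  return {
    ...base,
    externalProvider: emailMeta.provider,
    externalMessageId: emailMeta.messageId,
    emailFrom: emailMeta.from,
    // Payload array fields expect one object per row.
    emailTo: (emailMeta.to ?? []).map((address) => ({ address })),
  };
}
```

Inside ingestFilesCore you would then pass `withEmailMeta({ ...jobData }, input.emailMeta)` as the `data` for the ingestion-job create call.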
Step 7: Environment configuration
Define runtime configuration in your environment.
# File: .env.local
EMAIL_INGEST_ENABLED=true
BREVO_WEBHOOK_BEARER_TOKEN=replace_with_long_random_secret
BREVO_API_KEY=replace_with_brevo_api_key
EMAIL_INGEST_MAX_ATTACHMENTS=20
EMAIL_INGEST_MAX_TOTAL_BYTES=36700160
EMAIL_INGEST_ALLOWED_MIME_TYPES=application/pdf,image/png,image/jpeg,image/jpg
Keep these separate from UI-upload limits. Email input is an external channel and needs explicit boundaries.
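It can help to parse these limits in one place so the webhook route and your tests share a single source of truth. A sketch, assuming the env keys from .env.local above (the function name is hypothetical):

```typescript
// Parses email-ingest limits from the environment with the same defaults
// the webhook route uses. Names match the .env.local keys above.
function parseEmailIngestConfig(env: Record<string, string | undefined>) {
  return {
    enabled: env.EMAIL_INGEST_ENABLED === "true",
    maxAttachments: Number(env.EMAIL_INGEST_MAX_ATTACHMENTS || "20"),
    maxTotalBytes: Number(env.EMAIL_INGEST_MAX_TOTAL_BYTES || String(35 * 1024 * 1024)),
    allowedTypes: new Set(
      (env.EMAIL_INGEST_ALLOWED_MIME_TYPES || "application/pdf")
        .split(",")
        .map((s) => s.trim())
        .filter(Boolean),
    ),
  };
}
```

The webhook route could call `parseEmailIngestConfig(process.env)` once at the top instead of reading each variable inline.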
Step 8: Update Brevo webhook to your production endpoint
Once backend code is deployed:
- Set the webhook URL to https://api.yourapp.com/api/webhooks/email-ingest.
- Confirm the bearer auth header is configured in Brevo webhook security.
- Send a real email to a tenant address like t-abc123@reply.yourdomain.com.
- Confirm records are created in: ingestion-jobs, documents, document-files, pipeline-runs.
- Confirm the document status transitions to ocr_pending and the OCR queue is populated.
At this point, you have true channel parity: email and UI both flow through one ingestion behavior.
Why this implementation works in production terms
This design works because it treats inbound email as an external integration boundary, not as another version of UI upload.
The webhook route is intentionally narrow:
- authenticate webhook request,
- parse provider payload,
- resolve tenant identity,
- enforce attachment constraints,
- call shared ingest core.
Everything else remains in your existing pipeline engine.
That separation lets you evolve providers later without rewriting ingestion. If you move from Brevo to another provider, you mostly rewrite one adapter route, not your document pipeline.
Conclusion
We solved a practical product problem: users can now send documents by email and still get the same processing flow as standard uploads. The key to this implementation is not email parsing by itself, but architectural convergence through a shared ingest core used by both UI and webhook channels.
You now have a concrete path to implement inbound email attachments safely with tenant mapping, deduplication, and consistent pipeline behavior in a Next.js + Payload stack.
Let me know in the comments if you have questions, and subscribe for more practical development guides.
Thanks, Matija


