---
title: "Receive Email Attachments in Your App — Ultimate Guide"
slug: "receive-email-attachments-nextjs-brevo"
published: "2026-02-23"
updated: "2026-04-06"
categories:
  - "Next.js"
tags:
  - "receive email attachments"
  - "inbound email parsing"
  - "Brevo inbound parse"
  - "Next.js email webhook"
  - "PayloadCMS ingest"
  - "email attachment idempotency"
  - "tenant mapping"
  - "webhook bearer token"
  - "Backblaze B2 upload"
  - "OCR pipeline"
  - "normalize email attachments"
llm-intent: "reference"
audience-level: "intermediate"
framework-versions:
  - "brevo"
  - "next.js"
  - "payloadcms"
  - "backblaze b2"
  - "node.js"
status: "stable"
llm-purpose: "Receive email attachments in your app with Brevo, Next.js and PayloadCMS; secure webhook, tenant mapping, and idempotent ingest core—follow this…"
llm-prereqs:
  - "Access to Brevo"
  - "Access to Next.js"
  - "Access to PayloadCMS"
  - "Access to Backblaze B2"
  - "Access to Node.js"
llm-outputs:
  - "Completed outcome: Receive email attachments in your app with Brevo, Next.js and PayloadCMS; secure webhook, tenant mapping, and idempotent ingest core—follow this…"
---

**Summary Triples**
- (Brevo inbound parse, forwards parsed emails, to your webhook endpoint as an HTTP POST with attachment metadata and URLs)
- (Webhook endpoint, must be protected, with a bearer token and optional request validation to prevent unauthorized calls)
- (Backend, fetches attachment binaries, from Brevo-provided URLs (or via attachments payload) and uploads them to Backblaze B2 or your canonical storage)
- (Tenant mapping, maps incoming email sender or recipient, to an existing tenant/user so files enter the correct tenant context)
- (Idempotent ingest core, ensures, duplicate emails/attachments do not create duplicate ingestion jobs or documents (use dedupe keys, request IDs, or persistence))
- (Normalization, reuses existing ingest logic, by calling the same ingest core/service used by POST /api/ingest so email files follow the exact pipeline)
- (Ingest result, creates, ingestion-job, document, document-file, pipeline-run, and queues OCR (same as UI uploads))
- (Local testing, use, ngrok (or similar) to expose local webhook to Brevo for end-to-end tests)
- (Error handling, should detect and surface, transient fetch/upload errors and retry with exponential backoff in the ingest pipeline)

### {GOAL}
Receive email attachments in your app with Brevo, Next.js and PayloadCMS; secure webhook, tenant mapping, and idempotent ingest core—follow this…

### {PREREQS}
- Access to Brevo
- Access to Next.js
- Access to PayloadCMS
- Access to Backblaze B2
- Access to Node.js

### {STEPS}
1. Configure Brevo inbound parsing
2. Extract a shared ingest core
3. Wire the UI ingest route to core
4. Implement the email webhook endpoint
5. Add tenant inbound token mapping
6. Persist idempotency and metadata
7. Enforce limits and security
8. Deploy and verify end-to-end

<!-- llm:goal="Receive email attachments in your app with Brevo, Next.js and PayloadCMS; secure webhook, tenant mapping, and idempotent ingest core—follow this…" -->
<!-- llm:prereq="Access to Brevo" -->
<!-- llm:prereq="Access to Next.js" -->
<!-- llm:prereq="Access to PayloadCMS" -->
<!-- llm:prereq="Access to Backblaze B2" -->
<!-- llm:prereq="Access to Node.js" -->
<!-- llm:output="Completed outcome: Receive email attachments in your app with Brevo, Next.js and PayloadCMS; secure webhook, tenant mapping, and idempotent ingest core—follow this…" -->

# Receive Email Attachments in Your App — Ultimate Guide
> Receive email attachments in your app with Brevo, Next.js and PayloadCMS; secure webhook, tenant mapping, and idempotent ingest core—follow this…

Matija Žiberna · 2026-02-23

I was building a document-processing app where users could upload files from the UI, and everything worked well through a clean ingest pipeline. Then the practical request came in: users wanted to send invoices and documents directly by email, and have those attachments processed exactly like normal uploads. This guide shows the full implementation path for that flow in a Next.js + PayloadCMS app, using Brevo inbound parsing as the email entry point.

By the end, you will have a working architecture where:

- users send attachments to a dedicated email address,
- Brevo parses the email and calls your webhook,
- your backend fetches the attachment binaries,
- and your app pushes those files through the same ingestion and OCR pipeline as UI uploads.

## The use case and the problem we are solving

Let’s define the exact use case first.

Your app already has a UI uploader in `src/components/upload/FileUploadZone.tsx`. That component sends files to `POST /api/ingest`. The ingest route creates:

- an `ingestion-job`,
- a `document`,
- a `document-file`,
- a `pipeline-run`,
- and then queues OCR.

This is already solid. The issue is that email does not naturally arrive as `multipart/form-data` at your existing UI route. It comes from the external SMTP world, through mail infrastructure, with different trust and parsing requirements.

So the real problem is not “how do I parse an email?” The real problem is:

1. how to securely accept inbound email from a provider,
2. how to normalize email attachments into your internal file ingest model,
3. and how to guarantee email files follow the same business pipeline as UI files.

That third point is critical. If you build a second ingestion path with different behavior, you will create drift and hard-to-debug production inconsistencies.

## The technical model that makes this possible

To receive email attachments in a web app, you need an inbound email provider. In this implementation we use Brevo Inbound Parse:

1. You configure a receiving subdomain like `reply.yourdomain.com`.
2. DNS MX records route incoming email to Brevo.
3. Brevo parses each incoming email and sends structured JSON to your webhook endpoint.
4. The JSON includes attachment metadata and a `DownloadToken` for each attachment.
5. Your app downloads attachment bytes with those tokens and feeds files to your ingest core.

The important architectural decision is this:

You do not call your current `POST /api/ingest` directly from Brevo.

Your current ingest route relies on an authenticated browser session. Brevo won’t have that session cookie. Instead, create a new webhook route and extract common ingest logic into a shared service used by both:

- `POST /api/ingest` (UI path)
- `POST /api/webhooks/email-ingest` (email path)

## Final architecture before implementation

Here is the target shape:

```text
User Email Client
  -> SMTP
  -> Brevo Inbound Parse
  -> POST /api/webhooks/email-ingest
       -> validate auth and tenant
       -> fetch attachment bytes via DownloadToken
       -> call shared ingest service
            -> create ingestion-job, document, document-file, pipeline-run
            -> upload to B2
            -> queue publishOcr
```

And your UI flow remains:

```text
FileUploadZone.tsx -> POST /api/ingest -> shared ingest service -> same pipeline
```

This means both channels converge into one ingest core.

## Step 1: Configure Brevo inbound parsing

Start in Brevo and your DNS provider.

Set a dedicated receiving subdomain, for example `reply.yourdomain.com`, then configure MX records:

- `reply.yourdomain.com` priority `10` -> `inbound1.sendinblue.com.`
- `reply.yourdomain.com` priority `20` -> `inbound2.sendinblue.com.`

Create a Brevo inbound webhook:

- `type: inbound`
- `events: ["inboundEmailProcessed"]`
- `url: https://api.yourapp.com/api/webhooks/email-ingest`
- `domain: reply.yourdomain.com`

Enable secure webhook calls with a bearer token, and make your backend validate it. This is your first security gate.
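
If you prefer to script this instead of clicking through the dashboard, Brevo exposes a v3 webhooks API. Below is a minimal sketch of the creation call; the payload mirrors the settings above, but treat the exact field names as an assumption and verify them against Brevo's API reference.

```ts
// Sketch: create the inbound webhook via Brevo's v3 API instead of
// the dashboard. Field names mirror the settings listed above.
type InboundWebhookPayload = {
  type: "inbound";
  events: string[];
  url: string;
  domain: string;
};

function buildInboundWebhookPayload(url: string, domain: string): InboundWebhookPayload {
  return {
    type: "inbound",
    events: ["inboundEmailProcessed"],
    url,
    domain,
  };
}

async function createInboundWebhook(apiKey: string, payload: InboundWebhookPayload) {
  const res = await fetch("https://api.brevo.com/v3/webhooks", {
    method: "POST",
    headers: { "api-key": apiKey, "content-type": "application/json" },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`Webhook creation failed: ${res.status}`);
  return res.json();
}
```

You would call this once during setup, passing your Brevo API key and the payload built from your production webhook URL and receiving domain.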

## Step 2: Extract shared ingest core from `/api/ingest`

Create a reusable service that accepts normalized files and context.

```ts
// File: src/lib/ingest/ingest-core.ts
import crypto from "crypto";
import { v4 as uuidv4 } from "uuid";
import { getPayloadClient } from "@/payload/db";
import { uploadToB2 } from "@/lib/backblaze";

type IngestSource = "pwa" | "api" | "desktop" | "email";

export interface NormalizedIngestFile {
  fileName: string;
  mimeType: string;
  bytes: Buffer;
}

export interface IngestCoreInput {
  tenantId: number;
  source: IngestSource;
  files: NormalizedIngestFile[];
  note?: string;
  emailMeta?: {
    provider: "brevo";
    messageId: string;
    from?: string;
    to?: string[];
    subject?: string;
  };
}

const B2_BUCKET_ID = process.env.B2_BUCKET_ID || "";

export async function ingestFilesCore(input: IngestCoreInput) {
  const payload = await getPayloadClient();

  const ingestionJob = await payload.create({
    collection: "ingestion-jobs",
    overrideAccess: true,
    data: {
      source: input.source,
      note: input.note,
      status: "processing",
      fileCount: input.files.length,
      tenant: input.tenantId,
      // Persist email metadata (when present) so a dedupe lookup on
      // externalMessageId has a stored value to match against.
      ...(input.emailMeta
        ? {
            externalProvider: input.emailMeta.provider,
            externalMessageId: input.emailMeta.messageId,
            emailFrom: input.emailMeta.from,
            emailTo: (input.emailMeta.to || []).map((address) => ({ address })),
          }
        : {}),
    },
  });

  const processed: Array<{
    documentId: number;
    pipelineRunId: number;
    originalFilename: string;
  }> = [];
  const errors: Array<{ filename: string; error: string }> = [];

  for (const f of input.files) {
    try {
      const date = new Date().toISOString().split("T")[0];
      const fileId = uuidv4();
      const sanitized = f.fileName.replace(/[^a-zA-Z0-9.-]/g, "_");
      const objectKey = `uploads/${date}/${fileId}/${sanitized}`;
      const sha256 = crypto.createHash("sha256").update(f.bytes).digest("hex");

      await uploadToB2(f.bytes, objectKey, f.mimeType);

      const doc = await payload.create({
        collection: "documents",
        overrideAccess: true,
        data: {
          title: f.fileName,
          ingestionJob: ingestionJob.id,
          status: "ingested",
          tenant: input.tenantId,
        },
      });

      await payload.create({
        collection: "document-files",
        overrideAccess: true,
        data: {
          document: doc.id,
          storageProvider: "backblaze",
          bucket: B2_BUCKET_ID,
          objectKey,
          originalFilename: f.fileName,
          mimeType: f.mimeType,
          fileSize: f.bytes.length,
          sha256,
          state: "intake",
          tenant: input.tenantId,
        },
      });

      const run = await payload.create({
        collection: "pipeline-runs",
        overrideAccess: true,
        data: {
          document: doc.id,
          status: "queued",
          currentStep: "ocr",
          startedAt: new Date().toISOString(),
          tenant: input.tenantId,
        },
      });

      await payload.update({
        collection: "documents",
        id: doc.id,
        overrideAccess: true,
        data: { status: "ocr_pending" },
      });

      await payload.jobs.queue({
        task: "publishOcr",
        input: {
          documentId: doc.id,
          pipelineRunId: run.id,
          objectKey,
          mimeType: f.mimeType,
        },
      });

      processed.push({
        documentId: doc.id,
        pipelineRunId: run.id,
        originalFilename: f.fileName,
      });
    } catch (e) {
      errors.push({ filename: f.fileName, error: String(e) });
    }
  }

  const finalStatus =
    errors.length === input.files.length
      ? "failed"
      : errors.length > 0
      ? "needs_review"
      : "complete";

  await payload.update({
    collection: "ingestion-jobs",
    id: ingestionJob.id,
    overrideAccess: true,
    data: { status: finalStatus },
  });

  return { ingestionJobId: ingestionJob.id, processed, errors };
}
```

This service is the foundation. It gives you one place where all ingest side effects happen. From here on, every channel should call this function instead of duplicating logic.
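
If you want to unit test the naming and hashing logic without a Payload client, the derivation inside the loop is easy to isolate. Here is a sketch using Node's built-in `crypto.randomUUID` in place of the `uuid` package; the function names are illustrative.

```ts
import crypto from "crypto";

// Same derivation as the ingest loop: a date-partitioned object key
// with a sanitized filename, plus a sha256 fingerprint of the bytes.
// crypto.randomUUID() stands in for the uuid package used above.
function buildObjectKey(
  fileName: string,
  now: Date = new Date(),
  fileId: string = crypto.randomUUID(),
): string {
  const date = now.toISOString().split("T")[0];
  const sanitized = fileName.replace(/[^a-zA-Z0-9.-]/g, "_");
  return `uploads/${date}/${fileId}/${sanitized}`;
}

function fingerprint(bytes: Buffer): string {
  return crypto.createHash("sha256").update(bytes).digest("hex");
}
```

Keeping these as pure functions lets you pin the key format and hash behavior in tests, independent of uploads or database writes.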

## Step 3: Make the existing `/api/ingest` route a thin wrapper

Now wire your UI route to the shared core.

```ts
// File: src/app/api/ingest/route.ts
import { NextRequest, NextResponse } from "next/server";
import { headers } from "next/headers";
import type { User } from "@payload-types";
import { getPayloadClient } from "@/payload/db";
import { resolveActiveTenantId } from "@/payload/utilities/tenant-selection";
import { getTenantFromCookie } from "@/payload/utilities/getTenantFromCookie";
import { ingestFilesCore } from "@/lib/ingest/ingest-core";

export async function POST(req: NextRequest) {
  const payload = await getPayloadClient();
  const headersList = await headers();
  const { user } = await payload.auth({ headers: headersList });
  if (!user) return NextResponse.json({ error: "Unauthorized" }, { status: 401 });

  const authenticatedUser = user as User;
  const requestedTenantId = getTenantFromCookie(headersList, "number");
  const tenantId = resolveActiveTenantId(authenticatedUser, requestedTenantId);
  if (!tenantId) return NextResponse.json({ error: "No tenant" }, { status: 403 });

  const formData = await req.formData();
  const files = (formData.getAll("files") as File[]) || [];
  if (!files.length) return NextResponse.json({ error: "No files provided" }, { status: 400 });

  const normalized = await Promise.all(
    files.map(async (f) => ({
      fileName: f.name,
      mimeType: f.type,
      bytes: Buffer.from(await f.arrayBuffer()),
    })),
  );

  const result = await ingestFilesCore({
    tenantId,
    source: "pwa",
    files: normalized,
  });

  return NextResponse.json({ success: true, ...result });
}
```

This preserves your current UX while removing logic duplication.

## Step 4: Add an email webhook endpoint

This route receives Brevo inbound payload, validates the request, normalizes attachments, and calls the same ingest core.

```ts
// File: src/app/api/webhooks/email-ingest/route.ts
import { NextRequest, NextResponse } from "next/server";
import { ingestFilesCore, type NormalizedIngestFile } from "@/lib/ingest/ingest-core";
import { getPayloadClient } from "@/payload/db";

type BrevoAttachment = {
  Name: string;
  ContentType: string;
  ContentLength: number;
  DownloadToken: string;
};

type BrevoItem = {
  MessageId: string;
  Subject?: string;
  From?: { Address?: string };
  To?: Array<{ Address?: string }>;
  Recipients?: string[];
  Attachments?: BrevoAttachment[];
};

type BrevoPayload = { items?: BrevoItem[] };

function unauthorized() {
  return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
}

function resolveTenantTokenFromRecipient(recipients: string[]): string | null {
  // Example: t-abc123@reply.yourdomain.com -> token "abc123"
  for (const r of recipients) {
    const local = r.split("@")[0] || "";
    if (local.startsWith("t-")) return local.slice(2);
  }
  return null;
}

async function downloadBrevoAttachment(downloadToken: string): Promise<Buffer> {
  const apiKey = process.env.BREVO_API_KEY;
  if (!apiKey) throw new Error("BREVO_API_KEY missing");

  const response = await fetch(
    `https://api.brevo.com/v3/inbound/attachments/${encodeURIComponent(downloadToken)}`,
    {
      method: "GET",
      headers: { "api-key": apiKey },
    },
  );

  if (!response.ok) {
    throw new Error(`Brevo attachment download failed: ${response.status}`);
  }

  const bytes = Buffer.from(await response.arrayBuffer());
  return bytes;
}

export async function POST(req: NextRequest) {
  if (process.env.EMAIL_INGEST_ENABLED !== "true") {
    return NextResponse.json({ error: "Disabled" }, { status: 404 });
  }

  const expectedBearer = process.env.BREVO_WEBHOOK_BEARER_TOKEN;
  const authHeader = req.headers.get("authorization");
  // A plain string compare is adequate here; use crypto.timingSafeEqual
  // if you want a constant-time comparison.
  if (!expectedBearer || authHeader !== `Bearer ${expectedBearer}`) {
    return unauthorized();
  }

  const payload = (await req.json()) as BrevoPayload;
  // Brevo can batch multiple emails per call; this handles the first
  // item for clarity. Loop over payload.items in production.
  const item = payload.items?.[0];
  if (!item) return NextResponse.json({ error: "No inbound item" }, { status: 400 });

  const recipients = [
    ...(item.Recipients || []),
    ...((item.To || []).map((m) => m.Address || "").filter(Boolean)),
  ];
  const tenantToken = resolveTenantTokenFromRecipient(recipients);
  if (!tenantToken) return NextResponse.json({ error: "Cannot resolve tenant" }, { status: 422 });

  const cms = await getPayloadClient();
  const tenantLookup = await cms.find({
    collection: "tenants",
    where: { inboundEmailToken: { equals: tenantToken } },
    limit: 1,
  });
  const tenant = tenantLookup.docs[0];
  if (!tenant) return NextResponse.json({ error: "Unknown tenant token" }, { status: 422 });

  // Idempotency check by (tenant, MessageId)
  const dupe = await cms.find({
    collection: "ingestion-jobs",
    where: {
      and: [
        { tenant: { equals: tenant.id } },
        { externalMessageId: { equals: item.MessageId || "" } },
      ],
    },
    limit: 1,
  });
  if (dupe.totalDocs > 0) return NextResponse.json({ success: true, duplicated: true });

  const maxAttachments = Number(process.env.EMAIL_INGEST_MAX_ATTACHMENTS || "20");
  const maxTotalBytes = Number(process.env.EMAIL_INGEST_MAX_TOTAL_BYTES || `${35 * 1024 * 1024}`);
  const allowedTypes = new Set(
    (process.env.EMAIL_INGEST_ALLOWED_MIME_TYPES ||
      "application/pdf,image/png,image/jpeg,image/jpg")
      .split(",")
      .map((s) => s.trim())
      .filter(Boolean),
  );

  const attachments = item.Attachments || [];
  if (attachments.length === 0) {
    return NextResponse.json({ error: "No attachments" }, { status: 422 });
  }
  if (attachments.length > maxAttachments) {
    return NextResponse.json({ error: "Too many attachments" }, { status: 422 });
  }

  const estimatedTotal = attachments.reduce((sum, a) => sum + (a.ContentLength || 0), 0);
  if (estimatedTotal > maxTotalBytes) {
    return NextResponse.json({ error: "Attachments too large" }, { status: 422 });
  }

  const normalized: NormalizedIngestFile[] = [];
  for (const a of attachments) {
    if (!allowedTypes.has(a.ContentType)) continue;
    const bytes = await downloadBrevoAttachment(a.DownloadToken);
    normalized.push({
      fileName: a.Name,
      mimeType: a.ContentType,
      bytes,
    });
  }

  if (!normalized.length) {
    return NextResponse.json({ error: "No allowed attachments found" }, { status: 422 });
  }

  const result = await ingestFilesCore({
    tenantId: tenant.id,
    source: "email",
    files: normalized,
    emailMeta: {
      provider: "brevo",
      messageId: item.MessageId || "",
      from: item.From?.Address,
      to: recipients,
      subject: item.Subject,
    },
  });

  return NextResponse.json({ success: true, ...result });
}
```

This route is doing three jobs: trust-boundary enforcement, payload normalization, and orchestration into the shared ingest core.
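
Attachment downloads can also fail transiently, and the summary above calls for retries with exponential backoff. A small wrapper sketch you could apply around `downloadBrevoAttachment`; the attempt count and delays are illustrative.

```ts
// Sketch: retry a transient async operation with exponential backoff.
// Delays double per attempt up to a cap.
function backoffDelayMs(attempt: number, baseMs = 500, capMs = 8000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 4,
  sleep: (ms: number) => Promise<void> = (ms) => new Promise((r) => setTimeout(r, ms)),
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (e) {
      lastError = e;
      if (attempt < maxAttempts - 1) await sleep(backoffDelayMs(attempt));
    }
  }
  throw lastError;
}
```

In the webhook loop this would read `await withRetries(() => downloadBrevoAttachment(a.DownloadToken))`, so a single flaky fetch does not fail the whole email.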

## Step 5: Add tenant token field for recipient mapping

You need a stable mapping between inbound recipient address and tenant.

Add a token field on tenant records and issue addresses like `t-<token>@reply.yourdomain.com`.

```ts
// File: src/payload/collections/tenants.ts
import type { CollectionConfig } from "payload";

export const Tenants: CollectionConfig = {
  slug: "tenants",
  fields: [
    // ...existing fields
    {
      name: "inboundEmailToken",
      type: "text",
      unique: true,
      index: true,
      admin: {
        description:
          "Token used in inbound email address. Example: t-<token>@reply.yourdomain.com",
      },
    },
  ],
};
```

This keeps tenant identity out of obvious addresses and gives you rotation control if a token leaks.
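
Minting the token itself can be a one-liner with Node's `crypto`. A sketch; the token length and hex alphabet are illustrative choices, not requirements.

```ts
import crypto from "crypto";

// Sketch: mint a random inbound token and its tenant address.
// Hex keeps the email local part lowercase and unambiguous; length is
// a trade-off between address length and guessability.
function generateInboundEmailToken(byteLength = 10): string {
  return crypto.randomBytes(byteLength).toString("hex");
}

function buildInboundAddress(token: string, domain = "reply.yourdomain.com"): string {
  return `t-${token}@${domain}`;
}
```

Store the token in `inboundEmailToken` when provisioning a tenant, and regenerate it if an address leaks.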

## Step 6: Extend ingestion metadata for idempotency and traceability

To deduplicate by email message and make troubleshooting easy, store external message metadata on ingestion jobs.

```ts
// File: src/payload/collections/ingestion-jobs.ts
import type { CollectionConfig } from "payload";

export const IngestionJobs: CollectionConfig = {
  slug: "ingestion-jobs",
  fields: [
    // ...existing fields
    {
      name: "externalProvider",
      type: "select",
      options: ["brevo", "postmark", "sendgrid", "mailgun"],
      required: false,
    },
    {
      name: "externalMessageId",
      type: "text",
      index: true,
      required: false,
    },
    {
      name: "emailFrom",
      type: "text",
      required: false,
    },
    {
      name: "emailTo",
      type: "array",
      fields: [{ name: "address", type: "text" }],
      required: false,
    },
  ],
};
```

Then in your ingest core, when `emailMeta` exists, persist these values in the created ingestion job. That gives you deterministic dedupe and operational visibility.
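
The mapping from `emailMeta` to these fields is small enough to keep as a pure helper. A sketch whose field names match the collection above; treat it as one possible shape, and spread the result into the ingestion-job create call.

```ts
// Sketch: map the webhook's emailMeta onto the ingestion-job fields
// above. Spread the result into payload.create({ collection:
// "ingestion-jobs", data: { ... } }).
type EmailMeta = {
  provider: "brevo";
  messageId: string;
  from?: string;
  to?: string[];
};

type JobEmailFields = {
  externalProvider?: "brevo";
  externalMessageId?: string;
  emailFrom?: string;
  emailTo?: Array<{ address: string }>;
};

function emailMetaToJobFields(meta?: EmailMeta): JobEmailFields {
  if (!meta) return {};
  return {
    externalProvider: meta.provider,
    externalMessageId: meta.messageId,
    emailFrom: meta.from,
    emailTo: (meta.to || []).map((address) => ({ address })),
  };
}
```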

## Step 7: Environment configuration

Define runtime configuration in your environment.

```bash
# File: .env.local
EMAIL_INGEST_ENABLED=true

BREVO_WEBHOOK_BEARER_TOKEN=replace_with_long_random_secret
BREVO_API_KEY=replace_with_brevo_api_key

EMAIL_INGEST_MAX_ATTACHMENTS=20
EMAIL_INGEST_MAX_TOTAL_BYTES=36700160
EMAIL_INGEST_ALLOWED_MIME_TYPES=application/pdf,image/png,image/jpeg,image/jpg
```

Keep these separate from UI-upload limits. Email input is an external channel and needs explicit boundaries.
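
Reading these into a typed config keeps the parsing and defaults in one place. A sketch mirroring the route's defaults; the function and type names are illustrative.

```ts
// Sketch: typed email-ingest limits with the same defaults as the
// webhook route. Pass process.env in production, a plain object in tests.
type EmailIngestLimits = {
  maxAttachments: number;
  maxTotalBytes: number;
  allowedTypes: Set<string>;
};

function readEmailIngestLimits(env: Record<string, string | undefined>): EmailIngestLimits {
  return {
    maxAttachments: Number(env.EMAIL_INGEST_MAX_ATTACHMENTS || "20"),
    maxTotalBytes: Number(env.EMAIL_INGEST_MAX_TOTAL_BYTES || `${35 * 1024 * 1024}`),
    allowedTypes: new Set(
      (env.EMAIL_INGEST_ALLOWED_MIME_TYPES ||
        "application/pdf,image/png,image/jpeg,image/jpg")
        .split(",")
        .map((s) => s.trim())
        .filter(Boolean),
    ),
  };
}
```

Accepting the env object as a parameter makes the limits trivially testable without mutating `process.env`.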

## Step 8: Update Brevo webhook to your production endpoint

Once backend code is deployed:

1. Set webhook URL to `https://api.yourapp.com/api/webhooks/email-ingest`.
2. Confirm bearer auth header is configured in Brevo webhook security.
3. Send a real email to a tenant address like `t-abc123@reply.yourdomain.com`.
4. Confirm records are created in:
   - `ingestion-jobs`
   - `documents`
   - `document-files`
   - `pipeline-runs`
5. Confirm document status transitions to `ocr_pending` and OCR queue is populated.

At this point, you have true channel parity: email and UI both flow through one ingestion behavior.

## Why this implementation works in production terms

This design works because it treats inbound email as an external integration boundary, not as another version of UI upload.

The webhook route is intentionally narrow:

- authenticate webhook request,
- parse provider payload,
- resolve tenant identity,
- enforce attachment constraints,
- call shared ingest core.

Everything else remains in your existing pipeline engine.

That separation lets you evolve providers later without rewriting ingestion. If you move from Brevo to another provider, you mostly rewrite one adapter route, not your document pipeline.

## Conclusion

We solved a practical product problem: users can now send documents by email and still get the same processing flow as standard uploads. The key to this implementation is not email parsing by itself, but architectural convergence through a shared ingest core used by both UI and webhook channels.

You now have a concrete path to implement inbound email attachments safely with tenant mapping, deduplication, and consistent pipeline behavior in a Next.js + Payload stack.

Let me know in the comments if you have questions, and subscribe for more practical development guides.

Thanks, Matija

## LLM Response Snippet
```json
{
  "goal": "Receive email attachments in your app with Brevo, Next.js and PayloadCMS; secure webhook, tenant mapping, and idempotent ingest core—follow this…",
  "responses": [
    {
      "question": "What does the article \"Receive Email Attachments in Your App — Ultimate Guide\" cover?",
      "answer": "Receive email attachments in your app with Brevo, Next.js and PayloadCMS; secure webhook, tenant mapping, and idempotent ingest core—follow this…"
    }
  ]
}
```