Reliable Background OCR Processing with Inngest and Next.js

Mohamed Almadih

2/3/2026
4 min read

Processing files with AI can be unpredictable. When a user uploads ten bank receipts at once, performing OCR on each one sequentially within a single request is a recipe for disaster: you'll hit serverless timeouts and memory limits, and deliver a terrible user experience.

In this post, I'll explain how we used Inngest to build a durable, event-driven background processing pipeline.

The Challenge: The Serverless "Wall"

Standard Next.js Server Actions and API routes have strict execution limits (usually 10-60 seconds on platforms like Vercel). If you try to process multiple images with a heavyweight model like Mistral inside a single request (a sketch of this anti-pattern follows the list below):

  1. Timeouts: The request will likely be killed before the third or fourth image is done.
  2. Poor UX: The user has to sit through a loading spinner for the entire duration.
  3. Reliability: If one image fails, do you roll back the others? What if the network blips midway through?
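
Here's roughly what that naive, single-request version looks like: one server action looping over the uploads and calling the OCR step inline. This is an illustrative sketch (`runOcr` is a hypothetical stand-in for whatever model call you use); it works for one or two files, then falls over at ten.

// Anti-pattern: all OCR inside a single server action request (illustrative)
declare function runOcr(buffer: Buffer): Promise<unknown>; // hypothetical stand-in for the model call

export async function processReceiptsSequentially(formData: FormData) {
  const files = formData.getAll("files") as File[];
  const results: unknown[] = [];

  for (const file of files) {
    const buffer = Buffer.from(await file.arrayBuffer());
    // Each call can take many seconds; ten in a row easily blows past
    // the 10-60 second execution window on most serverless platforms.
    results.push(await runOcr(buffer));
  }

  return results;
}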

Enter Inngest: Event-Driven Next.js

Inngest allows us to trigger "events" that run background functions outside the main request thread. It handles retries, state management, and orchestration automatically.

1. Defining the Event Schema

First, we define what a "process receipt" event looks like. This gives us type safety across our app.

// src/inngest/client.ts
import { EventSchemas, Inngest } from "inngest";

export const inngest = new Inngest({
  id: "receipt-processor",
  schemas: new EventSchemas().fromRecord<{
    "receipt/process": {
      data: {
        userId: string;
        fileKey: string;
        type: string;
        category: string;
        originalName: string;
      };
    };
  }>(),
});
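
With the schema registered, inngest.send is type-checked end to end: misspelling the event name or dropping a field from data fails at compile time. A minimal sketch of a single dispatch, assuming the client above (the "@/" import alias and all payload values are placeholders):

// Anywhere on the server: a fully typed event dispatch
import { inngest } from "@/inngest/client";

export async function queueSingleReceipt() {
  // The payload is checked against the "receipt/process" schema by TypeScript.
  await inngest.send({
    name: "receipt/process",
    data: {
      userId: "user_123",
      fileKey: "temp_1234_user_123.jpg",
      type: "image/jpeg",
      category: "bank",
      originalName: "receipt-01.jpg",
    },
  });
}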

2. The Background Function (The "Worker")

The core logic lives in an Inngest function. We use step.run to break the process into durable steps: if the "process-receipt-ai" step fails, Inngest retries only that step and reuses the memoized result of "download-file", so the file isn't re-downloaded.

// src/inngest/functions/process-receipt.ts
import { inngest } from "../client";
// App-specific helpers for R2 storage and the OCR pipeline (import paths illustrative)
import { getSignedUrlForDownload } from "@/lib/storage";
import { processReceiptFromBuffer } from "@/lib/receipts";

export const processReceipt = inngest.createFunction(
  { id: "process-receipt" },
  { event: "receipt/process" },
  async ({ event, step }) => {
    const { userId, fileKey, type, category, originalName } = event.data;

    // Step 1: Download from R2.
    // Step results are memoized as JSON, so we return the file as base64.
    const buffer = await step.run("download-file", async () => {
      const url = await getSignedUrlForDownload(fileKey);
      const response = await fetch(url);
      const arrayBuffer = await response.arrayBuffer();
      return Buffer.from(arrayBuffer).toString("base64");
    });

    // Step 2: Perform AI OCR and save the result
    const result = await step.run("process-receipt-ai", async () => {
      const fileBuffer = Buffer.from(buffer, "base64");
      return await processReceiptFromBuffer({
        userId,
        buffer: fileBuffer,
        fileKey,
        type,
        category,
        originalName,
      });
    });

    return result;
  }
);

3. Triggering in Bulk

When a user performs a bulk upload, we don't process anything immediately. Instead, we upload the files to R2 and then fire off multiple events. This happens almost instantly from the user's perspective.

// src/app/(dashboard)/dashboard/actions.ts
"use server";

import { inngest } from "@/inngest/client";
import { uploadFile } from "@/lib/storage"; // R2 upload helper (import path illustrative)

export async function bulkUploadReceipts(formData: FormData) {
  // ... auth and limit checks ...

  const uploadPromises = files.map(async (file) => {
    const buffer = Buffer.from(await file.arrayBuffer());
    const key = `temp_${crypto.randomUUID()}_${userId}.${extension}`;
    await uploadFile(buffer, key);

    return { name: "receipt/process", data: { userId, fileKey: key, ... } };
  });

  const events = await Promise.all(uploadPromises);

  // Send all events to Inngest at once
  await inngest.send(events);

  return {
    success: true,
    message: `Processing ${files.length} receipts in the background...`,
  };
}

Why This Wins

  1. Instant Feedback: The user sees a "Processing..." message immediately. They can navigate away or even close the tab.
  2. Parallelism: Inngest can run multiple instances of the function in parallel, processing 10 receipts much faster than a sequential loop would (concurrency is tunable; see the sketch after this list).
  3. Durability: If the AI model is down or rate-limited, Inngest will automatically retry the failed step with exponential backoff.
  4. Observability: You get a beautiful dashboard to see exactly which step failed and why, with full logs and payload inspection.
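
Parallelism and retry behavior are both configurable on the function definition itself. If your OCR provider rate-limits aggressively, you can cap how many runs execute at once and how many retries each one gets, along these lines (the limits shown are illustrative, not what we ship):

// src/inngest/functions/process-receipt.ts (config-only view, values illustrative)
import { inngest } from "@/inngest/client";

export const processReceipt = inngest.createFunction(
  {
    id: "process-receipt",
    concurrency: { limit: 5 }, // never run more than 5 receipts at once
    retries: 3,                // retry failed steps up to 3 times with backoff
  },
  { event: "receipt/process" },
  async ({ event, step }) => {
    // ... identical download + OCR steps to the worker shown earlier ...
  }
);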

Conclusion

By moving heavy AI processing to Inngest, we've made the app significantly more robust and responsive. It's the difference between an application that feels like a toy and one that handles real-world workloads reliably.
