GraphQL Batch Resolvers but in React
The Problem
Imagine you're building a todo list app.
You have a list of todo IDs retrieved somehow, and for each one, you need to fetch the full todo data from your database to display it.
```tsx
async function TodoList({ ids }: { ids: string[] }) {
  return (
    <div>
      {ids.map((id) => (
        <TodoItem key={id} id={id} />
      ))}
    </div>
  );
}

async function TodoItem({ id }: { id: string }) {
  const todo = await fetchTodoFromDatabase(id); // ❌ Called N times!
  return <div>{todo.title}</div>;
}
```

If you render 100 todo items, you make 100 separate database queries. This is the infamous N+1 problem:
- 1 query to get the list of IDs
- N queries to get each item's data
This kills performance and overloads your database.
Of course, you could optimize this manually by fetching all the data in one call and passing it down via props, but that approach does not scale well. What if you want to display a list of todos in one place and a single todo in another? The prop drilling needed to keep that optimization in place quickly becomes unmanageable.
How GraphQL Solves It
In the GraphQL world, the exact same problem is solved by the well-known DataLoader, which:
- Collects all individual requests made within a single tick of the event loop
- Batches them into one bulk request
- Distributes results back to each caller
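The three steps above can be sketched in a few lines of TypeScript. This is an illustrative mock of the batching mechanism, not the real dataloader library: calls made in the same tick are queued, flushed in a single microtask, and the results are handed back to each caller.

```typescript
type BatchFn<K, V> = (keys: K[]) => Promise<V[]>;

// Minimal sketch of DataLoader-style batching (illustrative, not the real library)
function makeLoader<K, V>(batchFn: BatchFn<K, V>) {
  let queue: { key: K; resolve: (value: V) => void }[] = [];
  return function load(key: K): Promise<V> {
    return new Promise<V>((resolve) => {
      if (queue.length === 0) {
        // First call in this tick: schedule one flush for the whole batch
        queueMicrotask(async () => {
          const batch = queue;
          queue = [];
          const values = await batchFn(batch.map((entry) => entry.key));
          batch.forEach((entry, i) => entry.resolve(values[i]));
        });
      }
      queue.push({ key, resolve });
    });
  };
}

const batches: string[][] = [];
const load = makeLoader<string, string>(async (keys) => {
  batches.push([...keys]); // record each bulk call
  return keys.map((key) => `todo-${key}`);
});

// Two separate load() calls in the same tick...
const [a, b] = await Promise.all([load("1"), load("2")]);
// ...but the batch function ran exactly once, with ["1", "2"]
console.log(batches.length, a, b); // 1 "todo-1" "todo-2"
```

The real library adds per-key caching, error handling, and scheduling options on top of this idea.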
```typescript
// Instead of 100 queries like:
// SELECT * FROM todos WHERE id = '1'
// SELECT * FROM todos WHERE id = '2'
// ...
// DataLoader batches into ONE query:
// SELECT * FROM todos WHERE id IN ('1', '2', '3', ...)
```

React Server Components in Next.js
Technically, we can do the same in Next.js.
To do this, we need a singleton DataLoader per request, so all calls land in the same instance.
We can use React's cache() function, which memoizes results per request.
Combined with DataLoader, we get automatic batching across all components in a single render.
How to
Step 1: Install DataLoader
```shell
npm install dataloader
```

Step 2: Create a Cached Loader
Create a file for your data loading logic (e.g., app/todo/demo.ts):
```typescript
import DataLoader from "dataloader";
import { cache } from "react";

type TODO = {
  id: string;
  title: string;
};

// cache() ensures the same DataLoader instance is used
// for ALL components during a single request
const getTodoLoader = cache(
  () =>
    new DataLoader<string, TODO | null>(async (ids) => {
      // This function receives ALL ids requested in this render cycle
      console.log("BATCH:", ids); // e.g., ["1", "2", "3", "4", "5"]

      // Make ONE database call for all IDs
      // (the database here is just an example - any data source works, even fetch calls)
      const todos = await db.query("SELECT * FROM todos WHERE id IN (?)", [ids]);

      // Return results in the same order as input ids
      const todoMap = new Map(todos.map((t) => [t.id, t]));
      return ids.map((id) => todoMap.get(id) ?? null);
    })
);

// Simple API for components to use
export async function getTodoById(id: string) {
  // instead of fetching directly, we delegate to the DataLoader
  return getTodoLoader().load(id);
}
```

Step 3: Use It in Components
Now your components can fetch data by ID without worrying about batching:
```tsx
import { getTodoById } from "@/app/todo/demo";

async function TodoItem({ id }: { id: string }) {
  // Looks like a single fetch, but it's automatically batched!
  const todo = await getTodoById(id);
  return (
    <div>
      {id}: {todo?.title ?? "Not found"}
    </div>
  );
}

async function TodoList({ ids }: { ids: string[] }) {
  // no prop drilling
  return (
    <>
      {ids.map((id) => (
        <TodoItem key={id} id={id} />
      ))}
    </>
  );
}

export default function Page() {
  return (
    <div>
      <h1>TODO</h1>
      <TodoList ids={["1", "2", "3", "1", "2", "4"]} />
    </div>
  );
}
```

What Happens Under the Hood
- React starts rendering TodoList
- Six TodoItem components call getTodoById() with IDs: "1", "2", "3", "1", "2", "4"
- DataLoader collects all these calls within the same tick
- DataLoader deduplicates them to ["1", "2", "3", "4"] (only unique IDs)
- Your batch function runs once with all unique IDs
- Each TodoItem receives its data from the batched result

Result: 1 database query instead of 6!
Key Concepts
Why cache()
React's cache() function ensures the same DataLoader instance is shared across all components within a single request:

```typescript
const getTodoLoader = cache(() => new DataLoader(...));
```

Without cache(), each component would create its own DataLoader, and batching wouldn't work.
DataLoader Handles Deduplication
Notice that in our example we requested IDs "1" and "2" twice each. DataLoader automatically deduplicates these - the batch function only receives unique IDs.
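The effect can be sketched with a plain promise cache - a simplified stand-in for what DataLoader does internally, not the library's actual code: repeated keys return the same pending promise, so the underlying lookup runs once per unique key.

```typescript
const pending = new Map<string, Promise<string>>();
let lookups = 0;

function load(key: string): Promise<string> {
  const existing = pending.get(key);
  if (existing) return existing; // duplicate key → reuse the cached promise
  lookups++;
  const promise = Promise.resolve(`todo-${key}`); // stand-in for the batched fetch
  pending.set(key, promise);
  return promise;
}

const results = await Promise.all([load("1"), load("1"), load("2")]);
console.log(results, lookups); // ["todo-1", "todo-1", "todo-2"] 2
```

This per-key promise cache is also why calling load() with the same ID later in the request still returns instantly.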
Works with Suspense
This pattern works seamlessly with React Suspense boundaries:
```tsx
<Suspense fallback="Loading...">
  <TodoItem id="1" />
  <TodoItem id="5" />
</Suspense>
```

Survives even the craziest layouts
```tsx
export default function Page() {
  return (
    <div className="container mx-auto my-0 p-4 space-y-4">
      <h1>TODO</h1>
      {/* it doesn't matter whether we display a single todo item: */}
      <TodoItem id="1" />
      {/* or a list of items (note how some IDs are duplicated): */}
      <TodoList ids={["1", "2", "3", "1", "2", "4"]} />
      {/* or items inside a Suspense boundary */}
      <Suspense fallback="loading...">
        <TodoItem id="1" />
        <TodoItem id="5" />
      </Suspense>
      {/* we still get a single batch with 1, 2, 3, 4, 5 */}
    </div>
  );
}
```

All items within the same render cycle are still batched together.
Adding More Loaders
To add loaders for other data types, just create another cached factory:
```typescript
type User = {
  id: string;
  name: string;
};

const getUserLoader = cache(
  () =>
    new DataLoader<string, User | null>(async (ids) => {
      const users = await db.query("SELECT * FROM users WHERE id IN (?)", [ids]);
      const userMap = new Map(users.map((u) => [u.id, u]));
      return ids.map((id) => userMap.get(id) ?? null);
    })
);

export async function getUserById(id: string) {
  return getUserLoader().load(id);
}
```

Real-World Example with Supabase
```typescript
import { createClient } from "@/utils/supabase/server";
import DataLoader from "dataloader";
import { cache } from "react";

const getTodoLoader = cache(
  () =>
    new DataLoader<string, TODO | null>(async (ids) => {
      const supabase = createClient();
      const { data, error } = await supabase.from("todos").select("*").in("id", ids);
      if (error) throw error;
      const map = new Map(data.map((todo) => [todo.id, todo]));
      return ids.map((id) => map.get(id) ?? null);
    })
);
```

Utility helper
So far, to avoid repetitive code, I ended up with the following:
```typescript
import DataLoader from "dataloader";
import { cache } from "react";

// discriminated union so our output is either data or an error, determined by
// the success property - similar to how zod works
export type Result<T, E = Error> = { success: true; data: T } | { success: false; error: E };

// self-describing data loader config
type DataLoaderConfig<K, V, C = string> = {
  batchLoadFn: (keys: readonly K[]) => Promise<V[]>;
  cacheKeyFn?: (key: K) => C;
  maxBatchSize?: number;
  batchScheduleFn?: (callback: () => void) => void;
};

// and finally our helper
export function createDataLoader<K, V, C = string>(config: DataLoaderConfig<K, V, C>) {
  const loader = cache(
    () =>
      new DataLoader<K, V, C>(config.batchLoadFn, {
        cacheKeyFn: config.cacheKeyFn,
        maxBatchSize: config.maxBatchSize,
        batchScheduleFn: config.batchScheduleFn,
      })
  );
  return (key: K) => loader().load(key);
}
```

With this, our actual loaders become as simple as:
app/dataloaders/getTodo.ts
```typescript
import { createDataLoader, type Result } from "./createDataLoader";

export type Input = string;
export type Output = Result<{ id: string; title: string }>;

export default createDataLoader<Input, Output>({
  batchLoadFn: async (keys) => {
    return await db.query("SELECT * FROM todo WHERE id IN (?)", [keys]);
  },
});
```

So in our component we can simply:
```tsx
import getTodo from "@/dataloaders/getTodo";

export default async function TodoItem({ id }: { id: string }) {
  const todo = await getTodo(id);
  // getTodo returns a Result, so we narrow on success before using the data
  if (!todo.success) return <div>Not found</div>;
  return <div>{todo.data.title}</div>;
}
```

Ideally, we could infer the types instead of providing them manually to the createDataLoader helper; we may also want to ensure the batch function returns the expected types.
Error handling
There are a few approaches. On one hand, if our DataLoader cannot reach the database at all, we can simply throw an error, which rejects all pending requests.
On the other hand, the batch function must return an array of results in the same order as the input keys, which gives us the opportunity to return either data or an error per key, so components can handle failures gracefully - some may fall back to an empty value, while in critical places we can still rethrow the error.
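Using the Result type from the helper above, a per-key wrapper makes this concrete. The toResults name and the in-memory Map are hypothetical, just to show the shape: each key gets either its data or its own error, and the batch as a whole never rejects.

```typescript
type Result<T, E = Error> = { success: true; data: T } | { success: false; error: E };

// Hypothetical helper: wrap each key's lookup in a Result so one missing
// row doesn't reject the whole batch
function toResults<K, V>(keys: readonly K[], found: Map<K, V>): Result<V>[] {
  return keys.map((key) =>
    found.has(key)
      ? { success: true as const, data: found.get(key)! }
      : { success: false as const, error: new Error(`Not found: ${String(key)}`) }
  );
}

const rows = new Map([["1", "Buy milk"]]);
const results = toResults(["1", "2"], rows);
console.log(results[0]); // { success: true, data: "Buy milk" }
console.log(results[1].success); // false
```

Components can then branch on success and decide locally whether to render a fallback or rethrow.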
Batching
When we need to limit the number of simultaneous requests, we can use maxBatchSize, which slices batches into chunks of the given size - but keep in mind that it works within the scope of a single request, not globally.
There is also batchScheduleFn, which may be useful in rare cases, for example to avoid rate limiting, but again it only works within the scope of a single request.
In my case, I had a helper that I tuned slightly to fit well into the DataLoader setup:
dataloaders/batchedPromisesSettled.ts
```typescript
export async function batchedPromisesSettled<R>(
  tasks: (() => Promise<R>)[],
  limit = 5
): Promise<Result<R>[]> {
  const results: Result<R>[] = [];
  for (let i = 0; i < tasks.length; i += limit) {
    const batch = tasks.slice(i, i + limit);
    const batchResults = await Promise.allSettled(batch.map((fn) => fn()));
    results.push(
      ...batchResults.map((result) =>
        result.status === "fulfilled"
          ? { success: true as const, data: result.value }
          : {
              success: false as const,
              error: result.reason instanceof Error ? result.reason : new Error(String(result.reason)),
            }
      )
    );
  }
  return results;
}
```

But technically this is equivalent to using maxBatchSize together with Promise.all in the batch function.