Picture this: you fetch a list of items from an API, and for each item you need extra data from a different endpoint. The natural instinct is to loop through the items and fire a request for each one — maybe inside a useEffect, maybe with an await inside a for…of loop. The code works. The data shows up. But if you look at the network tab, every request waits for the one before it, and your loading spinner sits there for seconds longer than it should.
This is sequential fetching of independent data, and it is one of the easiest performance wins you can grab in any frontend app.
👉 Try it in practice: Rick & Morty: Double Fetching
The trap
Imagine you are building a page that displays characters from the Rick & Morty API. The characters come from https://rickandmortyapi.com/api/character/?page=1, and for each character you also want to show the title of their first episode. That episode title lives at character.episode[0] — a separate URL.
Here is the code that feels right but is wrong:
const fetchCharacters = async () => {
  const res = await fetch(`https://rickandmortyapi.com/api/character/?page=1`);
  const data = await res.json();
  const characters = data.results;

  for (const character of characters) {
    const episodeRes = await fetch(character.episode[0]);
    const episodeData = await episodeRes.json();
    character.episodeTitle = episodeData.name;
  }

  return characters;
};
It reads like a recipe: get all characters, then for each character, get their episode. Step by step. But here is the problem: episode 2's data does not depend on episode 1's data. There is zero reason for request 2 to wait for request 1 to finish. And yet, with await inside a for loop, that is exactly what happens.
If you have 20 characters and each episode request takes 200ms, the episode fetching alone takes 4 seconds — 20 requests × 200ms — when it could take 200ms total.
The same trap shows up in other forms. A useEffect that awaits one fetch, then another, then another. Three independent requests that could fire simultaneously, queued up as if they share a dependency. Every await that isn't necessary is a waterfall you are paying for with user time.
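One way to avoid the unnecessary awaits, sketched here with hypothetical fetchUser and fetchPosts helpers, is to start every independent request before awaiting any of them:

```javascript
// Hypothetical stand-ins for two independent API calls.
const fetchUser = async () => ({ id: 1, name: "Rick" });
const fetchPosts = async () => [{ title: "Portal gun maintenance" }];

const loadPage = async () => {
  // Both requests start here, before any await, so they run concurrently.
  const userPromise = fetchUser();
  const postsPromise = fetchPosts();

  // Awaiting afterwards only waits for work that is already in flight.
  const user = await userPromise;
  const posts = await postsPromise;
  return { user, posts };
};
```

For two or three known promises this is equivalent to Promise.all; for a list of unknown length, Promise.all over a .map is the cleaner tool.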
The fix: Promise.all
The solution is Promise.all. Instead of awaiting episode requests one by one, fire all of them at once and wait for the group:
const fetchCharacters = async () => {
  const res = await fetch(`https://rickandmortyapi.com/api/character/?page=1`);
  const data = await res.json();
  const characters = data.results;

  const episodePromises = characters.map((character) =>
    fetch(character.episode[0]).then((res) => res.json()),
  );
  const episodes = await Promise.all(episodePromises);

  return characters.map((character, i) => ({
    ...character,
    episodeTitle: episodes[i].name,
  }));
};
Now all 20 episode requests leave the browser at roughly the same time. The total wait is the slowest individual request, not the sum of all of them. For 20 episodes at 200ms each, you go from 4 seconds to about 200ms. That is not a micro-optimization — that is the difference between a sluggish app and a fast one.
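The difference is easy to measure without any network at all. Here is a small sketch that simulates a 50ms request with a timer:

```javascript
// Simulate a request that resolves after `ms` milliseconds.
const fakeRequest = (ms) =>
  new Promise((resolve) => setTimeout(() => resolve(ms), ms));

const compare = async () => {
  // Sequential: each await blocks the next, so the total is the sum (~150ms).
  let start = Date.now();
  for (let i = 0; i < 3; i++) await fakeRequest(50);
  console.log(`sequential: ~${Date.now() - start}ms`);

  // Parallel: all three timers run together, so the total is the slowest (~50ms).
  start = Date.now();
  await Promise.all([fakeRequest(50), fakeRequest(50), fakeRequest(50)]);
  console.log(`parallel: ~${Date.now() - start}ms`);
};

compare();
```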
Promise.all fails fast: if one request rejects, the whole thing rejects. In a real app you probably don't want a single failed episode to crash the entire character list. A small adjustment handles that gracefully:
const episodes = await Promise.allSettled(
  characters.map((character) =>
    fetch(character.episode[0]).then((res) => res.json()),
  ),
);

const charactersWithEpisodes = characters.map((character, i) => {
  const episode = episodes[i];
  return {
    ...character,
    episodeTitle: episode.status === "fulfilled" ? episode.value.name : "Unknown",
  };
});
Promise.allSettled waits for all promises to complete — whether they succeed or fail — and gives you back the result of each one. One broken request won't take down the rest.
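One subtlety worth noting: fetch only rejects on network failure, so an HTTP 404 or 500 still counts as "fulfilled". If you want Promise.allSettled to treat HTTP errors as failures too, a small helper (fetchJson here is a hypothetical name, not part of any library) can throw on non-ok responses:

```javascript
// Hypothetical helper: reject on HTTP error statuses, not just network failures.
const fetchJson = async (url) => {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`);
  return res.json();
};

// Drop-in replacement for the inline fetch(...).then((res) => res.json()):
// const episodes = await Promise.allSettled(
//   characters.map((character) => fetchJson(character.episode[0])),
// );
```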
Recognizing the pattern in the wild
Sequential fetching rarely announces itself with a visible for loop. Here are the places it hides:
Chained await in effects. A useEffect that looks like this is a waterfall:
useEffect(() => {
  const load = async () => {
    const user = await fetchUser();         // waits
    const posts = await fetchPosts();       // waits for user
    const settings = await fetchSettings(); // waits for posts
  };
  load();
}, []);
If posts and settings don't depend on user, fetch them together:
const [user, posts, settings] = await Promise.all([
  fetchUser(),
  fetchPosts(),
  fetchSettings(),
]);
Nested .then() instead of Promise.all. Here the review requests do go out, but the results are scattered across isolated callbacks — nothing collects them, and nothing can tell you when they have all finished:

fetch("/api/products")
  .then((res) => res.json())
  .then((products) => {
    products.forEach((p) => {
      fetch(`/api/reviews/${p.id}`)
        .then((res) => res.json())
        .then((reviews) => {
          // each result lands alone; no way to await the whole set
        });
    });
  });
Should be:
const products = await fetch("/api/products").then((r) => r.json());
const reviews = await Promise.all(
  products.map((p) => fetch(`/api/reviews/${p.id}`).then((r) => r.json())),
);
Accidental serial in server components. A React Server Component that does:
const user = await db.query("SELECT * FROM users WHERE id = $1", [id]);
const orders = await db.query("SELECT * FROM orders WHERE user_id = $1", [id]);
If the database can handle concurrent queries, fire them together:
const [user, orders] = await Promise.all([
  db.query("SELECT * FROM users WHERE id = $1", [id]),
  db.query("SELECT * FROM orders WHERE user_id = $1", [id]),
]);
When sequential is actually correct
Not everything should be parallel. The rule is simple: if request B needs data from request A, await A first. A search endpoint that returns results from a query, followed by a detail endpoint that needs the ID from the first result — that chain is necessary.
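Such a dependent chain might look like this (search and fetchDetail are hypothetical stand-ins for real endpoints):

```javascript
// Hypothetical endpoints: the detail request needs an ID from the search result,
// so this chain is genuinely sequential.
const search = async (query) => [{ id: 42, name: query }];
const fetchDetail = async (id) => ({ id, bio: "..." });

const loadFirstResult = async (query) => {
  const results = await search(query); // must finish before the next step
  if (results.length === 0) return null;
  return fetchDetail(results[0].id); // depends on the ID from above
};
```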
Also, some APIs have rate limits. Sending 100 parallel requests to a third-party API that caps you at 10 per second will get you blocked. In those cases, batch your requests or use a concurrency limiter. But don't pre-optimize for rate limits you don't have. Most internal APIs, public APIs like Rick & Morty, and your own backend can handle parallel requests just fine.
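If you do hit a rate limit, one simple shape is to process requests in fixed-size batches. This is a sketch, not a production-grade limiter (libraries like p-limit maintain a true sliding window of concurrency):

```javascript
// Run `task` over `items` in batches of `size`: each batch runs in parallel,
// but the next batch only starts once the current one has finished.
const inBatches = async (items, size, task) => {
  const results = [];
  for (let i = 0; i < items.length; i += size) {
    const batch = items.slice(i, i + size);
    results.push(...(await Promise.all(batch.map(task))));
  }
  return results;
};

// e.g. at most 10 episode requests in flight against a rate-limited API:
// const episodes = await inBatches(characters, 10, (c) =>
//   fetch(c.episode[0]).then((r) => r.json()),
// );
```

Note that each batch waits for its slowest member before the next one starts, so a real concurrency limiter is slightly faster; for most cases, batching is the simplest safe option.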
One more thing: loading state
With sequential fetching, you can naively show a loading indicator and it sort of works — the spinner just stays longer than it should. With parallel fetching, you want to be more deliberate about what "loading" means. Show a skeleton or a list of placeholders while the character list loads. Then, as episode titles arrive, fill them in gradually or all at once with Promise.all. The user sees content faster, even if some details take an extra moment.
The important shift is keeping the character list and the episode titles in a single state update. If you set characters first and then update each one with its episode title, you trigger unnecessary re-renders and risk showing incomplete data. Resolve everything together, then set state once:
const [characters, setCharacters] = useState([]);
const [loading, setLoading] = useState(false);

async function load() {
  setLoading(true);
  try {
    const chars = await fetchCharacters(); // fetches + episodes in parallel
    setCharacters(chars);                  // single state update
  } finally {
    setLoading(false);                     // never leave the spinner stuck on error
  }
}
This might seem obvious, but it is easy to mess up when the fetches are split across different functions or hooks. Resolve all independent data, then commit to state.
The performance cost of sequential fetching is invisible during development — local APIs respond in milliseconds. But on a slow 3G connection or a cold serverless function start, those waterfalls add up into seconds. Recognizing when data doesn't depend on other data, and fetching it accordingly, is one of the highest-leverage skills you can build as a frontend developer.
👉 Try it in practice: Rick & Morty: Double Fetching — fix sequential episode fetching and see the difference in the network tab.