Suspense Investigations

Initially, I had planned to have an exercise in the project where we improve the loading experience using Suspense, showing a loading state while we open and parse the MDX content.

When I actually tried to set it up, however, I discovered that in this particular case, Suspense comes with some pretty hefty trade-offs. In this special bonus lesson, we'll dig into it. 😄

If you'd like, spend a few minutes experimenting yourself. As a starting point, you can create a loading.js file that sits beside the blog post page component, and see what the experience is like. See if you're as surprised as I was!
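For reference, a bare-bones loading.js could look something like this. This is only a sketch: the folder is assumed to match the dynamic blog post route, and the markup is just a placeholder:

// /src/app/[postSlug]/loading.js
// Next.js shows this component while the page's async work is pending.
function Loading() {
  return <p>Loading…</p>;
}

export default Loading;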

We'll dig in here:

Video Summary

This video is super visual, exploring a bunch of different loading experiences. It's not really something I can summarize.

Here are the bullet points, if you need a refresher about what's covered in this video:

  • Adding a loading.js component produces a loading state, but it only shows for a brief moment when refreshing the page. This is the case even when throttling the network speed. (Under the hood, loading.js wraps the page in a Suspense boundary; there's a sketch of the equivalent after this list.)
  • This is because the async work we're doing for this page, opening and parsing the MDX file, happens very quickly; on average, it takes only 1-3 milliseconds. And so, the server generates both sets of HTML only a few milliseconds apart, and there isn't really time for the loading state to settle in.
  • Interestingly, we actually do see the loading state for a second or two when using client-side navigation, following an internal link from the blog homepage. I believe this is because Next will “pre-load” loading states, so that they're ready to go.
  • We then look at side-by-side comparisons of this experience, with and without Suspense, on a “Fast 3G” throttle, using the production build. Either way, it takes a little over a second between clicking the link and viewing the content, but the Suspense version shows the loading state for most of this time.
  • Even the non-Suspense version, however, still feels acceptable. Especially for folks who are actually running on a 3G connection, I suspect this blog will feel super quick compared to the other sites they visit. Also, the experience would only get faster as the browser caches the JS bundles. These tests are done with the browser cache disabled.
  • But even if it's a small improvement, surely it's still worth implementing Suspense to gain that small improvement? Why didn't we tackle this in the project?!
  • The “Fast 3G” throttle doesn't tell the whole story. If we bump up the throttle to 15 Mbps, a typical home speed in North America, the story changes considerably. The Suspense version feels very janky, since we jump from the homepage to the loading state to the final content in well under a second. It's dizzying, and it feels buggy.
  • The non-Suspense version, meanwhile, feels smooth and intentional. At this network speed, the click feels almost instantaneous. It's a way smoother experience.
  • And so, what do we do? Which group of internet users do we want to optimize for? As a general rule, I prefer to optimize for low-speed internet users, but in this particular case, the Suspense benefits are so minuscule. Either way, the “Fast 3G” experience feels perfectly acceptable to me. And so the benefit isn't significant enough for me to want to sacrifice the high-speed internet experience.
  • The takeaway: We shouldn't try to fix what isn't broken. The blog performance is already really good. We don't need to use every bell and whistle. Suspense is a wonderful tool with lots of significant use cases, but this isn't one of them.
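To make the “with Suspense” side of that comparison concrete: a loading.js file is essentially shorthand for wrapping the page in a Suspense boundary. Conceptually, it's as if we'd written something like this ourselves (the component names here are illustrative, not actual project code):

import { Suspense } from 'react';

import BlogPost from './page';
import Loading from './loading';

// Rough sketch of what Next.js does automatically when a loading.js
// file sits next to page.js: the fallback renders until the page's
// async work resolves.
function BlogPostRoute(props) {
  return (
    <Suspense fallback={<Loading />}>
      <BlogPost {...props} />
    </Suspense>
  );
}

export default BlogPostRoute;

This is why the comparisons above treat “loading.js” and “Suspense” interchangeably.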

SEO Considerations

In the video above, we saw how the loading state isn't shown for very long because the async work — loading the blog post and parsing it to get the frontmatter — doesn't take very long.

But what if it did take longer?

As an experiment, I went into file-helpers and I set up an artificial delay:

const delay = (ms) =>
  new Promise((resolve) => setTimeout(resolve, ms));

export const loadBlogPost = React.cache(
  async function loadBlogPost(slug) {
    // Artificial delay: wait 2 full seconds before doing the real work.
    await delay(2000);

    const rawContent = await readFile(`/content/${slug}.mdx`);

    // Parse the file to separate the frontmatter from the MDX content.
    const { data: frontmatter, content } = matter(rawContent);

    return { frontmatter, content };
  }
);

This function will now sit and wait for 2 whole seconds before it does the real work.

Curiously, however, we still don't see the loading state for very long, if at all:

Instead of getting 2 seconds of loading state, we get 2 seconds of blank white screen. Then, the loading state whooshes by. I had to cherry-pick this recording—in many cases the loading state isn't even visible for a single second!

Why does this happen? It's because we're calling the loadBlogPost function in the generateMetadata function:

// /src/app/[postSlug]/page.js
export async function generateMetadata({ params }) {
  const { frontmatter } = await loadBlogPost(params.postSlug);

  return {
    title: `${frontmatter.title}${BLOG_TITLE}`,
    description: frontmatter.abstract,
  };
}

async function BlogPost({ params }) {
  // React.cache (see above) memoizes loadBlogPost, so this second call
  // reuses the result from generateMetadata rather than redoing the work.
  const { frontmatter, content } = await loadBlogPost(params.postSlug);

  // ✂️ Omitted for brevity
}

It turns out that the loading state can only be shown after the metadata has been calculated. In other words, the loading state is blocked until the generateMetadata function has finished doing its work.

But why?? Wouldn't it be better to send the loading UI while we're generating the metadata?

It turns out that Next is intentionally architected this way to prevent SEO issues from creeping in.

When Google's web crawler lands on our page, it expects the first part of the streamed HTML to include a fully-formed <head> tag. It isn't expecting things like the page <title> to be dynamically altered later on. And so, if we sent an initial loading UI without a page title, there's no guarantee that search engine crawlers would accurately pick up the final title.

This is true for all metadata in the <head>, including other important bits like canonical URLs. Crawlers can handle the page content arriving in streamed chunks, but not the metadata.

Ultimately, in this particular case, it doesn't really change anything; either way, I don't want to include a loading state for the blog content. But I wanted to mention this, because it's a bit of a surprising gotcha, and it might come up for you in other situations.

There's a bit more information about this in the Next.js “Loading UI” docs.