TL;DR: SPAs break search engines by default. Google queues JavaScript pages for rendering — a process that can take hours to days, and AI crawlers (GPTBot, ClaudeBot, PerplexityBot) don't execute JavaScript at all. The fix is server-side rendering via Next.js, Nuxt, Angular SSR, or SvelteKit. This guide covers the exact setup for each framework, with code, testing steps, and the 2026 rendering landscape.
Single page applications load one HTML shell and fill it with JavaScript. The user gets a fast, app-like experience. Googlebot gets an empty <div id="app"></div>.
I've audited hundreds of SPAs through SEOJuice. The pattern repeats: the site looks beautiful in the browser, JavaScript runs perfectly, and search engines see nothing. Or worse — they see content days later, after sitting in Google's rendering queue. The Angular hash URL bug alone has cost clients more organic traffic than any algorithm update I can remember. One ecommerce client running Angular with HashLocationStrategy had 1,200 product pages. Google saw exactly one page — the homepage. Everything after the # was invisible. That's not a subtle ranking drop. That's 60% of their organic revenue, gone, because of a routing configuration that takes two minutes to fix once you know about it.
Here's how Google's rendering pipeline actually works:
1. Crawl: Googlebot fetches the raw HTML for the URL.
2. Initial processing: Google parses and indexes whatever content that HTML contains. For a CSR SPA, that's an empty shell.
3. Render: the page waits in a queue until the Web Rendering Service executes its JavaScript in headless Chromium.
4. Re-index: the rendered HTML is processed, and the JavaScript-generated content finally becomes indexable.
That queue between step 2 and step 3? It can take anywhere from seconds to days. Your competitors with server-rendered HTML skip the queue entirely — their content is indexed on the first crawl.
But it gets worse. In 2025, Vercel analyzed over a billion crawler requests and found that most AI crawlers don't execute JavaScript at all. GPTBot, ClaudeBot, PerplexityBot — they consume raw, unrendered HTML. An analysis of over half a billion GPTBot fetches found zero evidence of JavaScript execution. If your SPA relies on client-side rendering, your content is invisible to the systems that power ChatGPT, Claude, and Perplexity search.
"If your Next.js site ships critical pages as JavaScript-dependent SPAs, those pages are inaccessible to the systems shaping how people discover information."
This isn't a niche problem. If you want your content discovered by both Google and AI search engines in 2026, server-side rendering is no longer optional — it's a prerequisite.
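To make this concrete, here's a toy sketch of what a non-rendering crawler "reads". The `rawTextContains` helper is illustrative, not any crawler's real parser: it strips scripts, styles, and tags, which roughly approximates text extraction from raw HTML.

```javascript
// Does the server's raw HTML (no JavaScript executed) contain real content,
// or just an empty SPA shell? A rough stand-in for a non-rendering crawler.
function rawTextContains(html, phrase) {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ');
  return text.includes(phrase);
}

// CSR: the content only exists inside JavaScript.
const csrShell = `<html><body><div id="app"></div>
<script>document.getElementById('app').textContent = 'Trail running shoes';</script>
</body></html>`;

// SSR: the content is in the HTML itself.
const ssrPage = '<html><body><article><h1>Trail running shoes</h1></article></body></html>';

console.log(rawTextContains(csrShell, 'Trail running shoes')); // false
console.log(rawTextContains(ssrPage, 'Trail running shoes'));  // true
```

Same page, same content, opposite outcomes for any bot that never runs your JavaScript.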
There are four ways to render a single page application. The choice determines whether search engines see your content immediately, eventually, or never.
| Approach | How It Works | SEO Impact | Time to First Byte | Best For |
|---|---|---|---|---|
| CSR (Client-Side Rendering) | Browser downloads empty HTML shell, JavaScript builds the page | Bad — Google queues for rendering, AI crawlers see nothing | Fast (but empty) | Internal dashboards, admin panels, authenticated pages |
| SSR (Server-Side Rendering) | Server generates full HTML per request, sends complete page | Excellent — full content on first crawl | Variable (server processing time) | Dynamic content: e-commerce, news, user-generated content |
| SSG (Static Site Generation) | Pages pre-built at deploy time as static HTML | Excellent — fastest load, fully crawlable | Fastest (served from CDN) | Blogs, docs, marketing pages, landing pages |
| ISR (Incremental Static Regeneration) | Static pages that regenerate on schedule or on-demand | Excellent — SSG speed with fresh content | Fast (cached with background rebuild) | Large sites with periodically changing content (product catalogs, listings) |
Key Takeaway
Any page you want Google or AI search engines to rank must be server-rendered or pre-rendered. CSR is fine for authenticated pages and dashboards. For everything public-facing, use SSR, SSG, or ISR.
The rule is simple: if a URL needs to rank in search, it needs to deliver complete HTML on the initial response. No exceptions. No "but Google renders JavaScript now." Google does — sometimes, eventually, unreliably, and AI crawlers won't even try. I've had this argument with frontend developers more times than I can count. The conversation usually ends when I show them their site in Google's URL Inspection tool and they see a blank page where their beautiful UI should be.
Plain React (Create React App, Vite with React) is CSR-only. Google will struggle with it. Next.js is the answer — and since the App Router became the default in Next.js 13+, the SEO story has improved dramatically.
With the App Router, components render on the server by default. You only add 'use client' when a component needs browser APIs or interactivity. This means most of your page content ships as pure HTML — no JavaScript rendering required.
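For the interactive pieces, you opt in explicitly. A minimal sketch of a client "island" (the component name and counter are hypothetical, not from any real codebase):

```jsx
// components/LikeButton.jsx — only this component ships JavaScript;
// the server-rendered article around it stays plain, crawlable HTML.
'use client'
import { useState } from 'react'

export function LikeButton() {
  const [likes, setLikes] = useState(0)
  return <button onClick={() => setLikes(likes + 1)}>♥ {likes}</button>
}
```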
// app/blog/[slug]/page.tsx — Server Component (default)
import type { Metadata } from 'next'
import { getPost } from '@/lib/posts' // your data-access helper (path is illustrative)

// Dynamic metadata for each blog post
export async function generateMetadata(
  { params }: { params: { slug: string } }
): Promise<Metadata> {
  const post = await getPost(params.slug)
  return {
    title: post.title,
    description: post.excerpt,
    openGraph: {
      title: post.title,
      description: post.excerpt,
      type: 'article',
      publishedTime: post.publishedAt,
    },
    alternates: {
      canonical: `https://example.com/blog/${params.slug}`,
    },
  }
}

// This component runs on the server — zero JS shipped to the browser
export default async function BlogPost(
  { params }: { params: { slug: string } }
) {
  const post = await getPost(params.slug)
  return (
    <article>
      <h1>{post.title}</h1>
      <div>{post.content}</div>
    </article>
  )
}
For content that doesn't change frequently (blog posts, landing pages), pre-render at build time:
// app/blog/[slug]/page.tsx
export async function generateStaticParams() {
  const posts = await getAllPosts()
  return posts.map((post) => ({ slug: post.slug }))
}

// Combined with the page component above,
// Next.js generates static HTML at build time for every post
For e-commerce sites with thousands of products, use ISR to revalidate pages on a schedule:
// app/products/[id]/page.tsx
export const revalidate = 3600 // Regenerate every hour

export default async function ProductPage({ params }) {
  const product = await getProduct(params.id)
  return <ProductDetail product={product} />
}
If you're stuck on plain React and can't migrate to Next.js, use a pre-rendering service like Prerender.io as a stopgap. But understand this is duct tape, not a real fix. Every month you delay migration is a month of suboptimal indexing.
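If you do go the stopgap route, the usual pattern is user-agent sniffing: known bots get proxied to the prerender service, humans get the SPA. A minimal sketch — the regex and the commented routing are illustrative, not Prerender.io's official middleware (their `prerender-node` package maintains a much fuller bot list):

```javascript
// Rough bot detection for a prerendering stopgap (list is illustrative).
const BOT_UA = /googlebot|bingbot|gptbot|claudebot|perplexitybot|facebookexternalhit/i;

function isBot(userAgent = '') {
  return BOT_UA.test(userAgent);
}

// In an Express app, you might branch on it like this (pseudostructure):
// app.use((req, res, next) => {
//   if (isBot(req.get('user-agent'))) {
//     // proxy the request to the prerender service's rendered-HTML endpoint
//   } else next();
// });

console.log(isBot('Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)')); // true
console.log(isBot('Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)'));                  // false
```

Keep in mind that this serves bots a different pipeline than users, which is exactly why it's duct tape: two code paths, two caches, twice the ways to silently break.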
Same story, different ecosystem. Plain Vue with Vue Router is CSR-only. Nuxt 3 gives you SSR, SSG, ISR, and edge rendering with Nitro — all in one framework.
Nuxt 3 uses universal rendering out of the box. No configuration needed — your pages are server-rendered on first load, then hydrated for client-side navigation.
<!-- pages/blog/[slug].vue -->
<script setup>
const route = useRoute()
const { data: post } = await useFetch(`/api/posts/${route.params.slug}`)

// Per-page SEO metadata
useHead({
  title: post.value.title,
  meta: [
    { name: 'description', content: post.value.excerpt },
    { property: 'og:title', content: post.value.title },
    { property: 'og:description', content: post.value.excerpt },
  ],
  link: [
    { rel: 'canonical', href: `https://example.com/blog/${route.params.slug}` }
  ]
})
</script>

<template>
  <article>
    <h1>{{ post.title }}</h1>
    <div v-html="post.content" />
  </article>
</template>
Nuxt 3's routeRules let you mix rendering strategies in the same app. This is powerful for sites that have both static marketing pages and dynamic content:
// nuxt.config.ts
export default defineNuxtConfig({
  routeRules: {
    '/': { prerender: true },        // SSG — homepage
    '/blog/**': { isr: 3600 },       // ISR — blog posts regenerate hourly
    '/products/**': { ssr: true },   // SSR — dynamic product pages
    '/dashboard/**': { ssr: false }, // CSR — authenticated dashboard
  }
})
Nitro, Nuxt 3's server engine, supports deployment to edge platforms (Cloudflare Workers, Vercel Edge, Netlify Edge). This means your SSR pages render at CDN edge locations closest to the user — sub-50ms TTFB is realistic. For SEO, this means faster page loads, better Core Web Vitals, and more frequent crawling from Google.
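Selecting an edge target is typically a one-line config change. A sketch, assuming a Nitro deployment preset (check Nitro's deployment docs for the exact preset name your platform uses):

```typescript
// nuxt.config.ts — deploy SSR to an edge runtime via a Nitro preset
// ('cloudflare-pages' shown here; 'vercel-edge' and 'netlify-edge' are alternatives)
export default defineNuxtConfig({
  nitro: { preset: 'cloudflare-pages' },
})
```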
Angular has historically been the hardest framework to optimize for SEO. That's changed significantly since Angular 17, which made SSR a first-class citizen with built-in hydration support. No more bolting on Angular Universal as an afterthought.
New projects get SSR by default when you use the CLI:
# New project with SSR enabled
ng new my-app --ssr
# Add SSR to an existing project
ng add @angular/ssr
// product-page.component.ts
import { Component, OnInit } from '@angular/core';
import { Meta, Title } from '@angular/platform-browser';
import { ActivatedRoute } from '@angular/router';
import { ProductService } from './product.service'; // your data service

@Component({
  selector: 'app-product-page',
  template: `
    <article *ngIf="product">
      <h1>{{ product.name }}</h1>
      <p>{{ product.description }}</p>
    </article>
  `
})
export class ProductPageComponent implements OnInit {
  product: any;

  constructor(
    private meta: Meta,
    private title: Title,
    private route: ActivatedRoute,
    private productService: ProductService
  ) {}

  ngOnInit() {
    const id = this.route.snapshot.paramMap.get('id');
    this.productService.getProduct(id).subscribe(product => {
      this.product = product;
      this.title.setTitle(product.name);
      this.meta.updateTag({ name: 'description', content: product.description });
      this.meta.updateTag({ property: 'og:title', content: product.name });
    });
  }
}
Angular 19.2 introduced incremental hydration, building on the @defer API. Components render full HTML on the server but only hydrate on the client when triggered (on viewport, on interaction, etc.). This reduces JavaScript shipped to the browser while keeping full SSR content available to crawlers.
Angular-Specific Gotcha
Hash-based routing (HashLocationStrategy) produces URLs like example.com/#/products. Google ignores everything after the #. Your 500-page product catalog looks like one page. Switch to PathLocationStrategy immediately — it's the default in Angular, but some legacy projects still use hash routing. This is the exact bug that cost the ecommerce client I mentioned at the top 60% of their organic traffic. They'd been running with hash routing for two years. Two years of product pages that Google never knew existed. The fix took 20 minutes. The recovery took four months.
SvelteKit has quietly become one of the best frameworks for SEO out of the box. SSR is enabled by default — you'd have to actively opt out of it. The framework compiles away the framework, shipping minimal JavaScript compared to React or Angular.
Every SvelteKit page is server-rendered by default. Your +page.svelte files generate complete HTML on the first request:
<!-- src/routes/blog/[slug]/+page.svelte -->
<script>
  export let data;
</script>

<svelte:head>
  <title>{data.post.title}</title>
  <meta name="description" content={data.post.excerpt} />
  <link rel="canonical" href={`https://example.com/blog/${data.post.slug}`} />
</svelte:head>

<article>
  <h1>{data.post.title}</h1>
  {@html data.post.content}
</article>

// src/routes/blog/[slug]/+page.server.ts
export async function load({ params }) {
  const post = await getPost(params.slug);
  return { post };
}
Like Nuxt, SvelteKit lets you control rendering per route:
// src/routes/blog/[slug]/+page.ts
export const prerender = true; // SSG — generate at build time

// src/routes/dashboard/+page.ts
export const ssr = false; // CSR-only for authenticated content
Why SvelteKit deserves attention for SEO: The compiler eliminates the framework runtime. A SvelteKit page ships a fraction of the JavaScript that an equivalent Next.js or Nuxt page does. Less JavaScript means faster LCP, better INP scores, and happier Core Web Vitals — all ranking signals.
Every SPA needs to pass these checks. Ordered by impact — fix the top items first.
| # | Check | Why It Matters | How to Fix |
|---|---|---|---|
| 1 | SSR or SSG enabled for all public pages | Without it, Google queues your pages and AI crawlers see nothing | Next.js App Router, Nuxt 3, Angular SSR, or SvelteKit |
| 2 | Clean URLs (no hash fragments) | Hash URLs (#/page) are invisible to search engines | Use history-based routing — the default in all modern frameworks |
| 3 | Unique title tag per page | Same title on every route means Google picks one and ignores the rest | generateMetadata (Next.js), useHead() (Nuxt), Meta service (Angular), <svelte:head> |
| 4 | Unique meta description per page | Controls your snippet in search results | Same libraries as title tags — set per route |
| 5 | Canonical tags on every page | Prevents duplicate content issues from query params and trailing slashes | Add <link rel="canonical"> per page in your metadata |
| 6 | Proper heading hierarchy (H1 > H2 > H3) | Signals content structure to crawlers | One H1 per page, logical nesting in components |
| 7 | Internal links use real <a href> tags | JavaScript-only navigation (onClick handlers) blocks crawlers | Use framework Link components: <Link>, <NuxtLink>, routerLink |
| 8 | XML sitemap submitted | Helps Google discover pages not reachable through links | Generate with next-sitemap, nuxt-simple-sitemap, or framework-specific tooling |
| 9 | Code splitting and lazy loading | Reduces initial JS bundle, improves LCP and INP | Dynamic imports (next/dynamic, defineAsyncComponent, @defer) |
| 10 | Schema markup (JSON-LD) on key pages | Enables rich snippets and helps AI crawlers understand content | JSON-LD in <head> or <body> — server-rendered, not injected via JS |
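For item 10, the key phrase is server-rendered: the JSON-LD must be present in the initial HTML response, not injected by a client-side script. A framework-agnostic sketch (field values are illustrative):

```javascript
// Build a JSON-LD <script> tag on the server so crawlers see it in raw HTML.
function productJsonLd(product) {
  const schema = {
    '@context': 'https://schema.org',
    '@type': 'Product',
    name: product.name,
    description: product.description,
  };
  return `<script type="application/ld+json">${JSON.stringify(schema)}</script>`;
}

const tag = productJsonLd({ name: 'Trail Shoe', description: 'Lightweight trail runner' });
console.log(tag.includes('"@type":"Product"')); // true — present in server HTML
```

In Next.js, Nuxt, or SvelteKit you'd emit this string in the server-rendered head or body; the point is that it exists before any JavaScript runs.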
These are the mistakes I see on nearly every SPA audit. Every single one kills rankings.
"But Google renders JavaScript!" — yes, sometimes, eventually, if your JS doesn't throw an error, and if the rendering queue isn't backed up, and if your JavaScript doesn't time out, and if none of your third-party scripts interfere with the render. That's a lot of ifs to bet your organic traffic on. And AI crawlers won't even try.
// WRONG — CSR-only React app
// Google sees: <div id="root"></div>
import { createRoot } from 'react-dom/client';

const root = createRoot(document.getElementById('root'));
root.render(<App />);

// RIGHT — Next.js Server Component
// Google sees: fully rendered HTML with all content
export default async function Page() {
  const data = await fetchData();
  return <Article data={data} />;
}
Angular's HashLocationStrategy and Vue Router's hash mode produce URLs like example.com/#/products/shoes. Google ignores everything after the #. Your entire application looks like one page.
// WRONG — Vue Router hash mode
const router = createRouter({
  history: createWebHashHistory(), // URLs: example.com/#/about
  routes
})

// RIGHT — Vue Router history mode
const router = createRouter({
  history: createWebHistory(), // URLs: example.com/about
  routes
})
SPAs ship with one <title> in the HTML shell. If you don't dynamically update it per route, every page has the same title. Google will index one version and ignore the rest.
<!-- WRONG — static HTML shell with one title -->
<!DOCTYPE html>
<html>
  <head>
    <title>My App</title> <!-- Same for every route -->
  </head>
  <body><div id="app"></div></body>
</html>

<!-- RIGHT — dynamic metadata per route (Nuxt example) -->
<script setup>
useHead({
  title: `${product.name} | MyStore`,
  meta: [{ name: 'description', content: product.summary }]
})
</script>
A single uncaught error in your application can prevent the entire page from rendering. Google's Web Rendering Service won't retry. The page stays empty in the index.
I see this constantly: a third-party analytics script throws an error, or a missing API response crashes the component tree. The result? Google indexes a blank page. One client had a Hotjar snippet that occasionally failed on Google's renderer. Their product pages would randomly show as blank in GSC's URL Inspection tool. The fix was adding an error boundary around the analytics initialization. Took 10 minutes. Had been silently harming their indexing for months.
Fix: Add error boundaries in React (ErrorBoundary components), use NuxtErrorBoundary in Nuxt, and always test with JavaScript disabled (more on this below).
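Alongside framework error boundaries, the cheapest insurance is wrapping third-party initialization so a failure can't take down the render. A sketch (`safeInit` is a hypothetical helper, not a library API):

```javascript
// Isolate flaky third-party snippets: if they throw, log and keep rendering.
function safeInit(name, initFn) {
  try {
    initFn();
    return true;
  } catch (err) {
    console.warn(`${name} failed to initialize:`, err.message);
    return false; // the page, and its indexability, survive
  }
}

const ok = safeInit('analytics', () => { throw new Error('blocked in renderer'); });
console.log(ok); // false, but nothing else broke
```

The same principle applies to data fetching: a missing API field should degrade one component, never the whole component tree.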
SPAs with client-side routing don't expose their URL structure through HTML links the way multi-page sites do. Without a sitemap, Google discovers pages only through crawlable links — and if your internal linking is JavaScript-driven, it might find nothing.
# Install next-sitemap for Next.js
npm install next-sitemap
// next-sitemap.config.js
module.exports = {
  siteUrl: 'https://example.com',
  generateRobotsTxt: true,
  changefreq: 'weekly',
  // Exclude routes that shouldn't be indexed
  exclude: ['/dashboard/*', '/api/*', '/admin/*'],
}
If your robots.txt blocks CSS or JS files, Google's renderer can't build your page. We see this surprisingly often — usually a leftover default configuration.
# WRONG — blocking JS and CSS
User-agent: *
Disallow: /static/js/
Disallow: /static/css/
# RIGHT — allow all resources needed for rendering
User-agent: *
Disallow: /api/
Disallow: /dashboard/
Allow: /static/

Don't assume your SPA is crawlable because it works in Chrome. Here's how to verify what search engines actually see.
The URL Inspection Tool is your single best diagnostic. Enter any URL and click "Test Live URL." Google will fetch your page, render it with its Web Rendering Service, and show you the result.
What to look for:
- The rendered HTML under "View tested page": does your actual content appear, or just the empty shell?
- The screenshot: does it match what users see in a browser?
- "Page resources": are any CSS or JS files blocked from loading?
- JavaScript console messages: errors here often explain a blank render.
- Whether indexing is allowed, and which canonical URL Google selected.
Our free SEO audit tool scans your SPA for rendering issues, missing meta tags, broken links, and accessibility problems. It shows you exactly what search engines see — no login required, results in 30 seconds.
Since AI crawlers don't render JavaScript, check whether they can access your content using the AI Crawler Inspector. It simulates how GPTBot, ClaudeBot, and other AI bots see your pages.
The SPA SEO landscape in 2026 is materially different from even two years ago. Here's what matters.
With Next.js App Router, components render on the server by default. You explicitly opt into client-side rendering with 'use client'. This inverts the old model — instead of everything being CSR with SSR bolted on, everything is SSR with CSR added where needed. The SEO implications are profound: most of your application ships as pure HTML automatically.
Nuxt 3 with Nitro, Next.js Edge Runtime, and SvelteKit adapters now deploy SSR to CDN edge locations (Cloudflare Workers, Vercel Edge Functions). Your pages render at the edge node closest to the user — or crawler. Sub-50ms TTFB is achievable globally.
Angular 17+ built SSR into the CLI with ng new --ssr. Angular 19 added incremental hydration. The days of painful Angular Universal setup are over. If you're still running a CSR-only Angular app, the migration path is now straightforward.
GPTBot, ClaudeBot, PerplexityBot, and others are crawling the web to build training data and power AI search. None of them execute JavaScript. Analysis of over half a billion GPTBot fetches found zero evidence of JS execution. If you want your content cited in AI-generated answers, SSR is the only option.
"Seeing rendered HTML, not just source code, is important in diagnosing if there's an indexing problem. Google can only discover your links if they are `<a>` HTML elements with an `href` attribute."
LCP, INP, and CLS remain ranking signals. SPAs that ship large JavaScript bundles get penalized on LCP (page takes too long to show meaningful content) and INP (interactions feel sluggish). Code splitting, lazy loading, and minimal client-side JavaScript aren't optional — they're performance requirements that directly impact rankings.
"Prerendering your site statically or on the server is the best practice for SEO — it makes crawlers willing to reindex your site more frequently."
Not sure which framework to pick? Here's how they compare for SEO specifically.
| Feature | Next.js (React) | Nuxt 3 (Vue) | Angular SSR | SvelteKit |
|---|---|---|---|---|
| SSR by default | Yes (App Router) | Yes | Yes (since v17) | Yes |
| Static generation | generateStaticParams | prerender: true | ng build --prerender | export const prerender = true |
| ISR support | Built-in (revalidate) | Built-in (isr: seconds) | Manual via service worker | Via adapter-specific config |
| Edge rendering | Vercel Edge, Cloudflare | Nitro (multi-platform) | Limited | Multiple adapters |
| Per-page metadata API | generateMetadata | useHead() | Meta + Title services | <svelte:head> |
| Client JS shipped | Moderate (RSC helps) | Moderate | Heavy | Minimal (compiled away) |
| Community/ecosystem | Largest | Strong | Enterprise-focused | Growing fast |
Can Google render JavaScript? Yes, but with caveats. Google's Web Rendering Service uses a headless Chromium browser to execute JavaScript and render SPAs. However, this rendering happens in a separate queue after the initial crawl, which can delay indexing by hours to days. Pages with JavaScript errors may never render. For reliable indexing, use SSR or SSG — don't rely on Google's rendering queue for time-sensitive content.
A CSR-only SPA is bad for SEO. An SPA with server-side rendering is not. The framework doesn't matter — what matters is whether the initial HTML response contains your content. Next.js, Nuxt, Angular SSR, and SvelteKit all produce fully crawlable SPAs. The "SPA vs. SEO" problem was solved years ago; the issue is developers who don't implement the solution.
For any page that needs to rank in search? Yes. React alone (via Vite or Create React App) is CSR-only — search engines see an empty page. Next.js adds SSR, SSG, and ISR. With React Server Components in the App Router, most components render on the server by default. If your React app serves public-facing content, Next.js is the standard solution.
Do AI crawlers render JavaScript? They don't. GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot, and other AI crawlers do not execute JavaScript. They parse raw HTML only. Analysis of over half a billion GPTBot requests found zero evidence of JavaScript rendering. If your content is client-side rendered, it's invisible to AI search engines. SSR is the only way to appear in AI-generated answers and citations.
SSR (Server-Side Rendering) generates HTML on each request — best for dynamic content. SSG (Static Site Generation) generates HTML at build time — fastest but requires rebuilds for content changes. ISR (Incremental Static Regeneration) generates static pages that automatically regenerate on a schedule — combines SSG speed with content freshness. All three deliver full HTML to search engines on the first request. Choose based on how often your content changes.
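That rule of thumb can be condensed into a toy decision function (the thresholds are illustrative, not official guidance):

```javascript
// Pick a rendering mode from two questions: is the page public,
// and how often does its content change?
function pickRendering({ isPublic, changesPerDay }) {
  if (!isPublic) return 'CSR';            // dashboards, authenticated pages
  if (changesPerDay === 0) return 'SSG';  // docs, marketing pages
  if (changesPerDay <= 24) return 'ISR';  // catalogs, listings
  return 'SSR';                           // truly per-request content
}

console.log(pickRendering({ isPublic: true, changesPerDay: 0 }));   // 'SSG'
console.log(pickRendering({ isPublic: true, changesPerDay: 6 }));   // 'ISR'
console.log(pickRendering({ isPublic: false, changesPerDay: 99 })); // 'CSR'
```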
SPAs aren't inherently bad for SEO. But CSR-only SPAs are, and in 2026, they're invisible to AI search engines too.
The fix has been available for years. Next.js, Nuxt 3, Angular SSR, and SvelteKit all deliver server-rendered HTML out of the box. React Server Components made the default even better — most of your app renders on the server automatically. There's no excuse to ship a public-facing SPA without server rendering.
If you're unsure whether your SPA is crawlable, run a free audit — it takes 30 seconds and shows you exactly what Google sees. For AI crawler visibility, check the AI Crawler Inspector.
Love this deep dive on SPAs and client-side rendering — the React/Vue/Angular SEO pitfalls were explained really well! 🙌 I migrated a React SPA to hybrid SSR + prerendering and saw a crawl/index uptick in under a week. Please do a tutorial on hydration and sitemap strategies next 🙏
Nice — glad it helped and awesome you saw a quick uptick! I did the same move from CRA SPA → hybrid SSR + prerendering last year and saw similar gains, fwiw.
A few practical tips from that migration that might help for the tutorial you asked for:
- Hydration pitfalls: mismatches usually come from non-deterministic things in render (Date.now(), Math.random(), generated IDs, or useEffect producing visible DOM changes). Fix by moving client-only stuff into useEffect or guarding it (if (typeof window === 'undefined') ...), or use deterministic id libs.
- Streaming/partial hydration: if you’re using React 18/Next, streaming SSR + selective client hydration (islands-ish or client boundary components) reduces TTI without sacrificing SEO — imo worth covering.
- Debugging: curl or fetch the page server-side and compare to what Chrome renders after hydration; React devtools console will show hydration mismatch warnings. Also check Search Console’s “Inspect URL” to see what Googlebot sees.
- Sitemap strategy: generate sitemaps at build for static routes, dynamically for API-driven content (rebuild or incremental), split into sitemap index if >50k URLs, include lastmod, and reference it from robots.txt. For multi-lingual sites include hreflang entries or separate sitemaps per locale.
- Tools I used: Next.js (SSR + static props), next-sitemap for generation, prerender.io for tricky bots, and Search Console + server logs to confirm indexing.
If you want, I can write that hydration + sitemap tutorial — what would you prefer: code-heavy step-by-step for Next.js, or framework-agnostic notes + examples? Any stack specifics (Next/Remix/Vite/Netlify) you’re on?
ngl SPAs can rank.