SEO Best Practices for Single-Page Applications

Vadim Kravcenko
Oct 25, 2024 · 6 min read

TL;DR: SPAs break search engines by default. Google queues JavaScript pages for rendering — a process that can take hours to days, and AI crawlers (GPTBot, ClaudeBot, PerplexityBot) don't execute JavaScript at all. The fix is server-side rendering via Next.js, Nuxt, Angular SSR, or SvelteKit. This guide covers the exact setup for each framework, with code, testing steps, and the 2026 rendering landscape.

SPAs Have an SEO Problem — and It's Bigger Than You Think

Single page applications load one HTML shell and fill it with JavaScript. The user gets a fast, app-like experience. Googlebot gets an empty <div id="app"></div>.

I've audited hundreds of SPAs through SEOJuice. The pattern repeats: the site looks beautiful in the browser, JavaScript runs perfectly, and search engines see nothing. Or worse — they see content days later, after sitting in Google's rendering queue. The Angular hash URL bug alone has cost clients more organic traffic than any algorithm update I can remember. One ecommerce client running Angular with HashLocationStrategy had 1,200 product pages. Google saw exactly one page — the homepage. Everything after the # was invisible. That's not a subtle ranking drop. That's 60% of their organic revenue, gone, because of a routing configuration that takes two minutes to fix once you know about it.

Here's how Google's rendering pipeline actually works:

  1. Crawl — Googlebot fetches your URL and gets the raw HTML response
  2. Queue — If the page needs JavaScript, it enters a rendering queue
  3. Render — Google's Web Rendering Service (a headless Chromium instance) executes JavaScript
  4. Index — The rendered HTML is parsed for content and links

That queue between step 2 and step 3? It can take anywhere from seconds to days. Your competitors with server-rendered HTML skip the queue entirely — their content is indexed on the first crawl.

But it gets worse. In 2025, Vercel analyzed over a billion crawler requests and found that most AI crawlers don't execute JavaScript at all. GPTBot, ClaudeBot, PerplexityBot — they consume raw, unrendered HTML. An analysis of over half a billion GPTBot fetches found zero evidence of JavaScript execution. If your SPA relies on client-side rendering, your content is invisible to the systems that power ChatGPT, Claude, and Perplexity search.

"If your Next.js site ships critical pages as JavaScript-dependent SPAs, those pages are inaccessible to the systems shaping how people discover information."

— Vercel Engineering Blog (source)

This isn't a niche problem. If you want your content discovered by both Google and AI search engines in 2026, server-side rendering is no longer optional — it's a prerequisite.

CSR vs SSR vs SSG vs ISR — The Rendering Decision

There are four ways to render a single page application. The choice determines whether search engines see your content immediately, eventually, or never.

| Approach | How It Works | SEO Impact | Time to First Byte | Best For |
| --- | --- | --- | --- | --- |
| CSR (Client-Side Rendering) | Browser downloads empty HTML shell, JavaScript builds the page | Bad — Google queues for rendering, AI crawlers see nothing | Fast (but empty) | Internal dashboards, admin panels, authenticated pages |
| SSR (Server-Side Rendering) | Server generates full HTML per request, sends complete page | Excellent — full content on first crawl | Variable (server processing time) | Dynamic content: e-commerce, news, user-generated content |
| SSG (Static Site Generation) | Pages pre-built at deploy time as static HTML | Excellent — fastest load, fully crawlable | Fastest (served from CDN) | Blogs, docs, marketing pages, landing pages |
| ISR (Incremental Static Regeneration) | Static pages that regenerate on schedule or on-demand | Excellent — SSG speed with fresh content | Fast (cached with background rebuild) | Large sites with periodically changing content (product catalogs, listings) |

Key Takeaway

Any page you want Google or AI search engines to rank must be server-rendered or pre-rendered. CSR is fine for authenticated pages and dashboards. For everything public-facing, use SSR, SSG, or ISR.

The rule is simple: if a URL needs to rank in search, it needs to deliver complete HTML on the initial response. No exceptions. No "but Google renders JavaScript now." Google does — sometimes, eventually, unreliably, and AI crawlers won't even try. I've had this argument with frontend developers more times than I can count. The conversation usually ends when I show them their site in Google's URL Inspection tool and they see a blank page where their beautiful UI should be.
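You can run the same check yourself from the command line. A quick sketch (the HTML below simulates a typical CSR shell; for a real page, fetch it with curl -s https://example.com/your-page -o page.html instead):

```shell
# Simulate what a non-rendering crawler receives: the raw HTML body, before
# any JavaScript runs. This sample is a typical CSR shell.
cat > page.html <<'EOF'
<!DOCTYPE html>
<html><head><title>My App</title></head>
<body><div id="root"></div><script src="/bundle.js"></script></body></html>
EOF

# Search the raw HTML for a phrase that should appear in your content:
if grep -q "Product description" page.html; then
  echo "content present in raw HTML"
else
  echo "content missing: crawlers that skip JS see an empty shell"
fi
```

If the phrase only appears after JavaScript runs in a browser, the page fails the rule above.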

Framework-Specific Guide: React / Next.js

Plain React (Create React App, Vite with React) is CSR-only. Google will struggle with it. Next.js is the answer — and since the App Router became the default in Next.js 13+, the SEO story has improved dramatically.

React Server Components — The 2026 Default

With the App Router, components render on the server by default. You only add 'use client' when a component needs browser APIs or interactivity. This means most of your page content ships as pure HTML — no JavaScript rendering required.

// app/blog/[slug]/page.tsx — Server Component (default)
import { Metadata } from 'next'

// Dynamic metadata for each blog post
export async function generateMetadata({ params }): Promise<Metadata> {
  const post = await getPost(params.slug)
  return {
    title: post.title,
    description: post.excerpt,
    openGraph: {
      title: post.title,
      description: post.excerpt,
      type: 'article',
      publishedTime: post.publishedAt,
    },
    alternates: {
      canonical: `https://example.com/blog/${params.slug}`,
    },
  }
}

// This component runs on the server — zero JS shipped to the browser
export default async function BlogPost({ params }) {
  const post = await getPost(params.slug)

  return (
    <article>
      <h1>{post.title}</h1>
      <div>{post.content}</div>
    </article>
  )
}

Static Generation with generateStaticParams

For content that doesn't change frequently (blog posts, landing pages), pre-render at build time:

// app/blog/[slug]/page.tsx
export async function generateStaticParams() {
  const posts = await getAllPosts()
  return posts.map((post) => ({ slug: post.slug }))
}

// Combined with the page component above,
// Next.js generates static HTML at build time for every post

ISR for Large Catalogs

For e-commerce sites with thousands of products, use ISR to revalidate pages on a schedule:

// app/products/[id]/page.tsx
export const revalidate = 3600 // Regenerate every hour

export default async function ProductPage({ params }) {
  const product = await getProduct(params.id)
  return <ProductDetail product={product} />
}

If you're stuck on plain React and can't migrate to Next.js, use a pre-rendering service like Prerender.io as a stopgap. But understand this is duct tape, not a real fix. Every month you delay migration is a month of suboptimal indexing.
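The core of such a stopgap is User-Agent detection: bot traffic gets pre-rendered HTML, humans get the SPA. A minimal sketch of that detection (the bot list is illustrative, not exhaustive):

```javascript
// Sketch: decide whether a request should be served pre-rendered HTML,
// based on its User-Agent. The pattern list is illustrative.
const BOT_PATTERNS = [
  /googlebot/i,
  /bingbot/i,
  /gptbot/i,
  /claudebot/i,
  /perplexitybot/i,
];

function shouldPrerender(userAgent) {
  return Boolean(userAgent) && BOT_PATTERNS.some((re) => re.test(userAgent));
}

// In an Express app, this would gate a proxy to the pre-rendering service:
//   app.use((req, res, next) => {
//     if (shouldPrerender(req.headers['user-agent'])) { /* proxy to service */ }
//     else next();
//   });
console.log(shouldPrerender('Mozilla/5.0 (compatible; GPTBot/1.0)')); // true
```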

Framework-Specific Guide: Vue / Nuxt 3

Same story, different ecosystem. Plain Vue with Vue Router is CSR-only. Nuxt 3 gives you SSR, SSG, ISR, and edge rendering with Nitro — all in one framework.

Universal Rendering (SSR by Default)

Nuxt 3 uses universal rendering out of the box. No configuration needed — your pages are server-rendered on first load, then hydrated for client-side navigation.

<!-- pages/blog/[slug].vue -->
<script setup>
const route = useRoute()
const { data: post } = await useFetch(`/api/posts/${route.params.slug}`)

// Per-page SEO metadata
useHead({
  title: post.value.title,
  meta: [
    { name: 'description', content: post.value.excerpt },
    { property: 'og:title', content: post.value.title },
    { property: 'og:description', content: post.value.excerpt },
  ],
  link: [
    { rel: 'canonical', href: `https://example.com/blog/${route.params.slug}` }
  ]
})
</script>

<template>
  <article>
    <h1>{{ post.title }}</h1>
    <div v-html="post.content" />
  </article>
</template>

Hybrid Rendering — Different Strategies Per Route

Nuxt 3's routeRules let you mix rendering strategies in the same app. This is powerful for sites that have both static marketing pages and dynamic content:

// nuxt.config.ts
export default defineNuxtConfig({
  routeRules: {
    '/':          { prerender: true },           // SSG — homepage
    '/blog/**':   { isr: 3600 },                 // ISR — blog posts regenerate hourly
    '/products/**': { ssr: true },               // SSR — dynamic product pages
    '/dashboard/**': { ssr: false },             // CSR — authenticated dashboard
  }
})

Edge Rendering with Nitro

Nitro, Nuxt 3's server engine, supports deployment to edge platforms (Cloudflare Workers, Vercel Edge, Netlify Edge). This means your SSR pages render at CDN edge locations closest to the user — sub-50ms TTFB is realistic. For SEO, this means faster page loads, better Core Web Vitals, and more frequent crawling from Google.
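Opting into an edge target is a one-line preset in most cases. A sketch, assuming a Cloudflare Pages deployment (other Nitro presets follow the same pattern):

```typescript
// nuxt.config.ts — sketch: deploy SSR to Cloudflare's edge network.
// The preset name targets Cloudflare Pages; swap it for your platform.
export default defineNuxtConfig({
  nitro: {
    preset: 'cloudflare-pages',
  },
})
```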

Framework-Specific Guide: Angular SSR

Angular has historically been the hardest framework to optimize for SEO. That's changed significantly since Angular 17, which made SSR a first-class citizen with built-in hydration support. No more bolting on Angular Universal as an afterthought.

Setting Up SSR in Angular 17+

New projects get SSR by default when you use the CLI:

# New project with SSR enabled
ng new my-app --ssr

# Add SSR to an existing project
ng add @angular/ssr

Dynamic Meta Tags with Angular's Services

// product-page.component.ts
import { Component, OnInit } from '@angular/core';
import { Meta, Title } from '@angular/platform-browser';
import { ActivatedRoute } from '@angular/router';

@Component({
  selector: 'app-product-page',
  template: `
    <article>
      <h1>{{ product.name }}</h1>
      <p>{{ product.description }}</p>
    </article>
  `
})
export class ProductPageComponent implements OnInit {
  product: any;

  constructor(
    private meta: Meta,
    private title: Title,
    private route: ActivatedRoute,
    private productService: ProductService
  ) {}

  ngOnInit() {
    const id = this.route.snapshot.paramMap.get('id');
    this.productService.getProduct(id).subscribe(product => {
      this.product = product;
      this.title.setTitle(product.name);
      this.meta.updateTag({ name: 'description', content: product.description });
      this.meta.updateTag({ property: 'og:title', content: product.name });
    });
  }
}

Incremental Hydration (Angular 19+)

Angular 19.2 introduced incremental hydration, building on the @defer API. Components render full HTML on the server but only hydrate on the client when triggered (on viewport, on interaction, etc.). This reduces JavaScript shipped to the browser while keeping full SSR content available to crawlers.
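A sketch of what opting in looks like, assuming a standalone app (AppComponent and app-reviews are illustrative names):

```typescript
// main.ts — enable incremental hydration (sketch, Angular 19.2+).
import { bootstrapApplication, provideClientHydration,
         withIncrementalHydration } from '@angular/platform-browser';
import { AppComponent } from './app/app.component';

bootstrapApplication(AppComponent, {
  providers: [provideClientHydration(withIncrementalHydration())],
});

// In a template, defer hydration until the block scrolls into view.
// The server-rendered HTML is present for crawlers the whole time:
//
//   @defer (hydrate on viewport) {
//     <app-reviews />
//   }
```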

Angular-Specific Gotcha

Hash-based routing (HashLocationStrategy) produces URLs like example.com/#/products. Google ignores everything after the #, so your 500-page product catalog looks like one page. Switch to PathLocationStrategy immediately — it's the default in Angular, but some legacy projects still use hash routing. This is the exact bug that cost the ecommerce client I mentioned at the top 60% of their organic traffic. They'd been running with hash routing for two years. Two years of product pages that Google never knew existed. The fix took 20 minutes. The recovery took four months.

Framework-Specific Guide: SvelteKit

SvelteKit has quietly become one of the best frameworks for SEO out of the box. SSR is enabled by default — you'd have to actively opt out of it. And Svelte compiles the framework away at build time, shipping minimal JavaScript compared to React or Angular.

Default SSR — No Configuration Needed

Every SvelteKit page is server-rendered by default. Your +page.svelte files generate complete HTML on the first request:

<!-- src/routes/blog/[slug]/+page.svelte -->
<script>
  export let data;
</script>

<svelte:head>
  <title>{data.post.title}</title>
  <meta name="description" content={data.post.excerpt} />
  <link rel="canonical" href={`https://example.com/blog/${data.post.slug}`} />
</svelte:head>

<article>
  <h1>{data.post.title}</h1>
  {@html data.post.content}
</article>

The data prop is supplied by a matching server load function:

// src/routes/blog/[slug]/+page.server.ts
export async function load({ params }) {
  const post = await getPost(params.slug);
  return { post };
}

Hybrid Rendering Per Route

Like Nuxt, SvelteKit lets you control rendering per route:

// src/routes/blog/[slug]/+page.ts
export const prerender = true;  // SSG — generate at build time

// src/routes/dashboard/+page.ts
export const ssr = false;       // CSR-only for authenticated content

Why SvelteKit deserves attention for SEO: The compiler eliminates the framework runtime. A SvelteKit page ships a fraction of the JavaScript that an equivalent Next.js or Nuxt page does. Less JavaScript means faster LCP, better INP scores, and happier Core Web Vitals — all ranking signals.

The SPA SEO Checklist

Every SPA needs to pass these checks. Ordered by impact — fix the top items first.

| # | Check | Why It Matters | How to Fix |
| --- | --- | --- | --- |
| 1 | SSR or SSG enabled for all public pages | Without it, Google queues your pages and AI crawlers see nothing | Next.js App Router, Nuxt 3, Angular SSR, or SvelteKit |
| 2 | Clean URLs (no hash fragments) | Hash URLs (#/page) are invisible to search engines | Use history-based routing — the default in all modern frameworks |
| 3 | Unique title tag per page | Same title on every route means Google picks one and ignores the rest | generateMetadata (Next.js), useHead() (Nuxt), Meta service (Angular), <svelte:head> |
| 4 | Unique meta description per page | Controls your snippet in search results | Same libraries as title tags — set per route |
| 5 | Canonical tags on every page | Prevents duplicate content issues from query params and trailing slashes | Add <link rel="canonical"> per page in your metadata |
| 6 | Proper heading hierarchy (H1 > H2 > H3) | Signals content structure to crawlers | One H1 per page, logical nesting in components |
| 7 | Internal links use real <a href> tags | JavaScript-only navigation (onClick handlers) blocks crawlers | Use framework Link components: <Link>, <NuxtLink>, routerLink |
| 8 | XML sitemap submitted | Helps Google discover pages not reachable through links | Generate with next-sitemap, nuxt-simple-sitemap, or framework-specific tooling |
| 9 | Code splitting and lazy loading | Reduces initial JS bundle, improves LCP and INP | Dynamic imports (next/dynamic, defineAsyncComponent, @defer) |
| 10 | Schema markup (JSON-LD) on key pages | Enables rich snippets and helps AI crawlers understand content | JSON-LD in <head> or <body> — server-rendered, not injected via JS |
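For the schema markup item, the key word is server-rendered: the JSON-LD must exist in the initial HTML, not be injected by client-side JavaScript. A minimal sketch of building an Article payload (field values are illustrative); the serialized string goes into a script tag of type application/ld+json in your server-rendered head or body:

```javascript
// Build a JSON-LD object for an article page. In an SSR framework,
// serialize this on the server so crawlers see it without running JS.
function articleJsonLd(post) {
  return {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: post.title,
    description: post.excerpt,
    datePublished: post.publishedAt,
    author: { '@type': 'Person', name: post.author },
  };
}

const json = JSON.stringify(
  articleJsonLd({
    title: 'SEO for SPAs',
    excerpt: 'Why SSR matters',
    publishedAt: '2024-10-25',
    author: 'Vadim Kravcenko',
  })
);
// json is what you embed in <script type="application/ld+json">…</script>
```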

Common SPA SEO Mistakes — With Code Examples

These are the mistakes I see on nearly every SPA audit. Every single one kills rankings.

Mistake 1: Relying on CSR for Public Pages

"But Google renders JavaScript!" — yes, sometimes, eventually, if your JS doesn't throw an error, and if the rendering queue isn't backed up, and if your JavaScript doesn't time out, and if none of your third-party scripts interfere with the render. That's a lot of ifs to bet your organic traffic on. And AI crawlers won't even try.

// WRONG — CSR-only React app
// Google sees: <div id="root"></div>
import { createRoot } from 'react-dom/client';
const root = createRoot(document.getElementById('root'));
root.render(<App />);

// RIGHT — Next.js Server Component
// Google sees: fully rendered HTML with all content
export default async function Page() {
  const data = await fetchData();
  return <Article data={data} />;
}

Mistake 2: Hash-Based Routing

Angular's HashLocationStrategy and Vue Router's hash mode produce URLs like example.com/#/products/shoes. Google ignores everything after the #. Your entire application looks like one page.

// WRONG — Vue Router hash mode
const router = createRouter({
  history: createWebHashHistory(),  // URLs: example.com/#/about
  routes
})

// RIGHT — Vue Router history mode
const router = createRouter({
  history: createWebHistory(),      // URLs: example.com/about
  routes
})

Mistake 3: Same Title Tag on Every Page

SPAs ship with one <title> in the HTML shell. If you don't dynamically update it per route, every page has the same title. Google will index one version and ignore the rest.

<!-- WRONG — static HTML shell with one title -->
<!DOCTYPE html>
<html>
  <head>
    <title>My App</title>  <!-- Same for every route -->
  </head>
  <body><div id="app"></div></body>
</html>

<!-- RIGHT — dynamic metadata per route (Nuxt example) -->
<script setup>
useHead({
  title: `${product.name} | MyStore`,
  meta: [{ name: 'description', content: product.summary }]
})
</script>

Mistake 4: JavaScript Errors Blocking Rendering

A single uncaught error in your application can prevent the entire page from rendering. Google's Web Rendering Service won't retry. The page stays empty in the index.

I see this constantly: a third-party analytics script throws an error, or a missing API response crashes the component tree. The result? Google indexes a blank page. One client had a Hotjar snippet that occasionally failed on Google's renderer. Their product pages would randomly show as blank in GSC's URL Inspection tool. The fix was adding an error boundary around the analytics initialization. Took 10 minutes. Had been silently harming their indexing for months.

Fix: Add error boundaries in React (ErrorBoundary components), use NuxtErrorBoundary in Nuxt, and always test with JavaScript disabled (more on this below).
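For scripts initialized outside the component tree, like that Hotjar snippet, a plain guard achieves the same isolation as an error boundary. A minimal sketch:

```javascript
// Wrap third-party initialization so a failure can't take down rendering.
function safeInit(name, initFn) {
  try {
    initFn();
    return true;
  } catch (err) {
    // Log and continue: the page still renders for users and crawlers.
    console.warn(`${name} failed to initialize:`, err.message);
    return false;
  }
}

// Usage: a crashing analytics snippet no longer blocks the page.
safeInit('analytics', () => {
  throw new Error('third-party script unavailable');
});
```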

Mistake 5: No Sitemap

SPAs with client-side routing don't expose their URL structure through HTML links the way multi-page sites do. Without a sitemap, Google discovers pages only through crawlable links — and if your internal linking is JavaScript-driven, it might find nothing.

# Install next-sitemap for Next.js
npm install next-sitemap

# next-sitemap.config.js
module.exports = {
  siteUrl: 'https://example.com',
  generateRobotsTxt: true,
  changefreq: 'weekly',
  // Exclude routes that shouldn't be indexed
  exclude: ['/dashboard/*', '/api/*', '/admin/*'],
}

Mistake 6: Blocking JavaScript in robots.txt

If your robots.txt blocks CSS or JS files, Google's renderer can't build your page. I see this surprisingly often — usually a leftover default configuration.

# WRONG — blocking JS and CSS
User-agent: *
Disallow: /static/js/
Disallow: /static/css/

# RIGHT — allow all resources needed for rendering
User-agent: *
Disallow: /api/
Disallow: /dashboard/
Allow: /static/

How to Test Your SPA's SEO — Step by Step

[Image: A Lighthouse performance audit report showing scores for Performance, Accessibility, Best Practices, and SEO. JavaScript-heavy SPAs often score lower on performance metrics due to client-side rendering overhead. Source: Shopify]

Don't assume your SPA is crawlable because it works in Chrome. Here's how to verify what search engines actually see.

Test 1: Google Search Console URL Inspection

The URL Inspection Tool is your single best diagnostic. Enter any URL and click "Test Live URL." Google will fetch your page, render it with its Web Rendering Service, and show you the result.

What to look for:

  • Screenshot tab — Does it show your actual content, or a blank page / loading spinner?
  • Rendered HTML tab — Is your content in the DOM? Search for key text strings.
  • More Info tab — Check for JavaScript console errors. Any uncaught errors mean rendering may have failed.
  • Page resources — Verify that no critical JS/CSS files are blocked or returning errors.

Test 2: Run a Full SEO Audit

Our free SEO audit tool scans your SPA for rendering issues, missing meta tags, broken links, and accessibility problems. It shows you exactly what search engines see — no login required, results in 30 seconds.

Test 3: Check AI Crawler Access

Since AI crawlers don't render JavaScript, check whether they can access your content using the AI Crawler Inspector. It simulates how GPTBot, ClaudeBot, and other AI bots see your pages.
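You can approximate part of this check yourself by inspecting your robots.txt for AI-crawler rules. A sketch (the demo writes a local file; for a live site, fetch https://example.com/robots.txt instead, and note that bot names change over time):

```shell
# Check whether common AI crawlers have explicit rules in robots.txt.
# Demo file; for a real site: curl -s https://example.com/robots.txt -o robots.txt
cat > robots.txt <<'EOF'
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /api/
EOF

for bot in GPTBot ClaudeBot PerplexityBot; do
  if grep -qi "User-agent: $bot" robots.txt; then
    echo "$bot has explicit rules: review them"
  else
    echo "$bot falls under the wildcard rules"
  fi
done
```

In this sample, GPTBot is blocked from the whole site, which would keep your content out of ChatGPT's answers even with perfect SSR.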

SPA SEO in 2026: What Changed

The SPA SEO landscape in 2026 is materially different from even two years ago. Here's what matters.

React Server Components Are the New Default

With Next.js App Router, components render on the server by default. You explicitly opt into client-side rendering with 'use client'. This inverts the old model — instead of everything being CSR with SSR bolted on, everything is SSR with CSR added where needed. The SEO implications are profound: most of your application ships as pure HTML automatically.

Edge Rendering Is Production-Ready

Nuxt 3 with Nitro, Next.js Edge Runtime, and SvelteKit adapters now deploy SSR to CDN edge locations (Cloudflare Workers, Vercel Edge Functions). Your pages render at the edge node closest to the user — or crawler. Sub-50ms TTFB is achievable globally.

Angular Finally Has First-Class SSR

Angular 17+ built SSR into the CLI with ng new --ssr. Angular 19 added incremental hydration. The days of painful Angular Universal setup are over. If you're still running a CSR-only Angular app, the migration path is now straightforward.

AI Crawlers Are a Real SEO Channel

GPTBot, ClaudeBot, PerplexityBot, and others are crawling the web to build training data and power AI search. None of them execute JavaScript. Analysis of over half a billion GPTBot fetches found zero evidence of JS execution. If you want your content cited in AI-generated answers, SSR is the only option.

"Seeing rendered HTML, not just source code, is important in diagnosing if there's an indexing problem. Google can only discover your links if they are <a> HTML elements with an href attribute."

— Martin Splitt, Developer Advocate at Google (source)

Core Web Vitals Still Matter

LCP, INP, and CLS remain ranking signals. SPAs that ship large JavaScript bundles get penalized on LCP (page takes too long to show meaningful content) and INP (interactions feel sluggish). Code splitting, lazy loading, and minimal client-side JavaScript aren't optional — they're performance requirements that directly impact rankings.
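A sketch of route-level code splitting in Next.js (the component path is illustrative): the heavy, interaction-only widget loads after hydration, while the main content stays in the server-rendered HTML.

```typescript
// app/products/[id]/page.tsx — sketch: split a heavy widget out of the
// initial bundle. './ReviewsWidget' is an illustrative path.
import dynamic from 'next/dynamic'

const ReviewsWidget = dynamic(() => import('./ReviewsWidget'), {
  loading: () => <p>Loading reviews…</p>,
})

export default function ProductPage() {
  return (
    <>
      {/* Server-rendered, crawlable immediately */}
      <h1>Product name and description</h1>
      {/* Loads client-side after hydration, keeping the initial JS small */}
      <ReviewsWidget />
    </>
  )
}
```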

"Prerendering your site statically or on the server is the best practice for SEO — it makes crawlers willing to reindex your site more frequently."

— Vercel Engineering Blog (source)

Quick Framework Comparison

Not sure which framework to pick? Here's how they compare for SEO specifically.

| Feature | Next.js (React) | Nuxt 3 (Vue) | Angular SSR | SvelteKit |
| --- | --- | --- | --- | --- |
| SSR by default | Yes (App Router) | Yes | Yes (since v17) | Yes |
| Static generation | generateStaticParams | prerender: true | ng build --prerender | export const prerender = true |
| ISR support | Built-in (revalidate) | Built-in (isr: seconds) | Manual via service worker | Via adapter-specific config |
| Edge rendering | Vercel Edge, Cloudflare | Nitro (multi-platform) | Limited | Multiple adapters |
| Per-page metadata API | generateMetadata | useHead() | Meta + Title services | <svelte:head> |
| Client JS shipped | Moderate (RSC helps) | Moderate | Heavy | Minimal (compiled away) |
| Community/ecosystem | Largest | Strong | Enterprise-focused | Growing fast |

Frequently Asked Questions

Can Google crawl single page applications?

Yes, but with caveats. Google's Web Rendering Service uses a headless Chromium browser to execute JavaScript and render SPAs. However, this rendering happens in a separate queue after the initial crawl, which can delay indexing by hours to days. Pages with JavaScript errors may never render. For reliable indexing, use SSR or SSG — don't rely on Google's rendering queue for time-sensitive content.

Is a single page application bad for SEO?

A CSR-only SPA is bad for SEO. An SPA with server-side rendering is not. The framework doesn't matter — what matters is whether the initial HTML response contains your content. Next.js, Nuxt, Angular SSR, and SvelteKit all produce fully crawlable SPAs. The "SPA vs. SEO" problem was solved years ago; the issue is developers who don't implement the solution.

Do I need server-side rendering for a React app?

For any page that needs to rank in search? Yes. React alone (via Vite or Create React App) is CSR-only — search engines see an empty page. Next.js adds SSR, SSG, and ISR. With React Server Components in the App Router, most components render on the server by default. If your React app serves public-facing content, Next.js is the standard solution.

How do AI search engines handle SPAs?

They don't. GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot, and other AI crawlers do not execute JavaScript. They parse raw HTML only. Analysis of over half a billion GPTBot requests found zero evidence of JavaScript rendering. If your content is client-side rendered, it's invisible to AI search engines. SSR is the only way to appear in AI-generated answers and citations.

What's the difference between SSR, SSG, and ISR?

SSR (Server-Side Rendering) generates HTML on each request — best for dynamic content. SSG (Static Site Generation) generates HTML at build time — fastest but requires rebuilds for content changes. ISR (Incremental Static Regeneration) generates static pages that automatically regenerate on a schedule — combines SSG speed with content freshness. All three deliver full HTML to search engines on the first request. Choose based on how often your content changes.

Bottom Line

SPAs aren't inherently bad for SEO. But CSR-only SPAs are, and in 2026, they're invisible to AI search engines too.

The fix has been available for years. Next.js, Nuxt 3, Angular SSR, and SvelteKit all deliver server-rendered HTML out of the box. React Server Components made the default even better — most of your app renders on the server automatically. There's no excuse to ship a public-facing SPA without server rendering.

If you're unsure whether your SPA is crawlable, run a free audit — it takes 30 seconds and shows you exactly what Google sees. For AI crawler visibility, check the AI Crawler Inspector.


Discussion (3 comments)

Business Builder

5 months, 3 weeks

Love this deep dive on SPAs and client-side rendering — the React/Vue/Angular SEO pitfalls were explained really well! 🙌 I migrated a React SPA to hybrid SSR + prerendering and saw a crawl/index uptick in under a week. Please do a tutorial on hydration and sitemap strategies next 🙏

KeywordMaster

5 months, 3 weeks

Nice — glad it helped and awesome you saw a quick uptick! I did the same move from CRA SPA → hybrid SSR + prerendering last year and saw similar gains, fwiw.

A few practical tips from that migration that might help for the tutorial you asked for:
- Hydration pitfalls: mismatches usually come from non-deterministic things in render (Date.now(), Math.random(), generated IDs, or useEffect producing visible DOM changes). Fix by moving client-only stuff into useEffect or guarding it (if (typeof window === 'undefined') ...), or use deterministic id libs.
- Streaming/partial hydration: if you’re using React 18/Next, streaming SSR + selective client hydration (islands-ish or client boundary components) reduces TTI without sacrificing SEO — imo worth covering.
- Debugging: curl or fetch the page server-side and compare to what Chrome renders after hydration; React devtools console will show hydration mismatch warnings. Also check Search Console’s “Inspect URL” to see what Googlebot sees.
- Sitemap strategy: generate sitemaps at build for static routes, dynamically for API-driven content (rebuild or incremental), split into sitemap index if >50k URLs, include lastmod, and reference it from robots.txt. For multi-lingual sites include hreflang entries or separate sitemaps per locale.
- Tools I used: Next.js (SSR + static props), next-sitemap for generation, prerender.io for tricky bots, and Search Console + server logs to confirm indexing.

If you want, I can write that hydration + sitemap tutorial — what would you prefer: code-heavy step-by-step for Next.js, or framework-agnostic notes + examples? Any stack specifics (Next/Remix/Vite/Netlify) you’re on?

GrowthHacker23

5 months, 2 weeks

ngl SPAs can rank.
