JavaScript SEO Checklist with AI Agents
Essential checklist for ensuring JavaScript-rendered content is fully crawlable, indexable, and performant for search.
Overview
JavaScript-heavy sites (React, Vue, Angular, Next.js, Nuxt) face unique SEO challenges. Google can render JavaScript, but it does so in a separate rendering queue that adds delays and introduces failure points. This checklist ensures your JavaScript content is fully accessible to search engines.
Rendering Strategy
- Determine your rendering approach: server-side rendering (SSR), static site generation (SSG), or client-side rendering (CSR)
- If using CSR, evaluate whether critical SEO content can be moved to SSR or SSG
- Verify that SSR/SSG output includes all content visible to users (not just a loading shell)
- Test your rendering by viewing page source (not DevTools Elements) — this shows what Googlebot receives before rendering
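The page-source check above can be automated: fetch the raw HTML (what Googlebot receives before rendering) and test it for critical content. A minimal sketch; the heuristics and the 50-word threshold are illustrative, not a complete audit.

```javascript
// Sketch: check whether raw (pre-render) HTML already contains critical
// SEO content, or is just a client-side loading shell. Heuristics and
// the word-count threshold are illustrative.
function looksServerRendered(html) {
  const hasTitle = /<title>[^<]+<\/title>/i.test(html);
  const hasH1 = /<h1[^>]*>[^<]+<\/h1>/i.test(html);
  // Strip scripts and tags, then count the remaining visible words.
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, " ")
    .replace(/<[^>]+>/g, " ");
  const wordCount = text.split(/\s+/).filter(Boolean).length;
  return hasTitle && hasH1 && wordCount > 50;
}
```

Fetch a page with `curl -s https://example.com/` (or `fetch` in Node 18+) and pass the body to this function; an empty `<div id="root"></div>` shell fails the check.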
Content Accessibility
- Verify that title tags are present in the initial HTML response (not injected by JavaScript)
- Confirm meta descriptions are in the initial HTML
- Check that H1 and other heading tags are in the initial HTML
- Ensure canonical tags are server-rendered, not client-rendered
- Verify structured data (JSON-LD) is present in the initial HTML response
- Confirm that all internal links use standard `<a href>` tags, not JavaScript click handlers
- Check that all critical text content is in the DOM after rendering (not hidden in JavaScript state)
- Verify that lazy-loaded content uses Intersection Observer and renders into real DOM elements
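The internal-link check above can be scripted. A rough sketch that flags hrefs Googlebot cannot follow; regex parsing is a heuristic, and a real audit would use a proper HTML parser.

```javascript
// Sketch: flag links that Googlebot cannot follow. Only standard
// <a href="..."> elements with real URLs are crawlable; javascript:
// pseudo-links and fragment-only hrefs are not.
function findUncrawlableLinks(html) {
  const hrefs = [...html.matchAll(/<a\s[^>]*href=["']([^"']*)["']/gi)]
    .map((m) => m[1]);
  return hrefs.filter(
    (href) => href === "" || href === "#" || href.startsWith("javascript:")
  );
}
```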
Crawling and Indexing
- Ensure robots.txt does not block any JavaScript or CSS files needed for rendering
- Check that your JavaScript bundles are cacheable (Googlebot benefits from caching resources)
- Verify your site does not rely on user interactions (clicks, scrolls) to load content
- Confirm that error states (API failures) display meaningful content, not blank pages
- Test that your site works without cookies or local storage (Googlebot does not persist these)
- Verify client-side routing updates the URL and is crawlable (pushState with real href links)
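The routing bullet above can be checked mechanically: a route that lives in the URL fragment never reaches the server, so it cannot be server-rendered or crawled. A minimal sketch:

```javascript
// Sketch: classify URLs by whether their route is crawlable. Fragment
// routes (example.com/#/page) are invisible to crawlers because the
// fragment is never sent to the server.
function isCrawlableRoute(url) {
  const u = new URL(url);
  // A hash fragment that encodes a route means the server
  // cannot serve that content directly.
  return !u.hash.startsWith("#/");
}
```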
Performance
- Measure Total Blocking Time — long JavaScript execution delays rendering for Googlebot
- Check that JavaScript bundles are code-split and only load what is needed per page
- Verify tree shaking is removing unused code from production bundles
- Ensure polyfills are only served to browsers that need them
- Check that third-party scripts are not blocking critical rendering
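As a concrete reference for the code-splitting bullets, here is a hedged configuration sketch assuming a webpack build; the option names are webpack's, but the file paths are illustrative.

```javascript
// Hedged sketch of route-level code splitting, assuming a webpack build.
// splitChunks extracts shared vendor code; dynamic import() creates a
// separate chunk per route so each page loads only what it needs.
module.exports = {
  mode: "production", // enables tree shaking and minification
  optimization: {
    splitChunks: {
      chunks: "all", // split vendor and shared code out of page bundles
    },
  },
};

// In application code, a route loaded on demand (illustrative path):
// const ProductPage = () => import("./pages/ProductPage.js");
```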
Testing
- Use Google’s URL Inspection tool to see how Google renders each page template
- Compare the rendered HTML shown in the URL Inspection tool against what users see in the browser
- Test with JavaScript disabled to understand what content is visible without rendering
- Run Auditite’s JavaScript rendering audit across your site to find pages that fail to render
- Check Google Search Console’s Coverage report for “Discovered - currently not indexed” pages that may indicate rendering issues
- Monitor the time gap between discovery and indexing — JavaScript sites often have longer delays
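One way to approximate the raw-vs-rendered comparison above without a headless browser: diff the visible text of the two HTML snapshots. A rough sketch with a deliberately crude tokenizer; a low ratio suggests content depends on client-side rendering.

```javascript
// Sketch: estimate how much of the rendered page's text already exists
// in the raw HTML response. Returns a ratio between 0 and 1.
function textCoverage(rawHtml, renderedHtml) {
  const words = (html) =>
    new Set(
      html
        .replace(/<script[\s\S]*?<\/script>/gi, " ")
        .replace(/<[^>]+>/g, " ")
        .toLowerCase()
        .split(/\W+/)
        .filter((w) => w.length > 2)
    );
  const raw = words(rawHtml);
  const rendered = [...words(renderedHtml)];
  if (rendered.length === 0) return 1;
  const found = rendered.filter((w) => raw.has(w)).length;
  return found / rendered.length;
}
```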
Common JavaScript SEO Issues
Client-Side-Only Content
Content that requires JavaScript execution to appear in the DOM may not be indexed. Google’s rendering queue can be delayed by days or weeks. Always server-render critical SEO content.
Hash-Based Routing
URLs using # fragments (e.g., example.com/#/page) are not crawlable. Use HTML5 History API with real paths.
API-Dependent Content
If your content depends on API calls that fail or time out, Googlebot sees an empty page. Implement server-side data fetching with proper error fallbacks.
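A minimal sketch of the error-fallback idea, assuming a server-rendered page function; the page structure and names (`renderProductPage`, `fetchProduct`) are hypothetical.

```javascript
// Hedged sketch: wrap server-side data fetching so a failed API call
// still yields meaningful, indexable HTML instead of a blank shell.
// Function and field names are illustrative.
async function renderProductPage(fetchProduct) {
  try {
    const product = await fetchProduct();
    return `<h1>${product.name}</h1><p>${product.description}</p>`;
  } catch (err) {
    // Fallback: real content Googlebot can index, not an empty page.
    return `<h1>Product unavailable</h1><p>Please check back soon.</p>`;
  }
}
```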
Infinite Scroll Without Pagination
Googlebot cannot scroll. If content is loaded on scroll, provide paginated alternatives with crawlable links.
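A paginated fallback can be as simple as emitting real `<a href>` links alongside the scroll-loaded feed. A sketch; the `?page=` URL scheme is illustrative.

```javascript
// Sketch: generate crawlable pagination links as a fallback for
// infinite scroll. Googlebot follows <a href> links but never scrolls,
// so each page of results needs a real URL.
function paginationLinks(basePath, page, totalPages) {
  const links = [];
  if (page > 1) links.push(`<a href="${basePath}?page=${page - 1}">Previous</a>`);
  if (page < totalPages) links.push(`<a href="${basePath}?page=${page + 1}">Next</a>`);
  return links.join("\n");
}
```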
Dynamic Imports Without Preloading
Dynamically imported JavaScript modules may not load during Googlebot’s rendering window. Preload critical chunks using <link rel="modulepreload">.
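Generating the preload tags can be automated at build time. A sketch; in practice the chunk paths come from your bundler's manifest, and the paths here are illustrative.

```javascript
// Sketch: emit <link rel="modulepreload"> tags for critical chunks so
// they are fetched early, during Googlebot's rendering window.
function modulePreloadTags(chunkPaths) {
  return chunkPaths
    .map((p) => `<link rel="modulepreload" href="${p}">`)
    .join("\n");
}
```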
Related playbooks
Canonical URL Guide: Automated SEO Workflow
Master canonical tags to prevent duplicate content issues, consolidate link equity, and control which URLs appear in search.
Crawl Budget Optimization Playbook with Auditite
Maximize search engine crawl efficiency by directing crawl budget to your most valuable pages and reducing waste.
HTTPS Migration Checklist with Auditite
Complete checklist for migrating from HTTP to HTTPS without losing search rankings, traffic, or link equity. Step-by-step guidance.