Technical SEO Audit Checklist: Everything You Actually Need to Check
I've run hundreds of technical SEO audits over the years. And here's the thing nobody tells you: most audit templates are complete overkill. They list 200+ checkpoints, and teams get paralyzed trying to fix everything at once.
The reality? About 20 checks catch 95% of actual problems. The rest is noise that makes consultants look thorough. So let's cut through the fluff and focus on what actually moves the needle.
Crawlability: Can Google Actually Find Your Stuff?
This is where you start. Always. If Google can't crawl your pages, nothing else matters.
- Check robots.txt - Visit yourdomain.com/robots.txt. You'd be shocked how often someone blocks the entire site by accident. Look for `Disallow: /`, which blocks everything. Also check for blocked CSS/JS files that prevent rendering.
- XML Sitemap - Does it exist? Is it submitted in Search Console? Does it actually contain your important pages? I've seen sitemaps with 50,000 URLs where only 2,000 were indexed because the rest were garbage.
- Crawl errors in Search Console - The Coverage report tells you exactly what Google can't access. Fix 5xx errors first, then 4xx, then soft 404s.
- Internal link depth - Can you reach every important page in 3 clicks from the homepage? If users need 7 clicks, Google probably won't bother crawling that deep.
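The robots.txt check is easy to automate. Here's a minimal sketch using Python's standard-library `robotparser` — the domain and paths are placeholders, so point it at your own important URLs:

```python
from urllib import robotparser

def check_robots(robots_txt: str, paths: list[str], agent: str = "Googlebot") -> dict[str, bool]:
    """Parse a robots.txt body and report whether key paths are crawlable."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    # can_fetch returns False for any path a matching Disallow rule blocks
    return {p: rp.can_fetch(agent, f"https://example.com{p}") for p in paths}

# The catastrophic case: this robots.txt blocks the entire site for every crawler.
bad = "User-agent: *\nDisallow: /"
print(check_robots(bad, ["/", "/products/widget"]))
```

Run this against your live robots.txt for every template you care about (home, category, product, article) and you'll catch the "accidentally blocked everything" disaster in seconds.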
Indexability: Will Google Keep Your Pages?
Crawling and indexing are different. Google might crawl a page and decide it's not worth keeping. Here's what to check:
- Meta robots tags - Search your codebase for `noindex`. You might find it on pages you definitely want indexed. Staging site code making it to production is the usual culprit.
- Canonical tags - Every page should have one. Self-referencing canonicals are fine. But watch for pages canonicalizing to the wrong URL or missing canonicals entirely.
- Duplicate content - Run a crawl with Screaming Frog or Sitebulb. Look for pages with identical or near-identical title tags and content. Parameter URLs are often the problem.
- Thin content - Pages with less than 300 words of unique content rarely rank. Filter your crawl by word count and either beef up thin pages or noindex them.
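Hunting for stray `noindex` tags and canonical problems scales better with a script than with view-source. A minimal sketch using Python's built-in `html.parser` — the sample HTML is invented for illustration:

```python
from html.parser import HTMLParser

class IndexabilityChecker(HTMLParser):
    """Pulls <meta name="robots"> and <link rel="canonical"> out of a page."""
    def __init__(self):
        super().__init__()
        self.robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attribute names arrive lowercased
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "")
        elif tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

html_doc = """<html><head>
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://example.com/page">
</head><body></body></html>"""

checker = IndexabilityChecker()
checker.feed(html_doc)
print(checker.robots)     # surfaces the accidental noindex
print(checker.canonical)
```

Feed it the HTML of each page type from your crawl export and flag anything where `robots` contains `noindex` or `canonical` doesn't match the page's own URL.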
Site Architecture: The Foundation
Poor architecture kills SEO potential. I've seen sites with great content that can't rank because their structure is a mess.
- URL structure - Clean, descriptive URLs. No `?id=12345&cat=7` garbage. Use hyphens, not underscores. Keep them short but meaningful.
- HTTPS everywhere - Run your whole site through a redirect checker. Mixed content warnings kill trust signals. Every HTTP resource should redirect to HTTPS.
- Mobile-first indexing - Google crawls mobile first. Test your mobile experience. If content is hidden behind tabs or accordions on mobile but visible on desktop, Google might not see it.
- Redirect chains - A redirects to B redirects to C redirects to D. Each hop loses link equity and slows things down. Flatten chains to single redirects.
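If your crawler exports redirects as from-URL → to-URL pairs, flattening chains is simple chain-following. A Python sketch (the URL map is invented for the example):

```python
def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Map each source URL straight to its final destination, collapsing chains.
    `redirects` maps from-URL -> to-URL, as exported from a crawler."""
    flat = {}
    for src in redirects:
        seen = {src}
        dest = redirects[src]
        while dest in redirects:      # follow the chain hop by hop
            if dest in seen:          # guard against A -> B -> A loops
                raise ValueError(f"redirect loop at {dest}")
            seen.add(dest)
            dest = redirects[dest]
        flat[src] = dest
    return flat

chain = {"/a": "/b", "/b": "/c", "/c": "/d"}
print(flatten_redirects(chain))  # every source now points straight at /d
```

The output is exactly the redirect map you want in production: one hop from every legacy URL to the final destination.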
Page Speed: The Silent Killer
Speed isn't just a ranking factor. Slow sites have higher bounce rates, which tanks your engagement signals.
- Core Web Vitals - Check Search Console's Core Web Vitals report. Focus on LCP (loading), INP (interactivity, which replaced FID in 2024), and CLS (visual stability). You don't need perfect scores, just "good" thresholds.
- Image optimization - The biggest quick win. Convert to WebP, add width/height attributes, lazy load below-fold images.
- Render-blocking resources - CSS and JS that block the page from displaying. Defer non-critical scripts, inline critical CSS.
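Most of the image and script wins above come down to a few attributes. A sketch in HTML — file names are placeholders, and note that you should only lazy-load below-the-fold images, never your LCP (hero) element:

```html
<!-- width/height reserve layout space so the page doesn't shift (helps CLS);
     loading="lazy" defers below-the-fold images until they're near the viewport -->
<img src="/img/product-grid.webp" width="400" height="300"
     alt="Product grid" loading="lazy">

<!-- defer downloads the script in parallel but runs it after HTML parsing,
     so it no longer blocks first render -->
<script src="/js/analytics.js" defer></script>
```

Applying just these two patterns sitewide often moves LCP and CLS from "needs improvement" into the "good" bucket without touching your backend.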
Structured Data: Speaking Google's Language
Schema markup isn't required, but it gives you a real edge for rich snippets.
- Test existing markup - Use Google's Rich Results Test on your key page types. Errors here mean you're not getting those fancy search features.
- Implement the basics - Organization, BreadcrumbList, Article/Product/LocalBusiness depending on your site type. Don't go overboard with obscure schema types.
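Here's what the basics look like in practice: a minimal JSON-LD block for an Article page. All values are placeholders — swap in your real data and validate it with the Rich Results Test:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Audit Checklist",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-01-15"
}
</script>
```

JSON-LD in a `<script>` tag is the format Google recommends, and it's far easier to template and maintain than microdata scattered through your HTML.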
The Audit Process That Actually Works
Here's how I run audits that get results:
Week 1: Run a full crawl. Export everything. Build a prioritized list of issues sorted by impact and effort.
Week 2: Fix critical crawlability issues. These block everything else.
Week 3-4: Address indexability problems. Clean up duplicates, fix canonicals, improve thin content.
Ongoing: Monitor Search Console weekly. New issues pop up constantly, especially on active sites.
Tools You Actually Need
Don't buy 15 SEO tools. You need:
- Screaming Frog - The industry standard crawler. Free up to 500 URLs.
- Google Search Console - Free and essential. This is Google telling you what's wrong.
- PageSpeed Insights - Free performance testing with real user data.
That's it. Anything else is nice-to-have.
Common Mistakes I See Constantly
After years of audits, these problems show up again and again:
- Blocking JavaScript in robots.txt (Google can't render your pages)
- Having both www and non-www versions accessible (duplicate content nightmare)
- Pagination without proper canonicals (and don't rely on rel=prev/next, which Google stopped using years ago)
- Faceted navigation creating millions of indexable parameter URLs
- Soft 404s returning 200 status codes
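Soft 404s in particular are easy to flag from a crawl export. A rough heuristic sketch in Python — the marker phrases are assumptions you'd tune to your own error template:

```python
def looks_like_soft_404(status_code: int, body: str) -> bool:
    """Heuristic: a page that says 'not found' but still returns HTTP 200.
    The marker list is illustrative; adjust it to your site's error copy."""
    if status_code != 200:
        return False  # a real 404/410 is the correct behavior, not a soft 404
    markers = ("page not found", "no longer available", "404")
    text = body.lower()
    return any(m in text for m in markers)

print(looks_like_soft_404(200, "<h1>Page Not Found</h1>"))  # True: soft 404
print(looks_like_soft_404(404, "<h1>Page Not Found</h1>"))  # False: proper 404
```

Run this over every 200-status page in your crawl and you'll surface the error pages that should be returning a real 404 or 410.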
Run through this checklist quarterly at minimum. Monthly if you're making frequent site changes. Technical SEO isn't a one-time thing. It's maintenance, like changing the oil in your car. Skip it and things break down.
The sites that dominate search results aren't always the ones with the best content. They're the ones with solid technical foundations that let Google actually find, crawl, and understand that content.