G7 — Chapter 7 - Technical SEO

Criterion G7: robots.txt File — guide + checklist

This is typically the kind of detail that prevents conflicting signals.

Criterion G7 — robots.txt File is part of our SEO checklist (335 criteria). Below is a practical method to check and fix it, with a concrete example.

What exactly this criterion covers

G7 — robots.txt File (Chapter 7 - Technical SEO): Allow/Disallow rules configured correctly, CSS/JS not blocked, and a sitemap declared.
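
One way to meet the three requirements above is a minimal robots.txt along these lines (domain and paths are illustrative placeholders):

    # Keep genuinely private paths out of the crawl
    User-agent: *
    Disallow: /admin/
    Allow: /

    # Do not Disallow CSS or JS paths: Googlebot needs them to render pages

    # Declare the sitemap
    Sitemap: https://www.example.com/sitemap.xml

Keep in mind that robots.txt controls crawling, not indexing: a blocked URL can still end up indexed if other pages link to it.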

Why it matters (SEO + UX)

robots.txt decides what crawlers may fetch. When it is misconfigured, we often observe: entire sections accidentally blocked, CSS/JS disallowed so Google cannot render pages correctly, or a missing Sitemap declaration that slows discovery and indexing.

On sites that generate pages at scale, this criterion also acts as a safeguard: one stable rule prevents 1,000 errors at once.

How to check (step by step)

Approach: tool-assisted test (validator / crawl). Recommended tool: Lighthouse (its SEO audit flags an invalid robots.txt).

  1. Open https://yourdomain.com/robots.txt in the browser and confirm it returns HTTP 200 with the expected rules.
  2. Check that no Disallow rule covers CSS/JS paths and that a Sitemap: line is present.
  3. Check whether the problem repeats across templates (a crawler such as Screaming Frog helps here).

Tip: first isolate 10 representative URLs (top pages + generated pages) before scaling the fix.
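
To test those representative URLs programmatically, here is a minimal sketch using Python's standard-library urllib.robotparser (the domain and URL list are illustrative placeholders):

    # Check which representative URLs the live robots.txt allows Googlebot to fetch.
    from urllib.robotparser import RobotFileParser

    ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder domain

    # 10 representative URLs: top pages + generated pages (shortened here)
    urls = [
        "https://www.example.com/",
        "https://www.example.com/assets/app.js",
        "https://www.example.com/assets/style.css",
        "https://www.example.com/blog/some-generated-page/",
    ]

    parser = RobotFileParser()
    parser.set_url(ROBOTS_URL)
    parser.read()  # fetches and parses the live file

    for url in urls:
        status = "OK" if parser.can_fetch("Googlebot", url) else "BLOCKED"
        print(f"{status:8} {url}")

A BLOCKED line on a CSS/JS asset or an important page is exactly the kind of conflicting signal this criterion is meant to catch.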

How to fix properly

Strategy: repair, re-crawl, and monitor in Search Console.

  • Fix the rules at the source: unblock CSS/JS, keep only genuinely private paths in Disallow, and declare the sitemap.
  • Retest, then fix at the rule level (not page by page).
  • Add a safeguard: an automated robots.txt check in CI if possible (see the sketch below).

Next, re-crawl 50–200 URLs and monitor Search Console for 7–14 days (impressions/CTR/indexing).
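
As a CI safeguard, here is a minimal sketch (assuming the requests package; the domain and asset URLs are illustrative placeholders) that fails the build when robots.txt blocks CSS/JS or omits the sitemap:

    # Fail CI if robots.txt blocks rendering assets or declares no sitemap.
    import requests
    from urllib.robotparser import RobotFileParser

    ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder

    body = requests.get(ROBOTS_URL, timeout=10).text
    lines = body.splitlines()

    # 1. A sitemap must be declared.
    assert any(line.strip().lower().startswith("sitemap:") for line in lines), \
        "robots.txt declares no sitemap"

    # 2. CSS/JS must stay crawlable so Google can render pages.
    parser = RobotFileParser()
    parser.parse(lines)
    for asset in ("https://www.example.com/assets/app.js",
                  "https://www.example.com/assets/style.css"):
        assert parser.can_fetch("Googlebot", asset), f"robots.txt blocks {asset}"

    print("robots.txt checks passed")

Run it before each production import so a bad rule never ships to the live file.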

Concrete example (illustrative)

  • Context: a training page for an auto garage in Marseille
  • Before: a leftover staging rule (Disallow: /) plus blocked /assets/ CSS/JS; no sitemap declared.
  • After: clean Allow/Disallow rules, CSS/JS crawlable, Sitemap: line added.
  • Note: the goal is a page Google can crawl, render, and index without conflicting signals.

Checklist to tick

  • [ ] Measure before/after
  • [ ] Allow/Disallow rules correct (nothing important blocked)
  • [ ] Fix applied at the template/rule level
  • [ ] CSS/JS crawlable
  • [ ] Sitemap declared in robots.txt

Frequently asked questions — G7

What is the most common mistake on “robots.txt File”?

Fixing an isolated page without fixing the template/import: the error returns at the next generation.

Which tool is the fastest for large-scale control?

For this type of criterion, a crawl (e.g. Screaming Frog) plus a targeted check in Lighthouse is generally the fastest combo.

How to prevent this from happening on 10K generated pages?

Freeze the auto-generation rules (title/structure/schema/URLs) and add an automated control (crawl or CI test, like the sketch above) before each production import.

Ready to go from theory to action?

Validate this criterion with an audit, then go deeper into the method in the Academy.

Audit with the tool → Learn in the Academy →