Robots.txt Generator

Create a properly formatted robots.txt file for your website and control which parts of your site search engine crawlers may request. Note that robots.txt governs crawling, not indexing: a blocked page can still appear in search results if other sites link to it.


What Is a Robots.txt File?

A robots.txt file tells search engine crawlers which pages or sections of your site they may or may not request. It is placed at the root of your website (e.g., https://example.com/robots.txt) and is one of the first files crawlers check before fetching your content. Compliance is voluntary: well-behaved crawlers honor the rules, but robots.txt is not an access-control mechanism and will not stop a misbehaving bot.
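For instance, the simplest valid robots.txt permits every crawler to fetch everything (an empty Disallow value means "nothing is blocked"):

```
# Served from https://example.com/robots.txt
User-agent: *
Disallow:
```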

Key Directives

User-agent: Specifies which crawler the rules apply to. Use * for all crawlers or specify individual bots like Googlebot.

Disallow: Tells crawlers not to access certain paths. Disallow: /admin/ blocks the /admin/ directory.

Allow: Overrides a Disallow rule for specific paths within a blocked directory.

Sitemap: Points crawlers to your XML sitemap for better discovery of your pages.
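Putting the four directives together, a file like the following (the paths and sitemap URL are illustrative) blocks /admin/ for all crawlers while still allowing one page inside it, and adds a stricter rule for Googlebot alone:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/help.html

# Additional rule applied only to Googlebot
User-agent: Googlebot
Disallow: /drafts/

Sitemap: https://example.com/sitemap.xml
```

Each User-agent group stands on its own: a crawler uses the most specific group that matches its name, so Googlebot follows only the Googlebot block above.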

Blocking AI Crawlers

Many website owners now block AI training crawlers like GPTBot (OpenAI), CCBot (Common Crawl), and others. Use the "Block AI Crawlers" preset to add these rules automatically.
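The preset produces rules along these lines (the exact list of bots it includes may differ); Disallow: / blocks the named crawler from the entire site:

```
# Block common AI training crawlers site-wide
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```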