
Robots.txt Generator

Generate a valid robots.txt file to control search engine crawling of your website.



robots.txt Preview
User-agent: *
Allow: /

Robots.txt Syntax Guide

User-agent:
Specifies the crawler. Use * for all bots.
Disallow:
Blocks a path from crawling. Disallow: / blocks the entire site.
Allow:
Overrides a Disallow for a specific sub-path (supported by Google, Bing).
Crawl-delay:
Seconds between requests (honored by Bing/Yandex, ignored by Google).
Sitemap:
Absolute URL of your XML sitemap. Multiple Sitemap lines are allowed.

Place the file at the root of your domain: https://example.com/robots.txt
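Putting the directives above together, a complete file might look like this (the paths and sitemap URL are illustrative):

```
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Crawl-delay: 10

User-agent: Googlebot
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```

Note that Googlebot follows its own group here and, as mentioned above, ignores Crawl-delay entirely.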

How to Use the Robots.txt Generator

  1. Choose a quick preset or start from scratch by adding user-agent groups.

  2. Select a user-agent (*, Googlebot, Bingbot, or custom) for each group.

  3. Add Allow and Disallow paths using the + buttons.

  4. Optionally set a Crawl-delay and add Sitemap URLs.

  5. Review the live preview, then copy to clipboard or download the file.
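The assembly the steps above describe is straightforward to express in code. Here is a minimal sketch (not the tool's actual implementation) that builds a robots.txt string from user-agent groups and sitemap URLs:

```python
# Illustrative sketch: assemble a robots.txt from user-agent groups.
# Each group is a dict with 'user_agent', optional 'allow'/'disallow'
# path lists, and an optional 'crawl_delay' in seconds.

def build_robots_txt(groups, sitemaps=()):
    blocks = []
    for g in groups:
        lines = [f"User-agent: {g['user_agent']}"]
        for path in g.get("allow", []):
            lines.append(f"Allow: {path}")
        for path in g.get("disallow", []):
            lines.append(f"Disallow: {path}")
        if "crawl_delay" in g:
            lines.append(f"Crawl-delay: {g['crawl_delay']}")
        blocks.append("\n".join(lines))
    # Sitemap lines are global, not tied to any user-agent group.
    for url in sitemaps:
        blocks.append(f"Sitemap: {url}")
    return "\n\n".join(blocks) + "\n"

print(build_robots_txt(
    [{"user_agent": "*", "allow": ["/admin/public/"], "disallow": ["/admin/"]}],
    sitemaps=["https://example.com/sitemap.xml"],
))
```

Groups are separated by blank lines, and Sitemap directives go at the end because they apply to the whole file rather than to one group.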

About the Robots.txt Generator

Create a production-ready robots.txt file with an intuitive visual editor. Add multiple user-agent groups with Allow/Disallow rules, set crawl-delay, and specify sitemap URLs.

Choose from quick presets — Allow All, Block All, Block AI Bots (GPTBot, ChatGPT-User, CCBot, Google-Extended), Standard, or WordPress Default — or build custom rules from scratch. The live preview updates in real time as you configure rules.

Copy the output to your clipboard or download it as a robots.txt file. Built-in validation warns you about conflicting rules, and the syntax guide explains every directive.
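Beyond the tool's built-in validation, you can sanity-check a generated file yourself with Python's standard-library parser, which applies the rules the way a simple crawler would (note it matches rules in file order, unlike Google's longest-match behavior, so Allow lines should precede the Disallow they override):

```python
# Check a generated robots.txt with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Blocked path is refused; the Allow override and untouched paths pass.
print(parser.can_fetch("*", "https://example.com/admin/secret"))
print(parser.can_fetch("*", "https://example.com/admin/public/page"))
print(parser.can_fetch("*", "https://example.com/blog/post"))
```

This catches the most common mistake, a Disallow that accidentally covers pages you meant to keep crawlable, before the file ever reaches production.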

Frequently Asked Questions about the Robots.txt Generator
