Enter a website URL to fetch and analyze its robots.txt file.
Robots.txt is a plain text file placed in your website's root directory that tells search engine crawlers which pages or files they may or may not request from your site. It is a fundamental part of technical SEO and helps you control how search engines interact with your website's content.
User-agent: *
  Applies to all search engine crawlers

Disallow: /private/
  Blocks crawlers from the /private/ directory

Allow: /public/
  Explicitly allows crawling of /public/

Sitemap: https://yoursite.com/sitemap.xml
  Tells search engines where your sitemap is located

Crawl-delay: 1
  Sets a 1-second delay between crawler requests
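As a rough sketch of how these directives are interpreted, Python's standard-library `urllib.robotparser` can parse rules like the ones above and answer whether a given URL may be fetched. The domain `yoursite.com` and the page paths below are placeholders, not real endpoints:

```python
from urllib.robotparser import RobotFileParser

# Parse an inline robots.txt using the example directives above.
# (yoursite.com and the paths are placeholder values.)
rules = """\
User-agent: *
Disallow: /private/
Allow: /public/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# /private/ is disallowed for all crawlers ("*")
print(rp.can_fetch("*", "https://yoursite.com/private/page.html"))  # False

# /public/ is explicitly allowed
print(rp.can_fetch("*", "https://yoursite.com/public/page.html"))   # True
```

In a real checker you would call `rp.set_url("https://yoursite.com/robots.txt")` followed by `rp.read()` to fetch the live file instead of parsing an inline string.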