How to create robots.txt
Guides titled "how to create robots.txt" usually enumerate syntax, but teams still get stuck translating policy into URL prefixes. Use the form below as a worksheet: pick a User-agent, list Allow exceptions first if you need them, then stack Disallow prefixes from broad to narrow, append Sitemap URLs that already return 200, and download the file only after a second reviewer reads the diff.
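The ordering described above can be sketched as a minimal file; the paths and hostname here are placeholders, not recommendations:

```
User-agent: *
Allow: /downloads/public/
Disallow: /downloads/
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
```

Under longest-match precedence, the Allow line wins for anything beneath /downloads/public/ because it is the more specific prefix, while the rest of /downloads/ stays blocked.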
Validation belongs off-tool: fetch the deployed file over HTTPS, confirm it is served as plain UTF-8 text, and run your search engine's robots tester against representative URLs, including mobile subdomains if they exist. If the tester reports conflicts, remember that the most specific (longest) matching rule wins, so resolve them by lengthening or shortening prefixes rather than duplicating contradictory lines.
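A quick offline smoke test of representative URLs is possible with Python's standard library; the sample rules and example.com URLs below are hypothetical, and in practice you would fetch the deployed file over HTTPS and decode it as UTF-8 before parsing:

```python
import urllib.robotparser

# Sample rules; in practice, download the live file and decode it:
#   body = urllib.request.urlopen(url).read().decode("utf-8")
SAMPLE = """\
User-agent: *
Allow: /blog/
Disallow: /
Sitemap: https://example.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(SAMPLE.splitlines())

# Check representative URLs. Caveat: the stdlib parser applies rules in
# file order rather than Google's longest-match precedence, so treat
# this as a smoke test and let your engine's robots tester be the
# authoritative verdict on conflicts.
for path in ("/blog/post-1", "/admin/"):
    print(path, rp.can_fetch("*", f"https://example.com{path}"))
```

Because the parser matches in file order, listing the Allow exception before the broad Disallow keeps this check consistent with the worksheet's "exceptions first" ordering.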
The generator's default robots.txt output:
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml