Generate a robots.txt file to guide search engine crawlers.
robots.txt tells crawlers which parts of your site they may access. It’s a crawling directive, not a security feature — disallowed paths are still reachable by anyone who requests them directly.
Disallow sensitive routes (like admin areas) and optionally include your sitemap URL.
1. Choose whether to allow crawling.
2. Add disallow paths and an optional sitemap URL.
3. Copy the generated robots.txt content.
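A generated file might look like the sketch below. The `/admin/` path and the sitemap URL are placeholders — substitute your own routes and domain:

```text
# Apply these rules to all crawlers
User-agent: *
# Block the (hypothetical) admin area from crawling
Disallow: /admin/

# Optional: point crawlers at your sitemap (placeholder URL)
Sitemap: https://example.com/sitemap.xml
```

Serve the file at the root of your site (e.g. `https://example.com/robots.txt`); crawlers only look for it there.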