Create a standard llms.txt file that tells AI crawlers how to scan your site. The AI version of robots.txt.
# llms.txt - AI Crawler Instructions
# Generated by Geo Builder
# Last Updated: 2026-01-17

# Site Information
Name: My Website
Description: Website description
Language: en
Update-Frequency: weekly

# Allowed Paths
Allow: /products
Allow: /categories
Allow: /about

# Disallowed Paths
Disallow: /admin
Disallow: /checkout
Disallow: /account

# Priority Content
# These pages should be indexed first
Priority: /

# Additional Information
# This file follows the llms.txt specification
# Learn more: https://llmstxt.org/
llms.txt is a standard file format that tells AI crawlers (ChatGPT, Claude, Gemini, etc.) how to scan your site and which information to prioritize.
Traditional robots.txt is for search engines. llms.txt is for AI systems.
Determines which pages and content AI systems should read first.
You can define private areas that AI should not access.
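The Allow/Disallow and Priority rules above can also be applied programmatically. The following is a minimal Python sketch, not part of any official llms.txt tooling, that shows one way a crawler might decide whether a path is accessible under these rules (the rule set and the longest-prefix-wins behavior are assumptions for illustration):

```python
# Illustrative matcher for llms.txt-style Allow/Disallow rules.
# Rule set mirrors the sample file above; the matching strategy
# (most specific prefix wins, default allow) is an assumption.
RULES = [
    ("Allow", "/products"),
    ("Allow", "/categories"),
    ("Allow", "/about"),
    ("Disallow", "/admin"),
    ("Disallow", "/checkout"),
    ("Disallow", "/account"),
]

def is_allowed(path: str) -> bool:
    """Return True unless the most specific matching rule is a Disallow."""
    best = None  # (prefix length, directive) of the longest matching prefix
    for directive, prefix in RULES:
        if path.startswith(prefix):
            if best is None or len(prefix) > best[0]:
                best = (len(prefix), directive)
    # No rule matched: default to allowed.
    return best is None or best[1] == "Allow"

print(is_allowed("/products/shoes"))  # True
print(is_allowed("/admin/users"))     # False
```

A path with no matching rule (for example `/blog`) falls through to the default and is treated as allowed, matching how robots.txt-style matchers typically behave.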
Fill out the form with your site information
Specify allowed and disallowed paths
Add your priority content
Copy the generated code
Save as llms.txt in your site's root directory
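The steps above can be sketched as a small script. This is a hypothetical Python example, not Geo Builder's implementation; the site details are placeholders and the output path assumes you run it from your site's root directory:

```python
from datetime import date

def build_llms_txt(name, description, allow, disallow, priority):
    """Assemble an llms.txt file in the format shown in the sample above."""
    lines = [
        "# llms.txt - AI Crawler Instructions",
        f"# Last Updated: {date.today().isoformat()}",
        "",
        "# Site Information",
        f"Name: {name}",
        f"Description: {description}",
        "Language: en",
        "Update-Frequency: weekly",
        "",
        "# Allowed Paths",
        *[f"Allow: {p}" for p in allow],
        "",
        "# Disallowed Paths",
        *[f"Disallow: {p}" for p in disallow],
        "",
        "# Priority Content",
        *[f"Priority: {p}" for p in priority],
    ]
    return "\n".join(lines) + "\n"

text = build_llms_txt(
    "My Website", "Website description",
    allow=["/products", "/categories", "/about"],
    disallow=["/admin", "/checkout", "/account"],
    priority=["/"],
)

# Save as llms.txt in the site's root directory
# (relative path assumes the script runs from that directory).
with open("llms.txt", "w") as f:
    f.write(text)
```

Once deployed, the file should be reachable at `https://yourdomain.com/llms.txt` (domain is a placeholder), the same convention robots.txt uses.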
Keep your llms.txt file up to date with Geo Builder: the file updates automatically as you add products.
Sign Up for Early Access