Create a standardized file to help AI models like ChatGPT and Claude better understand and cite your documentation.
Trusted by 1,000+ marketing teams
Optimizing for AI agents is the new SEO. Make your content machine-readable by default.
Provide a single, structured source of truth that all AI models can easily parse and understand.
Improve how AI assistants reference your documentation by giving them clear citation paths.
Direct AI agents to your most important content and away from irrelevant pages.
Get your site ready for the next generation of AI search engines and agents.
Add your project name, summary, and links to your core documentation pages.
See exactly how your llms.txt file looks and edit it in real-time.
Save the file and upload it to the root directory of your website.
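Following those three steps produces a file like the sketch below. The project name "Acme" and the example.com URLs are placeholders; the layout follows the llms.txt convention of an H1 title, a blockquote summary, and H2 sections containing annotated link lists.

```markdown
# Acme

> Acme is a hypothetical analytics platform. This one-paragraph summary
> gives AI models a quick overview of what the project does.

## Documentation

- [Getting Started](https://example.com/docs/start): Installation and first steps
- [API Reference](https://example.com/docs/api): Endpoints and authentication

## Optional

- [Changelog](https://example.com/changelog): Release history
```

Once saved as llms.txt and uploaded to your site's root, it becomes reachable at https://yourdomain.com/llms.txt.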
Ecosystem
Supercharge your AI visibility with our suite of free developer tools.
Check your brand's ranking across ChatGPT, Claude, and Perplexity with a free comprehensive report.
Validate your robots.txt to ensure AI crawlers can properly index your public content.
We're building more tools to help you win in the age of AI search.
An llms.txt file is a markdown file that provides a concise, structured overview of your project for Large Language Models (LLMs). It helps AI assistants better understand your content, structure, and key resources.
You should place the llms.txt file in the root directory of your website (e.g., https://yourdomain.com/llms.txt), similar to a robots.txt file.
It uses standard Markdown format. The generated file follows best practices for readability by AI models, using clear headers and lists for documentation links.
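That structure is simple enough to sanity-check programmatically. The sketch below is a minimal, illustrative validator (not part of any official tooling) that checks a string for the three conventions mentioned above: an H1 title, a blockquote summary, and markdown link-list entries.

```python
import re

def check_llms_txt(text: str) -> list[str]:
    """Return a list of structural problems found in an llms.txt string.

    A minimal check based on the llms.txt convention: an H1 title on the
    first line, a blockquote summary, and markdown link-list entries.
    """
    problems = []
    lines = text.splitlines()
    # The file should open with an H1 project title.
    if not lines or not lines[0].startswith("# "):
        problems.append("missing H1 project title on the first line")
    # A blockquote line holds the short project summary.
    if not any(line.startswith("> ") for line in lines):
        problems.append("missing blockquote summary")
    # Documentation links appear as markdown list items: - [name](url)
    link = re.compile(r"^- \[[^\]]+\]\([^)]+\)")
    if not any(link.match(line) for line in lines):
        problems.append("no markdown link-list entries found")
    return problems

sample = "# Acme\n\n> A short summary.\n\n## Docs\n\n- [Guide](https://example.com/guide)\n"
print(check_llms_txt(sample))  # an empty list means the basic structure is present
```

An empty result means the basics are in place; anything the generator above produces should pass this check.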
No, robots.txt controls crawling permissions. The llms.txt file is complementary—it provides context and guidance for AI models that *do* crawl your site, helping them give better answers about your content.
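To make the division of labor concrete, here is a sketch of a robots.txt that permits well-known AI crawlers, alongside an llms.txt served separately at the site root. The user-agent tokens shown (GPTBot, ClaudeBot, PerplexityBot) are the names these vendors have published, but crawler names change, so verify them against each vendor's current documentation.

```
# robots.txt — controls which crawlers may fetch pages.
# Context for those crawlers lives separately at /llms.txt.

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: *
Allow: /
```

robots.txt answers "may you crawl this?"; llms.txt answers "what is this site about, and where are the important pages?"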