The standard for controlling AI model access to your website
llms.txt is a proposed standard for providing information that helps Large Language Models (LLMs) use a website at inference time. Where robots.txt tells web crawlers which paths they may access, llms.txt points AI models to the content you want them to use and gives them context for understanding it.
Place the file at /llms.txt at the root of your domain. It complements robots.txt by providing context for allowed content. According to the official specification, an llms.txt file should contain:
```markdown
# My Project

> Brief description of the project and its purpose.

Additional information about the project.

## Documentation

- [Getting Started](https://example.com/docs/getting-started.md): Introduction guide
- [API Reference](https://example.com/docs/api.md): Complete API documentation

## Examples

- [Basic Example](https://example.com/examples/basic.md)
- [Advanced Example](https://example.com/examples/advanced.md)

## Optional

- [Legacy Docs](https://example.com/docs/legacy.md): Older documentation
```
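The structure above is simple enough to process programmatically. As a minimal sketch (not the official reference parser), the following Python function splits a file like the example into its title, summary, and named link sections; the dictionary layout and the `parse_llms_txt` name are assumptions for illustration.

```python
import re

def parse_llms_txt(text: str) -> dict:
    """Parse an llms.txt file into its title, summary, and link sections.

    A minimal sketch based on the structure shown above; the actual
    specification allows more variation than this handles.
    """
    result = {"title": None, "summary": None, "sections": {}}
    current = None  # name of the "## Section" currently being filled
    for raw in text.splitlines():
        line = raw.strip()
        if line.startswith("# ") and result["title"] is None:
            result["title"] = line[2:]           # H1 project title
        elif line.startswith("> ") and result["summary"] is None:
            result["summary"] = line[2:]         # blockquote summary
        elif line.startswith("## "):
            current = line[3:]                   # start a new link section
            result["sections"][current] = []
        elif line.startswith("- ") and current is not None:
            # "- [Title](url)" with an optional ": description" suffix
            m = re.match(r"- \[(.+?)\]\((.+?)\)(?::\s*(.*))?$", line)
            if m:
                title, url, desc = m.groups()
                result["sections"][current].append(
                    {"title": title, "url": url, "description": desc}
                )
    return result
```

Running it on the example file yields `"My Project"` as the title and a `"Documentation"` section with two described links.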
Adding an llms.txt file lets you:

- Help AI models understand your content structure and find relevant information more efficiently.
- Guide which content AI models should prioritize and how they should access it.
- Improve how search engines and AI tools discover and understand your content.
- Prepare your website for the growing importance of AI model interactions.
Ready to create your own llms.txt file? Use our AI Audit tool to check existing files or validate your own implementation.
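Before running a full audit, a quick structural lint can catch obvious omissions. The sketch below checks only for the elements described earlier in this page (an H1 title, a blockquote summary, and at least one `## Section`); it is a rough illustration, not the official validator, and the `lint_llms_txt` name is an assumption.

```python
def lint_llms_txt(text: str) -> list:
    """Return a list of structural problems found in a candidate llms.txt.

    A rough lint sketch based on the layout described above,
    not an implementation of the official specification.
    """
    problems = []
    # Ignore blank lines; the checks below look only at structure.
    lines = [l for l in text.splitlines() if l.strip()]
    if not lines or not lines[0].startswith("# "):
        problems.append("file should start with an H1 title ('# Project Name')")
    if not any(l.startswith("> ") for l in lines):
        problems.append("missing blockquote summary ('> ...')")
    if not any(l.startswith("## ") for l in lines):
        problems.append("no '## Section' link lists found")
    return problems
```

An empty result means the file has the basic shape; it says nothing about whether the linked URLs resolve or whether the descriptions are useful.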
Learn more: Visit the official llms.txt website for the complete specification and best practices.