Why llms.txt is not yet a ratified standard and what you should know before implementing it
This page previously contained implementation guidance for llms.txt. However, after careful analysis and industry observation, we've updated this guide to explain why llms.txt is not yet ready for production use and why we recommend caution before implementing it on your website.
The concept of llms.txt emerged as a proposed method for websites to communicate with Large Language Models (LLMs) and AI systems. While the idea has generated interest in the AI and SEO communities, it's important to understand that llms.txt is not a ratified standard and lacks the formal governance, industry adoption, and technical validation that established standards possess.
This guide will help you understand what llms.txt is, why it hasn't achieved standard status, the concerns surrounding it, and what proven alternatives you should consider instead.
llms.txt is a proposed plain-text file format that websites would place in their root directory to provide structured information specifically for Large Language Models. The concept is modeled on robots.txt but is designed to help LLMs understand and use website content more effectively.
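As circulated informally (for example on llmstxt.org), the proposal sketches a small Markdown file: an H1 title, a blockquote summary, and H2 sections of annotated links. A hypothetical example, with made-up names and paths:

```markdown
# Example Co

> Example Co sells widgets. This file summarizes the site for LLMs.

## Docs

- [Product guide](https://example.com/docs/guide.md): how the widgets work
- [API reference](https://example.com/docs/api.md): endpoints and parameters

## Optional

- [Company history](https://example.com/about/history.md)
```

Note that even this shape is only a convention from the original proposal; as discussed below, no authoritative specification pins it down.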
The basic idea behind llms.txt is to give website owners a standardized way to summarize what their site is about, point AI systems to the most important pages, and supply content in a format LLMs can ingest easily.
While these goals are reasonable, the implementation and standardization of llms.txt have not progressed in a way that makes it a reliable or recommended approach for AI visibility optimization.
As of October 2025, llms.txt remains in what can best be described as an "informal proposal" stage. Here's the current situation:
| Aspect | Current Status | Comparison to Established Standards |
|---|---|---|
| Governance | No formal standards body | robots.txt, Schema.org have formal governance |
| Industry Adoption | Limited, mostly experimental | Established standards have widespread adoption |
| LLM Platform Support | No official confirmation from major platforms | Standards like Schema.org are officially supported |
| Documentation | Informal, scattered across blogs | Standards have comprehensive official documentation |
| Version Control | No formal versioning | Standards have clear version management |
| Testing Tools | None available from major platforms | Google, Bing provide validation tools for standards |
For a protocol or format to become a "standard" in the technical sense, it typically needs to go through a formal standardization process with an organization such as the IETF (which standardized robots.txt as RFC 9309), the W3C, the WHATWG, or ISO.
llms.txt has not been submitted to, reviewed by, or approved by any of these standardization bodies.
It's worth noting that robots.txt itself started as an informal convention in 1994. However, it took nearly 30 years and widespread industry adoption before it was formally standardized as RFC 9309 in 2022. The key difference is that robots.txt had near-universal support from search engines before standardization. llms.txt does not have equivalent support from AI platforms.
Beyond the lack of formal standardization, there are several practical concerns about implementing llms.txt on your website:
Without a governing body or authoritative source, there is no single, canonical definition of what a valid llms.txt file should contain or how it should be formatted.
Because there's no authoritative specification, you could implement llms.txt based on one source's recommendations, only to find that major AI platforms (if they ever do support it) expect a different format entirely. This creates technical debt and maintenance burden without guaranteed benefit.
As of October 2025, there is no public evidence that major AI platforms (OpenAI, Anthropic, Google, Perplexity, or others) actively fetch or use llms.txt files.
This means that implementing llms.txt currently provides no guaranteed benefit for AI visibility. You would be creating and maintaining a file that may not be read or used by any AI system.
Several technical concerns make llms.txt problematic even as an informal approach.
Much of what llms.txt aims to accomplish is already achievable through established, well-supported standards:
Creating an llms.txt file means taking on one more artifact to keep in sync with your site: every content change risks leaving the file stale, and outdated guidance may be worse than none.
Unlike structured data, which you can validate using Google's Rich Results Test or the Schema.org validator, llms.txt has no official validation tooling: there is no way to confirm that a file is well-formed or that any AI system has read it.
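Since no official validator exists, the most you could do is sanity-check a file against the informally proposed shape (H1 title, `##` link sections, Markdown links). A minimal, hypothetical sketch; the function name and the checks themselves are our own, not part of any specification:

```python
import re

def check_llms_txt_shape(text: str) -> list[str]:
    """Report deviations from the informally proposed llms.txt shape.
    This mirrors a blog-circulated convention, not any ratified spec."""
    problems = []
    lines = [ln for ln in text.splitlines() if ln.strip()]
    # The proposal's examples start with an H1 project title.
    if not lines or not lines[0].startswith("# "):
        problems.append("missing H1 title on the first line")
    # Content is grouped under '## ' link sections.
    if not any(ln.startswith("## ") for ln in lines):
        problems.append("no '## ' link sections found")
    # Each section entry is a Markdown link.
    if not re.findall(r"\[[^\]]+\]\([^)]+\)", text):
        problems.append("no Markdown links found")
    return problems
```

Even a clean result from a check like this tells you nothing about whether any AI platform will ever read the file.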
Instead of implementing an unproven format like llms.txt, focus on these established, well-supported methods for improving AI visibility:
Structured data using Schema.org vocabulary is the most effective way to help AI systems understand your content:
Read our comprehensive guide to structured data implementation
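For example, a minimal JSON-LD block describing an article (all names, headlines, and dates here are hypothetical placeholders):

```html
<!-- Hypothetical example: a minimal Schema.org Article marked up as JSON-LD -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Why llms.txt is not yet a ratified standard",
  "author": { "@type": "Organization", "name": "Example Co" },
  "datePublished": "2025-10-01"
}
</script>
```

Unlike llms.txt, markup like this can be checked immediately with Google's Rich Results Test or the Schema.org validator.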
Proper use of HTML5 semantic elements helps AI systems understand content structure:
- Use `<article>`, `<section>`, `<nav>`, and `<aside>` appropriately
- Use `<main>`, `<header>`, and `<footer>` for page structure

AI systems are increasingly sophisticated at understanding natural language content.
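A hypothetical page skeleton showing how these semantic elements fit together (the element choices, not the placeholder content, are the point):

```html
<!-- Hypothetical skeleton: landmark elements give crawlers, AI systems,
     and assistive technology an explicit content outline -->
<body>
  <header>
    <nav><!-- site navigation links --></nav>
  </header>
  <main>
    <article>
      <h1>Page topic</h1>
      <section>
        <h2>First subtopic</h2>
        <p>Primary content lives here.</p>
      </section>
    </article>
    <aside><!-- related links, tangential content --></aside>
  </main>
  <footer><!-- site-wide footer --></footer>
</body>
```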
Control how AI systems and crawlers access your content using established protocols such as robots.txt rules and `X-Robots-Tag` or meta robots directives.
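For example, a robots.txt that allows ordinary search crawling while opting out of several AI training crawlers. The user-agent tokens shown here (GPTBot, Google-Extended, CCBot) are published by their vendors as of this writing, but they change over time, so verify them against each vendor's current documentation:

```
# Opt out of AI training crawlers while allowing everything else
# (verify current user-agent tokens in each vendor's docs)
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Unlike llms.txt, these directives are documented by the platforms themselves and their crawlers have publicly committed to honoring them.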
Help AI crawlers discover and prioritize your content with an up-to-date XML sitemap, referenced from robots.txt and submitted via search console tools.
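For instance, a minimal XML sitemap following the sitemaps.org protocol (the URLs and dates are hypothetical); the `lastmod` values help crawlers prioritize recently updated pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-10-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/guide/structured-data</loc>
    <lastmod>2025-09-15</lastmod>
  </url>
</urlset>
```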
All of the alternatives listed above are proven, standardized approaches with confirmed support from major platforms. They provide measurable benefits and have extensive documentation, tools, and community support. In contrast, llms.txt remains experimental with no confirmed benefits.
Based on our analysis and current industry status, we do not recommend implementing llms.txt at this time for the following reasons:
| Factor | Assessment |
|---|---|
| Confirmed Benefits | None - no AI platform has confirmed they use llms.txt |
| Standardization | Not standardized, no governing body |
| Industry Support | No commitment from major AI platforms |
| Maintenance Burden | Additional file to maintain without proven benefit |
| Alternative Availability | Proven alternatives (Schema.org, etc.) available |
| Risk | Low risk but zero confirmed benefit |
| ROI | Poor - time better spent on proven methods |
Instead of implementing llms.txt, we recommend investing in the proven alternatives above: implement Schema.org structured data, keep your technical SEO and semantic markup sound, and monitor how AI platforms actually surface your content.
We would reconsider our position on llms.txt if major AI platforms officially confirm that they read the file, if a recognized standards body takes up the proposal, or if a stable, authoritative specification with validation tooling emerges.
Until then, llms.txt remains an interesting concept but not a production-ready solution.
While the motivation behind llms.txt is understandable (website owners want to communicate effectively with AI systems), its current state makes it unsuitable for production implementation: it is unstandardized, unsupported by the major AI platforms, and largely redundant with existing standards.
For organizations serious about AI visibility, the most effective approach remains focusing on established standards and best practices: Schema.org structured data, semantic HTML, clean technical SEO, and well-maintained sitemaps.
Connectica's team focuses on established, effective methods for improving AI visibility using proven standards and best practices. We can help you implement Schema.org structured data, optimize your technical SEO, and create content that AI systems understand and reference. Contact us for strategies that actually work.