Amid a contentious industry debate over the future of web standards in an AI-driven world, a comprehensive new study provides the first empirical evidence to resolve a key question: does implementing an llms.txt file actually influence a website’s performance? This proposed standard, intended to direct how Large Language Models crawl and utilize online content, has been hailed by some as a crucial piece of future-proof infrastructure while being dismissed by many seasoned SEO professionals as a speculative and inconsequential distraction. To move beyond the polarized opinions and anecdotal claims, researchers conducted a controlled, data-driven analysis to determine if this new file format delivers any measurable impact on a site’s visibility, crawl frequency, or referral traffic from major AI platforms. The findings challenge the prevailing hype and offer a clear verdict on where businesses should be focusing their resources for success in the evolving digital landscape.
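For context, the proposal describes llms.txt as a Markdown file served from a site's root (for example, at /llms.txt) that gives an AI system a title, a short summary, and curated lists of links to the site's most important content. The sketch below is a purely illustrative example of that format; the site name, sections, and URLs are invented for the illustration.

```
# Example Retail Bank

> Plain-language guides, rate tables, and FAQs covering our checking,
> savings, and credit card products.

## Products

- [Savings rates](https://example.com/savings/rates): current APY table, updated daily
- [Credit card comparison](https://example.com/cards/compare): fees and rewards side by side

## Resources

- [FAQ: opening an account](https://example.com/faq/open-account): requirements and timelines
- [Fee schedule](https://example.com/fees): all account and service fees in one table
```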
The Study’s Design and Unambiguous Findings
To provide a definitive answer grounded in data, a rigorous 180-day research study was designed and executed. The methodology involved tracking 10 websites from a diverse range of sectors, including finance, B2B SaaS, e-commerce, insurance, and pet care, to ensure the findings would be broadly applicable. For the initial 90 days, the research team established a performance baseline, monitoring key metrics such as the frequency of AI crawler visits and the volume of referral traffic from leading AI platforms such as ChatGPT, Claude, Perplexity, and Gemini. Following this observation period, an llms.txt file was implemented on each of the 10 sites. The team then tracked the same metrics for an additional 90 days, allowing a direct, apples-to-apples comparison of performance before and after the file was added. This before-and-after structure was designed to isolate the impact of llms.txt from other variables.
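The article does not describe the study's measurement tooling, but crawler activity of this kind is typically derived from server access logs. The following is a minimal, hypothetical sketch of that kind of measurement, assuming a combined-format log file named access.log and an assumed, non-exhaustive list of user-agent substrings for the major AI crawlers; it counts total hits per crawler and, separately, requests for /llms.txt.

```python
import re
from collections import Counter

# Assumed, non-exhaustive user-agent substrings for major AI crawlers.
AI_CRAWLERS = [
    "GPTBot", "ChatGPT-User", "OAI-SearchBot",   # OpenAI
    "ClaudeBot", "anthropic-ai",                 # Anthropic
    "PerplexityBot",                             # Perplexity
    "GoogleOther",                               # Google
    "Meta-ExternalAgent",                        # Meta
]

# Combined log format: the request line is the first quoted field and the
# user agent is the last quoted field on the line.
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+)[^"]*".*"(?P<ua>[^"]*)"\s*$')

hits = Counter()           # total requests per AI crawler
llms_txt_hits = Counter()  # requests for /llms.txt per AI crawler

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        user_agent = match.group("ua")
        for bot in AI_CRAWLERS:
            if bot in user_agent:
                hits[bot] += 1
                if match.group("path").split("?")[0] == "/llms.txt":
                    llms_txt_hits[bot] += 1
                break

for bot in AI_CRAWLERS:
    print(f"{bot:20s} total={hits[bot]:6d}  llms.txt={llms_txt_hits[bot]:4d}")
```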
The results were starkly conclusive. For eight of the 10 websites, implementing an llms.txt file produced no measurable change in either AI traffic or crawler activity; their performance metrics remained essentially flat across the 90-day periods before and after the file was added. One participating site did experience a notable traffic decline of 19.7%, but deeper analysis revealed this was part of a broader, site-wide downturn that affected all traffic channels equally, indicating that the llms.txt file had no influence on the trend. Most intriguingly, two sites saw significant increases in AI traffic, with gains of 25% and 12.5% respectively. While these initially appeared to be success stories, a detailed investigation into the causal factors showed that the llms.txt file was merely a bystander to their growth, not its driver.
Uncovering the Real Drivers of AI Traffic Growth
The most compelling part of the study involved deconstructing the apparent success of the two sites that saw traffic growth, revealing that their gains were attributable to fundamental strategic initiatives rather than the new text file. In the first case, a digital bank that saw a 25% surge in AI-driven traffic, the growth coincided with several powerful, concurrent initiatives. These included a major public relations campaign that secured high-authority coverage in publications like Bloomberg, significantly boosting the bank's credibility. At the same time, the bank restructured its product pages to include easily extractable comparison tables for interest rates and fees, created a dozen new FAQ pages optimized for data extraction, rebuilt its entire resource center, and resolved critical technical SEO issues that had previously hindered crawling. The study concluded that this mix of enhanced authority, strategically structured content, and improved technical health was the true engine of its growth.
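The article does not reproduce the bank's markup, but "FAQ pages optimized for data extraction" generally means pairing clearly structured on-page Q&A with machine-readable structured data. The snippet below is a hypothetical sketch that assembles schema.org FAQPage JSON-LD for two invented questions; the questions, answers, and figures are illustrative only, not details from the study.

```python
import json

# Hypothetical Q&A pairs; in practice these mirror the visible page copy.
faqs = [
    ("What is the current savings account APY?",
     "The standard savings account currently earns 4.10% APY, compounded daily."),
    ("Are there monthly maintenance fees?",
     "No. There are no monthly fees and no minimum balance requirement."),
]

# schema.org FAQPage structured data, typically embedded on the page inside a
# <script type="application/ld+json"> tag.
faq_page = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

print(json.dumps(faq_page, indent=2))
```

The same logic applies to the comparison tables: plain tables with explicit headers are far easier for a model to lift into an answer than figures buried in marketing copy.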
Similarly, the B2B SaaS platform that experienced a 12.5% increase in AI traffic offered another clear example of foundational strategy trumping speculative tactics. A detailed analysis directly traced the traffic spike to the company’s recent launch of 27 new, highly functional, and downloadable AI templates. These were not simple documents but practical tools—such as project management frameworks and complex financial models—that solved real, tangible problems for their target audience. The inherent value of these assets drove user engagement and discovery, a fact corroborated by a simultaneous 18% increase in Google organic traffic specifically to the new templates during the same period. The llms.txt file merely listed these valuable assets; it did not create the demand for them or enhance their utility. In both instances, the growth was a direct result of creating valuable, accessible, and authoritative content, demonstrating that core principles remain paramount.
A Standard Without Support and Its Limited Utility
The primary reason for the ineffectiveness of llms.txt is straightforward but critical: a complete lack of adoption by the major technology companies building the AI models it is meant to guide. Analysis of server logs from the participating websites confirmed that crawlers from leading AI providers, including OpenAI, Anthropic, Google, and Meta, rarely if ever requested the llms.txt file. At present, no major LLM developer has officially committed to parsing or honoring the directives within the standard, leaving it a well-intentioned piece of web infrastructure with no functional purpose today. The ambiguity surrounding the file was exemplified by Google's own actions. The company's brief implementation of llms.txt across some of its developer sites was initially interpreted by some as an endorsement, but that notion was quickly dispelled when a senior company representative clarified that the files were an unintentional side effect of a CMS update and were not being used for discovery purposes.
The study frequently draws an analogy between llms.txt and a traditional sitemap.xml file to explain its current role. Sitemaps are widely recognized as valuable, good-practice infrastructure that can help search engines discover a site’s content more efficiently. However, no experienced SEO professional would ever credit the sitemap file itself as the direct cause of traffic growth. It is the content listed within the sitemap—its quality, relevance, and authority—that ultimately drives performance. In the same way, llms.txt functions as a document that lists a site’s assets. While it might one day help AI models parse a site more efficiently if they choose to adopt it, the file does not inherently make the content more useful or authoritative. The strongest theoretical argument in its favor, token efficiency, is also extremely limited in its practical application. While providing clean content can save computational resources for AI agents, this benefit is primarily relevant to developer-focused platforms whose audiences use AI coding assistants to interact with API documentation, a scenario that does not apply to the vast majority of websites.
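To make the token-efficiency point concrete: an agent that ingests a raw HTML page pays for markup, navigation, and script text that a clean Markdown rendering avoids. The sketch below is a rough, hypothetical comparison using the tiktoken library, assuming two local files, page.html and page.md, that contain the same content in the two forms; the file names and the choice of encoding are assumptions for the example.

```python
import tiktoken

# cl100k_base is the tokenizer used by several recent OpenAI models.
enc = tiktoken.get_encoding("cl100k_base")

def count_tokens(path: str) -> int:
    with open(path, encoding="utf-8", errors="replace") as f:
        # disallowed_special=() prevents errors if the text happens to
        # contain special-token strings such as "<|endoftext|>".
        return len(enc.encode(f.read(), disallowed_special=()))

html_tokens = count_tokens("page.html")  # raw page: markup, nav, scripts
md_tokens = count_tokens("page.md")      # clean Markdown version of the same content

savings = 1 - md_tokens / html_tokens
print(f"HTML: {html_tokens:,} tokens | Markdown: {md_tokens:,} tokens "
      f"({savings:.0%} fewer for the clean version)")
```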
Strategic Focus Over Speculative Tactics
The study ultimately concluded that while implementing an llms.txt file was not harmful, it did not function as a strategic growth lever for any of the businesses at this time. The act of creating the file may have offered the comfort of taking concrete action in an uncertain AI landscape, but it did not translate into functional infrastructure that produced tangible results. The time and resources invested in this speculative effort would have been better allocated to proven strategies that demonstrably yield results in both traditional search and emerging AI-driven environments. The real drivers of success, as clearly demonstrated by the sites that did see growth, were the timeless fundamentals of modern SEO and a robust content strategy. These included creating functional, extractable assets like tools and structured data tables that provide direct, actionable value. Furthermore, reorganizing content into clear formats like comparison tables and FAQs made it easier for AI to pull information directly into answers. Finally, fixing technical barriers to ensure the site was easily crawlable and building authority through external validation proved to be the cornerstones of effective digital strategy.
