Meet /llms.txt: The AI-First Treasure Map Every Site Needs

This started as a small post that ballooned as I dug in and kept finding more questions. You can give this post to your AI buddy or NotebookLM and have a discussion about how to win in Web X.0.

  1. Why /llms.txt Matters
  2. A Quick History
  3. What Does /llms.txt Look Like?
  4. DIY or Automate?
  5. Good Practices Checklist
  6. Early Results & Adoption
  7. How Do LLMs Discover /llms.txt?
    1. Current Discovery Methods
    2. Future Potential for Automatic Discovery
    3. Recommended Good Practices (Today)
  8. How to Submit Your URL Directly to AI Tools?
    1. Quick-check: Is Your URL Ready?
  9. How to Experiment with /llms.txt?
    1. 1. Define Clear Objectives
    2. 2. Choose Targeted Metrics to Track
    3. 3. Create and Implement a Test Version
    4. 4. Baseline Measurement
    5. 5. Launch Your Experiment
    6. 6. Monitor and Analyze Results
    7. 7. Iterate and Optimize
    8. Common Pitfalls to Avoid
  10. The Bottom Line
    1. Let’s talk about it.

I am on a journey to transform from an enterprise software developer into an AI Engineer, and along the way into an AgenticOps Operator. I’m learning so much. I’ve said before on this blog that the way we consume the internet is changing: AI agents are increasingly doing the browsing for us, serving up relevant, curated content without us ever leaving the chat UI. That means businesses need to rethink SEO to attract these agents if they want to reach us.

Imagine giving ChatGPT, Claude, or Perplexity an LLM-focused sitemap highlighting exactly where your site’s most valuable content lives. That’s the growing power of /llms.txt, the latest practice borrowed from SEO but made specifically for large language models (LLMs).

Why /llms.txt Matters

AI tools like ChatGPT don’t pre-crawl your entire site; they pull pages in real time, often burning valuable context window space on ads, nav bars, and irrelevant HTML elements. This inefficiency leads to missed content and inaccurate AI-generated answers. Enter /llms.txt: a concise, Markdown-formatted “treasure map” guiding LLMs directly to your high-value content.

I’m not saying this is the answer, or that it will fix how LLMs surface your website, but it’s a start and a move in the right direction that doesn’t take a huge budget to experiment with.

A Quick History

The /llms.txt concept kicked off when Jeremy Howard (Answer.ai) proposed it in September 2024. Mintlify boosted its popularity, auto-generating /llms.txt files for thousands of SaaS documentation sites. Soon, Anthropic adopted the format, sparking broader acceptance across AI and SEO communities.

What Does /llms.txt Look Like?

Here’s a minimal example:

# Project Name

> A clear, concise summary of your site’s purpose.

## Core Docs
- [Quick Start](https://example.com/docs/quick-start): Installation and basic usage.
- [API Reference](https://example.com/api): Comprehensive REST and webhook documentation.

## Optional
- [Changelog](https://example.com/changelog): Latest updates and release notes.

Each heading creates a clear hierarchy, with short descriptions guiding AI directly to the content you want featured.

Here is a more complete version – https://www.fastht.ml/docs/llms.txt.

DIY or Automate?

Creating /llms.txt is straightforward:

  • Choose 10-20 golden pages covering your most critical content.
  • Write concise descriptions (no keyword stuffing needed).
  • Host it at https://your-domain.com/llms.txt.
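
If you want to script that step, here’s a minimal Python sketch that builds the same structure shown in the example above. The site name, pages, and descriptions are placeholders you’d swap for your own:

    # Minimal sketch: build an llms.txt file from a hand-picked list of pages.
    # Everything below (site name, summary, page lists) is placeholder content.
    from pathlib import Path

    SITE_NAME = "Example Co"
    SUMMARY = "A clear, concise summary of your site's purpose."

    CORE_PAGES = [
        ("Quick Start", "https://example.com/docs/quick-start", "Installation and basic usage."),
        ("API Reference", "https://example.com/api", "REST and webhook documentation."),
    ]
    OPTIONAL_PAGES = [
        ("Changelog", "https://example.com/changelog", "Latest updates and release notes."),
    ]

    def render_section(title, pages):
        # Each section is a "## heading" followed by one "- [name](url): description" per page.
        lines = [f"## {title}"]
        lines += [f"- [{name}]({url}): {desc}" for name, url, desc in pages]
        return "\n".join(lines)

    llms_txt = "\n\n".join([
        f"# {SITE_NAME}",
        f"> {SUMMARY}",
        render_section("Core Docs", CORE_PAGES),
        render_section("Optional", OPTIONAL_PAGES),
    ]) + "\n"

    # Write the file so your web server serves it at /llms.txt.
    Path("llms.txt").write_text(llms_txt, encoding="utf-8")
    print(llms_txt)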

If you’d rather automate:

  • Plugins like the WordPress LLMs-Full.txt Generator or tools like Firecrawl and Mintlify CLI simplify the process, ensuring your map stays fresh.

Here’s an easy way to start building your /llms.txt today. Give your favorite LLM chatbot (ChatGPT, Claude, Gemini…) links to the pages on your website that you want listed. Provide it with a link to https://decoupledlogic.com/2025/07/28/meet-llms-txt-the-ai-first-treasure-map-every-site-needs/ and https://llmstxt.org/ to give it context on /llms.txt. Then ask it to “create an llms.txt file for our website”, and see what you get.

Then submit your /llms.txt to https://directory.llmstxt.cloud/, https://llmstxt.site/, and llmstxthub.com.

Good Practices Checklist

  • Limit your list to fewer than 25 high-value links.
  • Avoid redirects or query parameters.
  • Provide helpful, readable summaries, not keywords.
  • Include only reliable, versioned content.
  • /llms.txt isn’t private; use robots.txt alongside it to control what crawlers may access.
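
To make that checklist easy to apply, here’s a small, hypothetical lint script. The checks mirror the list above (link count, query parameters, readable summaries); the link-matching regex is a simplification of real Markdown parsing, and redirects still have to be checked over HTTP:

    # Minimal sketch: sanity-check an llms.txt file against the checklist above.
    # The thresholds and messages come from the checklist; adjust to taste.
    import re
    import sys
    from urllib.parse import urlparse

    # Matches lines like: - [Label](https://example.com/page): short description
    LINK_RE = re.compile(r"\[(?P<label>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<summary>.+))?")

    def lint_llms_txt(text):
        problems = []
        links = []
        for line in text.splitlines():
            match = LINK_RE.search(line)
            if match:
                links.append(match)
        if len(links) >= 25:
            problems.append(f"{len(links)} links listed; keep it to fewer than 25 high-value pages.")
        for match in links:
            if urlparse(match["url"]).query:
                problems.append(f"Query parameters in {match['url']}; link to the canonical URL instead.")
            if not match["summary"]:
                problems.append(f"No summary after {match['url']}; add a short, readable description.")
        return problems

    if __name__ == "__main__":
        issues = lint_llms_txt(open(sys.argv[1], encoding="utf-8").read())
        print("\n".join(issues) if issues else "Looks good.")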

Early Results & Adoption

As of mid-2025:

  • Prominent sites are actively maintaining /llms.txt files, and the list is growing.
  • Popular SEO plugins like Yoast and Rank Math now support it.
  • Early adopters like LangChain and Cursor report noticeable improvements in AI citation accuracy.
  • We are still very early, and all of this may change if Google decides to jump into this space.

How Do LLMs Discover /llms.txt?

This was my biggest question when I heard about /llms.txt. Here’s what I understand.
Currently, there’s no standardized automatic discovery for /llms.txt like there is for robots.txt or sitemap.xml. Instead, LLMs primarily discover /llms.txt through manual or indirect methods:

Current Discovery Methods

  1. Manual Submission:
    • You explicitly feed your /llms.txt URL to AI agents like ChatGPT (Browse), Claude, Perplexity, or custom-built tools (Semantic Kernel, LangChain, LangGraph).
    • Typically, you’d provide the direct URL (e.g., https://your-site.com/llms.txt) for ingestion.
  2. Community Directories:
    • Directories like directory.llmstxt.cloud, llmstxt.site, and llmstxthub.com index published /llms.txt files so agents and tooling can find yours.
  3. Integration with AI Platforms:
    • Platforms like Mintlify, Firecrawl, or LangChain may ingest URLs proactively from known sources or integrations with SEO/LLM plugins.

Future Potential for Automatic Discovery

A standardized discovery process (like the established robots.txt approach) is likely to emerge as /llms.txt gains adoption:

  • Root-level probing (https://your-site.com/llms.txt) could become a default behavior for AI crawlers.
  • Inclusion in a sitemap (e.g., referencing /llms.txt from your sitemap.xml) could assist automated discovery.
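
Nothing here is standardized yet, but as an illustration, here’s a hedged sketch of what that discovery flow could look like from a crawler’s side: probe the root path first, then fall back to scanning sitemap.xml for an llms.txt reference. The user-agent string and the fallback order are my assumptions:

    # Hypothetical sketch of automatic /llms.txt discovery: none of this is
    # standardized today; the probe order below is an assumption.
    import re
    from urllib.parse import urljoin
    from urllib.request import Request, urlopen
    from urllib.error import HTTPError, URLError

    def _get(url):
        """Fetch a URL and return its text, or None if it can't be retrieved."""
        try:
            req = Request(url, headers={"User-Agent": "example-llm-crawler/0.1"})
            with urlopen(req, timeout=10) as resp:
                return resp.read().decode("utf-8", errors="replace")
        except (HTTPError, URLError):
            return None

    def discover_llms_txt(site_root):
        # 1. Root-level probe, mirroring how robots.txt is found today.
        text = _get(urljoin(site_root, "/llms.txt"))
        if text:
            return text
        # 2. Fall back to scanning sitemap.xml for a URL ending in llms.txt.
        sitemap = _get(urljoin(site_root, "/sitemap.xml")) or ""
        for loc in re.findall(r"<loc>(.*?)</loc>", sitemap):
            if loc.strip().endswith("llms.txt"):
                return _get(loc.strip())
        return None

    if __name__ == "__main__":
        print(discover_llms_txt("https://example.com") or "No /llms.txt discovered.")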

Currently, these methods are under active discussion within AI and SEO communities.

Recommended Good Practices (Today)

  • Explicitly share your /llms.txt URL directly with the platforms you’re targeting.
  • Submit your URL to community directories (like directory.llmstxt.cloud) to improve visibility.
  • Monitor emerging standards to adapt quickly once automatic discovery becomes standard.

How to Submit Your URL Directly to AI Tools?

1. ChatGPT (Browse with Bing)

  • Open ChatGPT with Browsing enabled.
  • Paste your URL with a clear prompt: "Please read and summarize the key points from https://your-site.com/llms.txt"

2. Anthropic Claude or Perplexity

  • Simply paste your URL directly into the chat and prompt clearly: "Review our documentation here: https://your-site.com/llms.txt and answer any product-related questions based on it."

3. LangChain/LangGraph (Python Example)

  • For API-based ingestion, use the following snippet (in newer LangChain versions the loader lives in langchain_community.document_loaders):

    from langchain.document_loaders import WebBaseLoader

    loader = WebBaseLoader("https://your-site.com/llms.txt")
    docs = loader.load()
  • Once loaded, your content is available for inference in your custom AI apps.
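
As a quick follow-up (assuming the load above succeeds), the fetched markdown sits on each Document’s page_content attribute, which is what you’d pass into a prompt or an index:

    # Picking up from the loader above (repeated here so the snippet stands alone).
    from langchain.document_loaders import WebBaseLoader

    docs = WebBaseLoader("https://your-site.com/llms.txt").load()

    # Each Document carries the fetched markdown on .page_content; preview it
    # before wiring it into a prompt template or a vector index.
    print(docs[0].page_content[:500])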

Quick-check: Is Your URL Ready?

  • Paste your URL into any browser:
    • It should directly show your markdown content (no login or redirects).
  • Example success case:

    # Company Documentation
    > Core Product Resources

    ## Essential Pages
    - [Getting Started](https://your-site.com/start): Quick onboarding steps.
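
For a scripted version of the same check, here’s a minimal sketch that verifies the URL loads, doesn’t redirect, and looks like markdown. The URL and user-agent below are placeholders:

    # Minimal sketch: quick readiness check for a /llms.txt URL.
    # Replace URL with your own; a 4xx/5xx response will raise an HTTPError.
    from urllib.request import Request, urlopen

    URL = "https://your-site.com/llms.txt"

    req = Request(URL, headers={"User-Agent": "llms-txt-readiness-check/0.1"})
    with urlopen(req, timeout=10) as resp:
        status = resp.status
        final_url = resp.geturl()  # urlopen follows redirects silently, so compare URLs
        body = resp.read().decode("utf-8", errors="replace")

    print(f"status={status}")
    print(f"redirected={final_url != URL}")
    print(f"looks_like_markdown={body.lstrip().startswith('#')}")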

How to Experiment with /llms.txt?

Here’s how your company can effectively experiment with /llms.txt to measure its real-world impact:

1. Define Clear Objectives

Start by specifying exactly what you want to achieve. Typical objectives include:

  • Better AI-generated citations
  • Increased discoverability of key content
  • Improved user engagement via AI-driven channels

2. Choose Targeted Metrics to Track

Focus on measurable outcomes, including:

  • Citations and backlinks from AI tools
  • Organic traffic from AI-powered search results
  • Changes in session duration or page depth from AI referrals
  • Reduction in support queries due to better AI answers
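
As one concrete way to track the AI-traffic metrics above, here’s a small sketch that tags analytics referrers coming from known AI chat domains. The domain list is an assumption and will change over time; extend it with whatever actually shows up in your logs:

    # Minimal sketch: flag analytics referrers that look like AI-driven traffic.
    # The domain list is an assumption; grow it as new agents appear in your logs.
    from urllib.parse import urlparse

    AI_REFERRER_DOMAINS = {
        "chatgpt.com",
        "chat.openai.com",
        "perplexity.ai",
        "claude.ai",
        "gemini.google.com",
        "copilot.microsoft.com",
    }

    def is_ai_referral(referrer_url):
        """Return True if the referrer host matches a known AI chat/search domain."""
        host = urlparse(referrer_url).netloc.lower()
        return any(host == d or host.endswith("." + d) for d in AI_REFERRER_DOMAINS)

    # Example: classify a few referrers pulled from an analytics export.
    for referrer in [
        "https://chatgpt.com/",
        "https://www.perplexity.ai/search?q=llms+txt",
        "https://www.google.com/search?q=llms+txt",
    ]:
        label = "AI" if is_ai_referral(referrer) else "web"
        print(f"{label:4} {referrer}")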

3. Create and Implement a Test Version

Develop a concise /llms.txt:

  • Select 10–20 high-value pages.
  • Clearly label your content and hierarchy.
  • Deploy it at the root: your-domain.com/llms.txt.

Example structure:

# Company Name
> Your one-line value proposition.

## Core Content
- [Getting Started](https://your-domain.com/start): Setup & onboarding guide.
- [Product Overview](https://your-domain.com/product): Key features & use cases.

## Secondary Content
- [Knowledge Base](https://your-domain.com/kb): Common issues & solutions.

4. Baseline Measurement

Before releasing the /llms.txt publicly:

  • Capture existing traffic metrics from AI-generated citations.
  • Document current quality of AI-generated summaries and answers referencing your content.
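
A lightweight way to keep that baseline consistent is to log each manual spot check in a fixed format, so the “before” and “after” comparisons line up. Here’s a minimal sketch; the field names and CSV output are my assumptions, not any kind of standard:

    # Minimal sketch: a fixed record format for baseline citation spot checks.
    import csv
    from dataclasses import dataclass, asdict, fields
    from datetime import date

    @dataclass
    class CitationCheck:
        checked_on: str   # ISO date of the spot check
        tool: str         # e.g. "ChatGPT (Browsing)", "Claude", "Perplexity"
        prompt: str       # the question you asked
        cited_us: bool    # did the answer cite or link to your site?
        cited_url: str    # which page, if any
        notes: str        # accuracy / quality observations

    baseline = [
        CitationCheck(str(date.today()), "Perplexity",
                      "How do I get started with <your product>?",
                      False, "", "Answer cites a competitor's docs."),
    ]

    with open("llms_txt_baseline.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(CitationCheck)])
        writer.writeheader()
        writer.writerows(asdict(row) for row in baseline)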

5. Launch Your Experiment

Share your /llms.txt URL manually with key inference-time agents to jumpstart discovery:

  • ChatGPT (Browsing), Claude, Perplexity, Google AI Overviews, LangChain tools, etc.

(Automatic discovery isn’t standardized yet, so manual submission is essential.)

6. Monitor and Analyze Results

Check in regularly (weekly or monthly) and add the results to your reporting:

  • Increases in AI-driven referral traffic
  • Quality improvements in content citations by AI
  • Enhanced accuracy in AI-generated summaries or Q&A referencing your site
  • User behavior analytics from referrals (page views, bounce rates, conversions)

7. Iterate and Optimize

Based on your findings, adjust your /llms.txt strategy:

  • Add or remove pages based on AI citation performance.
  • Improve content descriptions to guide AI context better.
  • Consider automation tools (like Mintlify CLI or Firecrawl) for frequent updates.

Common Pitfalls to Avoid

  • Letting the file sprawl: keep /llms.txt tight and precise.
  • Skimping on context: provide clear, succinct descriptions, because the AI reads the prose carefully.
  • Letting it go stale: update the file regularly based on actual usage data.

The Bottom Line

Think of /llms.txt as your AI landing page, optimized to direct AI models straight to your best resources. Regularly updating this file with your release cycle can boost AI-driven engagement, accuracy, and visibility in an increasingly AI-first world. ASO (Agent Search Optimization) and similar concepts are becoming a thing, and now is a good time to start learning about the coming changes to SEO and internet marketing.

Let’s talk about it.

Ready to implement your own AI-first treasure map? Starting your /llms.txt experiment today puts your brand ahead of the curve on AI-driven discovery and citation quality. If you’d like help setting up measurement tools, or a sample /llms.txt tailored to your site, reach out.

Want to dig into it? Let’s talk. I’m always down to talk about all things AI, agents, and AgenticOps.
