
How to Create an LLMs.txt File: Step-by-Step Tutorial for 2026

A hands-on, step-by-step tutorial for creating, validating, and deploying an LLMs.txt file - including templates for different site types and a checklist to ensure correct configuration.

Devanshu
8 min read

Creating an LLMs.txt file takes under 30 minutes if you know what you're doing. This tutorial walks through every step from planning your content selection to validating the deployed file - with ready-to-use templates for different types of sites.

Before You Start: Plan Your Content Selection

The value of LLMs.txt comes entirely from the quality of content you link to. Before writing a single line, answer these questions:

  • What are the 5–10 topics your site is most authoritative on?

  • Which pages have the most comprehensive, accurate, and useful content?

  • Which pages should AI engines use to understand who you are and what you do?

  • Are there any pages you want AI systems to avoid (thin content, drafts, login pages)?

Aim for 10–30 URLs total. Quality over quantity - a focused LLMs.txt pointing to genuinely excellent pages outperforms a sprawling list of mediocre ones.

Step 1: Create the File

Create a new plain text file named exactly llms.txt (lowercase, no spaces). Use any text editor. The file uses Markdown syntax - no special software required.

Step 2: Write the Header Block

The header identifies your site to AI systems. Use this format:


# [Your Site Name]

> [1–3 sentence description of your site's purpose, audience, and key topics.
> Write as if briefing an AI system on what your site is about and why it is
> authoritative on those topics.]

Example for a SaaS company:


# AI Rank Lab

> AI Rank Lab is a SaaS platform that helps marketers optimize their website
> visibility in AI search engines including ChatGPT, Gemini, Perplexity, and
> Claude. We publish authoritative guides on AEO, GEO, schema markup, LLMs.txt,
> and AI search strategy backed by original research.

Step 3: Organize Content into Sections

Use Markdown H2 headings to group your URLs by type. Each URL follows this format:

- [Page Title](https://yoursite.com/page-url): One-sentence description of what this page covers

Common section names that work well for AI comprehension:

  • ## Core Guides - your most important educational content

  • ## Product Documentation - how-to content for your product

  • ## Research & Data - original data, reports, studies

  • ## Blog - curated selection from your blog
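If you keep your URL list in data form, the header-and-sections format from Steps 2 and 3 is easy to generate. A minimal sketch, where the site name, URLs, and descriptions are placeholders rather than real pages:

```python
# Sketch: generate an llms.txt body from a dict of sections.
# All names, URLs, and descriptions below are placeholder examples.

SITE = "AI Rank Lab"
SUMMARY = ("AI Rank Lab is a SaaS platform that helps marketers optimize "
           "their visibility in AI search engines.")
SECTIONS = {
    "Core Guides": [
        ("LLMs.txt Best Practices", "https://example.com/llms-txt-guide",
         "Step-by-step guide to writing and deploying llms.txt"),
    ],
}

def build_llms_txt(site: str, summary: str, sections: dict) -> str:
    """Assemble the H1 header, blockquote summary, and H2 link sections."""
    lines = [f"# {site}", "", f"> {summary}", ""]
    for heading, pages in sections.items():
        lines.append(f"## {heading}")
        for title, url, desc in pages:
            lines.append(f"- [{title}]({url}): {desc}")
        lines.append("")
    return "\n".join(lines).rstrip() + "\n"

print(build_llms_txt(SITE, SUMMARY, SECTIONS))
```

Swapping the dict for a CMS export or sitemap query keeps the file current without hand-editing.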

Step 4: Complete Template by Site Type

For a SaaS company blog:


# [Company Name]

> [Company] provides [product category] for [target audience].
> This site contains [type of content: guides, documentation, research].

## Guides
- [Guide Title 1](URL1): [Description]
- [Guide Title 2](URL2): [Description]

## Product Documentation
- [Feature Page](URL3): [Description]

## Research
- [Report Title](URL4): [Description]

Step 5: Deploy to Your Web Server Root

Place llms.txt at your domain root so it's accessible at https://yoursite.com/llms.txt. Deployment varies by platform:

  • WordPress: Upload via FTP/SFTP to your site's root directory (same folder as wp-config.php), or use a plugin like Yoast to manage it

  • Next.js: Place in the /public folder - it will be served at /llms.txt automatically

  • Nginx/Apache: Place in your web root directory (/var/www/html/ or equivalent)

  • Shopify: Shopify doesn't allow arbitrary root-level file uploads, so serving /llms.txt typically requires an app that can serve the file at the domain root or a theme/proxy workaround; verify the result loads as plain text at https://yoursite.com/llms.txt

Step 6: Validate Your Deployment

After uploading, verify these checkpoints:

  1. Visit yoursite.com/llms.txt in a browser - you should see plain text, not a download prompt

  2. Check the Content-Type header using browser DevTools (Network tab) - should be text/plain

  3. Ensure the file is accessible without logging in (test in incognito mode)

  4. Verify robots.txt doesn't block the file - check that your robots.txt doesn't have Disallow: /llms.txt

  5. Test all listed URLs are live and returning 200 status codes
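Checkpoints 1–4 need a live deployment, but the file's format can be linted offline before you upload. A rough sketch using only the standard library; the checks mirror the structure described in Steps 2 and 3 and are an informal lint, not an official validator:

```python
import re

def lint_llms_txt(text: str) -> list[str]:
    """Return a list of format problems found in an llms.txt body (offline checks only)."""
    problems = []
    lines = text.splitlines()
    # Step 2: file should open with an H1 site name and a blockquote summary.
    if not lines or not lines[0].startswith("# "):
        problems.append("first line should be an H1 site name, e.g. '# My Site'")
    if not any(l.startswith("> ") for l in lines):
        problems.append("missing '>' blockquote site summary")
    # Step 3: each link entry is '- [Title](URL): description' with an absolute HTTPS URL.
    link = re.compile(r"^- \[[^\]]+\]\((\S+)\): .+")
    for i, l in enumerate(lines, 1):
        if l.startswith("- "):
            m = link.match(l)
            if not m:
                problems.append(f"line {i}: entry not in '- [Title](URL): description' form")
            elif not m.group(1).startswith("https://"):
                problems.append(f"line {i}: use absolute HTTPS URLs")
    return problems
```

Run it against the file before deployment; an empty list means the basic structure is in order.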

Step 7: Keep It Current

LLMs.txt is most effective when current. Build a habit of updating it whenever you: publish a new pillar article, launch a new feature page, retire outdated content, or notice AI crawlers are missing your best work. Many teams add "update LLMs.txt" to their content publishing checklist.

LLMs.txt Real-World Examples by Site Type

Here are reference implementations for common site types:

Professional Services / Consultancy:


# Acme Consulting

> Acme Consulting provides management consulting for mid-market manufacturers.
> Our content covers operational efficiency, supply chain optimization, and
> manufacturing technology adoption based on 15 years of client engagements.

## Core Expertise
- [Supply Chain Disruption Playbook](https://acme.com/supply-chain-playbook): 12-step framework for building supply chain resilience
- [Manufacturing Technology ROI Calculator](https://acme.com/tech-roi): Data-driven model for evaluating automation investments

## Research & Reports
- [2026 Manufacturing Technology Survey](https://acme.com/survey-2026): 450-respondent survey on technology adoption rates

## About
- [Our Methodology](https://acme.com/methodology): How Acme approaches client engagements and measures outcomes

E-commerce / Product Site:


# GreenHome Store

> GreenHome Store sells sustainable home products for eco-conscious households.
> Our content focuses on product guides, sustainability comparisons, and practical
> advice for reducing household environmental impact.

## Product Guides
- [Sustainable Cleaning Products Comparison](https://greenhome.com/cleaning-guide): Side-by-side comparison of 40+ eco-cleaning products with ratings
- [Zero-Waste Kitchen Setup Guide](https://greenhome.com/zero-waste-kitchen): Step-by-step guide to eliminating single-use plastics from your kitchen

## Sustainability Resources
- [Carbon Footprint Calculator](https://greenhome.com/carbon-calculator): Estimates household carbon impact and reduction opportunities

Building an llms-full.txt Companion File

The llms-full.txt companion file (placed at /llms-full.txt) provides the full content of your key pages in a single AI-readable document, rather than just links. Some AI crawlers that don't follow links to individual pages will process llms-full.txt directly. Build it by:

  1. Creating a plain text or Markdown document at /llms-full.txt

  2. Including the full text of your 5–10 most important pages, separated by --- dividers

  3. Including page title, URL, and date as a header before each page's content

  4. Keeping the file under 100KB for efficient AI crawler processing

  5. Referencing it from your llms.txt file with an absolute URL: - [Full Content Archive](https://yoursite.com/llms-full.txt): Complete text of our most important pages for AI processing
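The steps above can be sketched in a short script. The Page structure is hypothetical; adapt it to however your CMS exports content:

```python
from dataclasses import dataclass

@dataclass
class Page:  # hypothetical structure; map your CMS export onto it
    title: str
    url: str
    date: str
    body: str

def build_llms_full(pages: list, max_bytes: int = 100_000) -> str:
    """Concatenate full page text with title/URL/date headers and --- dividers,
    enforcing the ~100KB budget suggested above."""
    chunks = []
    for p in pages:
        chunks.append(f"# {p.title}\nURL: {p.url}\nDate: {p.date}\n\n{p.body.strip()}")
    doc = "\n\n---\n\n".join(chunks) + "\n"
    if len(doc.encode("utf-8")) > max_bytes:
        raise ValueError(f"llms-full.txt exceeds {max_bytes} bytes; trim the page list")
    return doc
```

Write the returned string to /llms-full.txt as part of your publishing pipeline so it stays in sync with the source pages.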

Troubleshooting Common LLMs.txt Deployment Issues

| Problem | Likely Cause | Fix |
| --- | --- | --- |
| File downloads instead of displaying | Wrong Content-Type header | Configure the server to serve .txt files as text/plain; in Next.js, ensure no next.config.js rewrites are interfering |
| File returns 404 | Wrong file location or name | Verify the exact path: the file must sit at the domain root and be named llms.txt (case-sensitive) |
| File is blocked by robots.txt | Global Disallow rule | Add Allow: /llms.txt to robots.txt above any Disallow rules |
| Links in file return 404 or 301 | Outdated URLs | Run a link check on all URLs in the file quarterly; update or remove dead links |
| AI crawlers not visiting linked pages | robots.txt blocking AI crawlers | Check robots.txt for GPTBot, ClaudeBot, or Google-Extended blocks on linked directories |
| HTTPS vs HTTP mismatch | Mixed content | Use canonical HTTPS URLs throughout the file |
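The robots.txt rows above can be checked locally with Python's built-in parser before you touch the live file. The robots.txt content below is a hypothetical example:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; substitute the text of your live file.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: GPTBot
Allow: /llms.txt
Allow: /blog/
"""

def can_fetch(robots_txt: str, agent: str, path: str) -> bool:
    """Check whether a crawler user agent may fetch a path under these rules."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, f"https://example.com{path}")
```

Running it for each AI user agent (GPTBot, ClaudeBot, Google-Extended) against /llms.txt and your linked directories catches accidental blocks before crawlers hit them.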

LLMs.txt Update Frequency Recommendations

Different site types benefit from different update cadences:

  • News/fast-moving content sites: Review weekly; update when major new content is published or old content is significantly outdated

  • SaaS/product sites: Review with each product launch or major feature update; quarterly at minimum

  • Professional services/consulting: Review semi-annually; update when new case studies, reports, or major service changes occur

  • E-commerce: Review seasonally; ensure top guide content is current before peak seasons

  • Academic/research: Update each time a major paper or data set is published

Common LLMs.txt Mistakes to Avoid

  • Including too many URLs: A 200-URL LLMs.txt tells AI crawlers nothing about what's most important. Curate ruthlessly - 10–30 high-quality URLs outperform an exhaustive directory

  • Writing vague descriptions: "Our main article" is useless; "Step-by-step guide to setting up Stripe billing in Next.js with code examples" tells the AI exactly what it will find

  • Forgetting to update after content changes: LLMs.txt pointing to outdated or deleted content degrades AI trust in your site's reliability

  • Using relative URLs: Always use absolute URLs (https://yoursite.com/page) - relative paths don't work outside your site context

  • Not monitoring AI crawler access: Without checking server logs for GPTBot/ClaudeBot visits to your LLMs.txt, you won't know if the file is being used

  • Treating it as set-and-forget: LLMs.txt is a living document - the best implementations are reviewed and updated on a recurring schedule
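For the monitoring point above, a rough sketch that tallies AI crawler requests for llms.txt in a combined-format access log. The log line layout and bot list are assumptions; adjust both for your server:

```python
from collections import Counter

# User agents named in this article; extend as new AI crawlers appear.
AI_BOTS = ("GPTBot", "ClaudeBot", "Google-Extended")

def count_ai_bot_hits(log_lines, path="/llms.txt"):
    """Tally requests for `path` per AI crawler, assuming combined-format
    access logs where the user agent appears in the quoted final field."""
    hits = Counter()
    for line in log_lines:
        if f"GET {path} " not in line:
            continue
        for bot in AI_BOTS:
            if bot in line:
                hits[bot] += 1
    return hits
```

Feed it your access log (e.g. `count_ai_bot_hits(open("/var/log/nginx/access.log"))`) on a recurring schedule; zero hits over weeks suggests the file isn't being discovered.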

Key Takeaways: Creating LLMs.txt

  • LLMs.txt creation takes under 30 minutes for most sites - there is no reason to delay deployment

  • The core format is simple: site description header + categorized URL list with one-sentence descriptions

  • Quality over quantity: 10–30 carefully chosen URLs dramatically outperform exhaustive directories

  • Deploy to /llms.txt at your domain root; verify with browser, DevTools, and incognito mode

  • Consider adding an llms-full.txt companion file with full page text for crawlers that don't follow links

  • Maintain the file with quarterly reviews - a stale LLMs.txt with dead links or outdated content undermines the trust signal it provides to AI crawlers

For advanced LLMs.txt strategies, see our LLMs.txt best practices guide. Track your AI crawler activity with AI Rank Lab.

Frequently Asked Questions

How long does it take to create an LLMs.txt file?
Planning your content selection takes 15–20 minutes. Writing the file takes another 15–20 minutes. Deployment takes 5–10 minutes depending on your platform. Total: roughly 35–50 minutes for a complete, well-crafted LLMs.txt file.
How many URLs should I include in my LLMs.txt?
Aim for 10–30 URLs representing your most authoritative and useful content. This size is manageable for AI crawlers and focused enough to signal clear topical priorities. Avoid padding with low-quality URLs - quality over quantity.
Does LLMs.txt need to be manually updated?
Yes - LLMs.txt is a static file you maintain manually. However, some CMS plugins (particularly for WordPress) can auto-update it based on post tags or categories. Add updating LLMs.txt to your content publishing workflow to keep it current.
Can I have more than one LLMs.txt file?
You can have one main llms.txt and optionally an llms-full.txt that contains complete page content for AI systems that want richer context. Some sites also create subdomain-specific LLMs.txt files. The root llms.txt is most important.
What happens if I make a mistake in my LLMs.txt?
Unlike robots.txt errors (which can block crawling), LLMs.txt errors are low-risk - AI systems will simply ignore invalid syntax or broken URLs. However, broken links waste crawl budget. Validate all URLs return 200 status before adding them.
Should my LLMs.txt include competitor comparisons or negative content?
Only include content that represents your site's best work. Comparison pages are fine if they're objective and high-quality. Avoid including content that makes negative claims about others or that you wouldn't want associated with your brand in AI answers.

Written by

Devanshu

AI Search Optimization Expert
