Creating an LLMs.txt file takes under 30 minutes if you know what you're doing. This tutorial walks through every step from planning your content selection to validating the deployed file - with ready-to-use templates for different types of sites.
Before You Start: Plan Your Content Selection
The value of LLMs.txt comes entirely from the quality of content you link to. Before writing a single line, answer these questions:
What are the 5–10 topics your site is most authoritative on?
Which pages have the most comprehensive, accurate, and useful content?
Which pages should AI engines use to understand who you are and what you do?
Are there any pages you want AI systems to avoid (thin content, drafts, login pages)?
Aim for 10–30 URLs total. Quality over quantity - a focused LLMs.txt pointing to genuinely excellent pages outperforms a sprawling list of mediocre ones.
Step 1: Create the File
Create a new plain text file named exactly llms.txt (lowercase, no spaces). Use any text editor. The file uses Markdown syntax - no special software required.
Step 2: Write the Header Block
The header identifies your site to AI systems. Use this format:
# [Your Site Name]
> [1–3 sentence description of your site's purpose, audience, and key topics.
> Write as if briefing an AI system on what your site is about and why it is
> authoritative on those topics.]
Example for a SaaS company:
# AI Rank Lab
> AI Rank Lab is a SaaS platform that helps marketers optimize their website
> visibility in AI search engines including ChatGPT, Gemini, Perplexity, and
> Claude. We publish authoritative guides on AEO, GEO, schema markup, LLMs.txt,
> and AI search strategy backed by original research.
Step 3: Organize Content into Sections
Use Markdown H2 headings to group your URLs by type. Each URL follows this format:
- [Page Title](https://yoursite.com/page-url): One-sentence description of what this page covers

Common section names that work well for AI comprehension:

## Core Guides - your most important educational content
## Product Documentation - how-to content for your product
## Research & Data - original data, reports, studies
## Blog - curated selection from your blog
Step 4: Complete Template by Site Type
For a SaaS company blog:
# [Company Name]
> [Company] provides [product category] for [target audience].
> This site contains [type of content: guides, documentation, research].
## Guides
- [Guide Title 1](URL1): [Description]
- [Guide Title 2](URL2): [Description]
## Product Documentation
- [Feature Page](URL3): [Description]
## Research
- [Report Title](URL4): [Description]
Step 5: Deploy to Your Web Server Root
Place llms.txt at your domain root so it's accessible at https://yoursite.com/llms.txt. Deployment varies by platform:
WordPress: Upload via FTP/SFTP to your site's root directory (same folder as wp-config.php), or use a plugin like Yoast to manage it
Next.js: Place in the /public folder - it will be served at /llms.txt automatically
Nginx/Apache: Place in your web root directory (/var/www/html/ or equivalent)
Shopify: Add as a custom page with the URL handle llms and configure the Content-Type via a Shopify app or theme modification
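For Nginx, an explicit rule is one way to guarantee the right Content-Type. This is a sketch, not a required configuration: nginx's bundled mime.types usually maps .txt to text/plain already, and the root path below is illustrative.

```nginx
# Serve llms.txt from the web root with an explicit text/plain type.
# /var/www/html is an example path - use your actual web root.
location = /llms.txt {
    root /var/www/html;
    default_type text/plain;
    charset utf-8;
}
```

The `location = /llms.txt` exact-match form keeps the rule from affecting any other .txt files on the site.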
Step 6: Validate Your Deployment
After uploading, verify these checkpoints:
Visit yoursite.com/llms.txt in a browser - you should see plain text, not a download prompt
Check the Content-Type header using browser DevTools (Network tab) - it should be text/plain
Ensure the file is accessible without logging in (test in incognito mode)
Verify robots.txt doesn't block the file - check that it doesn't contain Disallow: /llms.txt
Test that all listed URLs are live and return 200 status codes
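The checklist above can be scripted. Here is a minimal sketch using only the Python standard library; the base URL is a placeholder, and it assumes your file is already deployed:

```python
import re
import urllib.error
import urllib.request

def extract_urls(llms_txt: str) -> list[str]:
    """Pull absolute URLs out of Markdown links like [Title](https://...)."""
    return re.findall(r"\[[^\]]*\]\((https?://[^)\s]+)\)", llms_txt)

def check_deployment(base_url: str) -> None:
    """Fetch /llms.txt, report its Content-Type, then HEAD-check each listed URL."""
    with urllib.request.urlopen(f"{base_url}/llms.txt") as resp:
        ctype = resp.headers.get("Content-Type", "")
        print(f"llms.txt: status {resp.status}, Content-Type {ctype!r}")
        if not ctype.startswith("text/plain"):
            print("warning: expected text/plain")
        body = resp.read().decode("utf-8")
    for url in extract_urls(body):
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req):
                print(f"OK: {url}")
        except urllib.error.HTTPError as err:
            print(f"BROKEN ({err.code}): {url}")

# check_deployment("https://yoursite.com")  # placeholder domain
```

One caveat: urllib follows redirects silently, so a page behind a 301 still prints OK. For a strict check, compare the final response URL against the URL listed in the file and update any that redirect.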
Step 7: Keep It Current
LLMs.txt is most effective when current. Build a habit of updating it whenever you: publish a new pillar article, launch a new feature page, retire outdated content, or notice AI crawlers are missing your best work. Many teams add "update LLMs.txt" to their content publishing checklist.
LLMs.txt Real-World Examples by Site Type
Here are reference implementations for common site types:
Professional Services / Consultancy:
# Acme Consulting
> Acme Consulting provides management consulting for mid-market manufacturers.
> Our content covers operational efficiency, supply chain optimization, and
> manufacturing technology adoption based on 15 years of client engagements.
## Core Expertise
- [Supply Chain Disruption Playbook](https://acme.com/supply-chain-playbook): 12-step framework for building supply chain resilience
- [Manufacturing Technology ROI Calculator](https://acme.com/tech-roi): Data-driven model for evaluating automation investments
## Research & Reports
- [2026 Manufacturing Technology Survey](https://acme.com/survey-2026): 450-respondent survey on technology adoption rates
## About
- [Our Methodology](https://acme.com/methodology): How Acme approaches client engagements and measures outcomes
E-commerce / Product Site:
# GreenHome Store
> GreenHome Store sells sustainable home products for eco-conscious households.
> Our content focuses on product guides, sustainability comparisons, and practical
> advice for reducing household environmental impact.
## Product Guides
- [Sustainable Cleaning Products Comparison](https://greenhome.com/cleaning-guide): Side-by-side comparison of 40+ eco-cleaning products with ratings
- [Zero-Waste Kitchen Setup Guide](https://greenhome.com/zero-waste-kitchen): Step-by-step guide to eliminating single-use plastics from your kitchen
## Sustainability Resources
- [Carbon Footprint Calculator](https://greenhome.com/carbon-calculator): Estimates household carbon impact and reduction opportunities
Building an llms-full.txt Companion File
The llms-full.txt companion file (placed at /llms-full.txt) provides the full content of your key pages in a single AI-readable document, rather than just links. Some AI crawlers that don't follow links to individual pages will process llms-full.txt directly. Build it by:
Creating a plain text or Markdown document at /llms-full.txt
Including the full text of your 5–10 most important pages, separated by --- dividers
Including page title, URL, and date as a header before each page's content
Keeping the file under 100KB for efficient AI crawler processing
Referencing it from your llms.txt file:
- [Full Content Archive](/llms-full.txt): Complete text of our most important pages for AI processing
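The steps above can be sketched as a small build script. Everything below is illustrative - the page entries, URLs, and dates are placeholders, assuming you keep local copies of your key pages' text:

```python
from pathlib import Path

DIVIDER = "\n\n---\n\n"
MAX_BYTES = 100 * 1024  # keep the file under 100KB for efficient crawler processing

def build_llms_full(pages: list[dict]) -> str:
    """Assemble llms-full.txt: a title/URL/date header before each page's
    full text, with pages separated by --- dividers."""
    sections = []
    for page in pages:
        header = f"# {page['title']}\nURL: {page['url']}\nDate: {page['date']}"
        sections.append(f"{header}\n\n{page['content']}")
    return DIVIDER.join(sections)

# Placeholder entries - replace with your own 5-10 most important pages.
pages = [
    {"title": "Example Guide", "url": "https://yoursite.com/guide",
     "date": "2026-01-15", "content": "Full text of the guide..."},
]
output = build_llms_full(pages)
assert len(output.encode("utf-8")) <= MAX_BYTES, "trim pages: file exceeds 100KB"
Path("llms-full.txt").write_text(output, encoding="utf-8")
```

Running this as part of your publishing pipeline keeps llms-full.txt in sync with the source pages instead of drifting as content is edited.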
Troubleshooting Common LLMs.txt Deployment Issues
| Problem | Likely Cause | Fix |
|---|---|---|
| File downloads instead of displaying | Wrong Content-Type header | Configure the server to serve .txt files as text/plain; in Next.js, ensure no next.config.js rewrites are interfering |
| File returns 404 | Wrong file location or name | Verify the exact path: the file must sit at the domain root and be named llms.txt (all lowercase) |
| File is blocked by robots.txt | Global Disallow rule | Remove the blocking Disallow rule or add an explicit Allow: /llms.txt |
| Links in file return 404 or 301 | Outdated URLs | Run a link check on all URLs in the file quarterly; update or remove dead links |
| AI crawlers not visiting linked pages | robots.txt blocking AI crawlers | Check robots.txt for GPTBot, ClaudeBot, or Google-Extended blocks on linked directories |
| HTTPS vs HTTP mismatch | Mixed content | Use canonical HTTPS URLs throughout the file |
LLMs.txt Update Frequency Recommendations
Different site types benefit from different update cadences:
News/fast-moving content sites: Review weekly; update when major new content is published or old content is significantly outdated
SaaS/product sites: Review with each product launch or major feature update; quarterly at minimum
Professional services/consulting: Review semi-annually; update when new case studies, reports, or major service changes occur
E-commerce: Review seasonally; ensure top guide content is current before peak seasons
Academic/research: Update each time a major paper or data set is published
Common LLMs.txt Mistakes to Avoid
Including too many URLs: A 200-URL LLMs.txt tells AI crawlers nothing about what's most important. Curate ruthlessly - 10–30 high-quality URLs outperform an exhaustive directory
Writing vague descriptions: "Our main article" is useless; "Step-by-step guide to setting up Stripe billing in Next.js with code examples" tells the AI exactly what it will find
Forgetting to update after content changes: LLMs.txt pointing to outdated or deleted content degrades AI trust in your site's reliability
Using relative URLs: Always use absolute URLs (https://yoursite.com/page) - relative paths don't work outside your site context
Not monitoring AI crawler access: Without checking server logs for GPTBot/ClaudeBot visits to your LLMs.txt, you won't know if the file is being used
Treating it as set-and-forget: LLMs.txt is a living document - the best implementations are reviewed and updated on a recurring schedule
Key Takeaways: Creating LLMs.txt
LLMs.txt creation takes under 30 minutes for most sites - there is no reason to delay deployment
The core format is simple: site description header + categorized URL list with one-sentence descriptions
Quality over quantity: 10–30 carefully chosen URLs dramatically outperform exhaustive directories
Deploy to /llms.txt at your domain root; verify with browser, DevTools, and incognito mode
Consider adding an llms-full.txt companion file with full page text for crawlers that don't follow links
Maintain the file with quarterly reviews - a stale LLMs.txt with dead links or outdated content undermines the trust signal it provides to AI crawlers
For advanced LLMs.txt strategies, see our LLMs.txt best practices guide. Track your AI crawler activity with AI Rank Lab.
Frequently Asked Questions
How long does it take to create an LLMs.txt file?
How many URLs should I include in my LLMs.txt?
Does LLMs.txt need to be manually updated?
Can I have more than one LLMs.txt file?
What happens if I make a mistake in my LLMs.txt?
Should my LLMs.txt include competitor comparisons or negative content?
Written by
Devanshu
AI Search Optimization Expert



