What Is llms.txt? How to Add llms.txt in WordPress

Unlocking the Potential of llms.txt for Your Website

Recently, I noticed crawlers from platforms like OpenAI and Google showing up in my website analytics. At first, I was concerned: was my content being harvested without my consent? I also worried that a surge of requests from AI crawlers might slow my site down for real visitors.

However, I soon realised that this could be an opportunity. What if I could steer AI tools—like ChatGPT—towards the content I’d prefer they indexed?

This led me to discover a new format named llms.txt. This file is crafted to assist large language models (LLMs) in identifying which sections of your site are the most beneficial. It enhances how your material appears in AI-driven responses and can even boost your site’s likelihood of being referenced as a source.

In this guide, I’ll walk you through creating an llms.txt file using either a plugin or a manual approach. Whether your goal is enhanced visibility in AI or simply having greater control, this is a fantastic starting point to tailor how AI engages with your content.

What Is an llms.txt File and Why Should You Consider It?

An llms.txt file is a newly proposed standard designed to provide AI tools like ChatGPT or Claude with a well-structured list of your website’s content that you’d like them to use for generating responses.

This file allows you to highlight your most valuable posts, tutorials, or landing pages—content that is clear, credible, and ideal for AI applications.

Think of it as a friendly welcome for AI: “If you’re going to reference my site, here’s what I recommend you review first.”

The file is stored at the root of your website (such as example.com/llms.txt) and is formatted in plain Markdown, permitting links to your sitemap, cornerstone content, or any other material you’d like to be cited.

By including your sitemap, you ensure that AI tools can access a comprehensive index of your site—even if they don’t individually follow every link provided.
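
For instance, a bare-bones llms.txt (the URLs here are just placeholders) might contain nothing more than a sitemap link and a recommended post:

# Example Site

## Sitemaps

- [XML Sitemap](https://example.com/sitemap.xml)

## Key Posts

- [Getting Started Guide](https://example.com/getting-started/)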

This aligns with a growing strategy known as Generative Engine Optimisation (GEO). Some may refer to it as AI content optimisation or AI search visibility. The aim is to help AI models generate better answers while increasing the likelihood of your site being used as a reference.

Bear in mind, though, that llms.txt is still evolving. Not all AI companies are on board yet, but it’s a smart move for those seeking to influence how AI interacts with their content.

llms.txt vs. robots.txt: Understanding the Difference

You might be curious about how llms.txt differs from robots.txt, as both files deal with bots and visibility.

The main distinction is:

  • robots.txt specifies what crawlers are permitted to index and cache.
  • llms.txt offers AI models a curated list of the content you wish for them to reference when formulating AI-based responses.

Here’s a quick comparison:

  • Purpose: robots.txt restricts search crawlers from accessing certain URLs, while llms.txt highlights your most valuable content for AI models.
  • How it works: robots.txt employs User-agent and Disallow rules, while llms.txt utilises a Markdown list of recommended links.
  • Effect on AI: robots.txt can prevent AI models from accessing your site (if respected), while llms.txt may assist AI models in citing and summarising your key content.
  • Adoption: robots.txt is widely supported by search engines and various AI tools, while llms.txt is emerging, with limited and voluntary support.

For a comprehensive AI strategy, you can use both files at the same time: llms.txt to welcome the AI bots you want, and robots.txt to block those you do not.
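
As a quick illustration (the bot name and URLs are just placeholders), a robots.txt rule that blocks an unwanted crawler looks like this:

User-agent: ExampleUnwantedBot
Disallow: /

An llms.txt entry welcoming AI tools to a favourite post, by contrast, is simply a Markdown link:

## Key Posts

- [Beginner's Guide](https://example.com/beginners-guide/)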

The guide below will help you navigate using both files in your AI content strategy. You can jump to the method that suits you best:

Method 1: Generate an llms.txt File Using AIOSEO (Recommended)

The simplest way to create an llms.txt file in WordPress is by using the All in One SEO plugin (AIOSEO). I suggest this method since it handles everything automatically.

It generates a useful llms.txt file that directs AI crawlers to your relevant content and keeps the file updated as you publish new posts and pages.

Step 1: Install and Activate AIOSEO

Begin by installing and activating the AIOSEO plugin.

For a comprehensive tutorial, refer to our step-by-step guide on properly setting up All in One SEO.

The great news is that the llms.txt feature is enabled by default in all versions of AIOSEO, including the free version.

However, if you’re keen on having full control of your content and SEO, consider the additional powerful features you gain by upgrading to the AIOSEO Pro license.

While not essential for llms.txt, these features can be incredibly beneficial in boosting your website traffic:

  • Advanced Rich Snippets (Schema): The Pro version allows for more schema types, helping you achieve eye-catching rich results in Google (like reviews, recipes, or FAQs). Adding schema markup can also enhance how your content shows in AI searches.
  • Redirection Manager: This tool streamlines the process of redirecting bots or users from certain pages, fixing broken links, and monitoring 404 errors, providing you greater control over how both visitors and crawlers navigate your site.

While the llms.txt feature is free, upgrading gives you a much stronger toolkit for managing and growing your website’s presence.

Step 2: Confirm Your llms.txt File

Since this feature is enabled by default, there’s not much configuration needed. AIOSEO already assists in guiding AI bots for you.

You can check the settings by navigating to All in One SEO » General Settings and selecting the ‘Advanced’ tab.

In this section, the ‘Generate an LLMs.txt file’ toggle is activated by default.

When you click the ‘Open LLMs.txt’ button, you’ll see a list of links to your content—exactly what you need for GEO. It signals to AI bots that you’re welcoming them and guiding them through your resources.

Bear in mind, though, that llms.txt is not enforceable—AI tools may or may not choose to act on it.

Method 2: Manually Create an llms.txt File

If you prefer not to use a plugin, you can create an llms.txt file manually. This method involves generating a text file with a curated list of links to your most vital content.

Important: Ensure that another plugin isn’t already generating one for you. If you’re using AIOSEO for other SEO features, you’ll need to disable its default llms.txt file generator by going to All in One SEO » General Settings » Advanced.

Step 1: Create a New Text File

Open a plain text editor on your computer (such as Notepad on Windows or TextEdit on Mac).

Name the new file llms.txt.

Step 2: Insert Your Content Links

Next, add links to the content you want AI bots to access. The aim is to create a straightforward, clear map of your site using markdown headings and lists.

While you could simply list your major URLs, a best practice is to categorise them into sections. Always include a link to your XML sitemap, as it effectively presents all your public content to bots.

Here’s a structured template you can use for your llms.txt file. Just substitute the example URLs with your own:

# My Amazing Website

## Sitemaps

- [XML Sitemap](https://example.com/sitemap.xml)

## Key Pages

- [About Us](https://example.com/about/)
- [Contact](https://example.com/contact/)

## Key Posts

- [Beginner's Guide to Topic X](https://example.com/beginners-guide/)

Step 3: Upload the File to Your Website

After saving your file, upload it to your website’s root directory, commonly named public_html or www.

You can achieve this using an FTP client or the File Manager in your WordPress hosting dashboard. Simply upload the llms.txt file from your device to this folder.
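
If you're comfortable with the command line, you can also copy the file over SSH. Here's a rough sketch; the username, hostname, and remote folder are placeholders, and your host's root folder may have a different name or may not allow SSH access at all:

# Copy llms.txt into the site's root folder (placeholder credentials and path)
scp llms.txt user@example.com:public_html/llms.txt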

Step 4: Confirm Your File is Live

Finally, verify that your file is operational by visiting yourdomain.com/llms.txt in your browser. You should see the list of links you just created.
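
You can also check from the command line by requesting just the response headers (replace yourdomain.com with your actual domain):

# A 200 status means the file is being served from the root of your site
curl -I https://yourdomain.com/llms.txt

If you get a 404 instead, the file is probably not in the root folder, or a caching layer hasn't picked it up yet.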

Bonus: Blocking AI Bots with Your robots.txt File

While guiding AI bots via llms.txt is beneficial for GEO, you might choose to block them altogether. If your goal is to restrict AI companies from using your content for training, the proper method involves adding rules to your robots.txt file.

Your robots.txt file is a powerful tool directing web crawlers. For a comprehensive overview, I suggest checking our complete guide on refining your WordPress robots.txt file.

Important: Altering your robots.txt file can be risky. A small error could inadvertently block major search engines like Google from crawling your site, harming your SEO. I recommend using a plugin like AIOSEO to manage this safely.

Method 1: Modify robots.txt via the AIOSEO Plugin (Recommended)

If you’re already using All in One SEO, this is the safest and easiest way to restrict AI bots. The plugin includes a built-in robots.txt editor that helps prevent errors.

Navigate to All in One SEO » Tools in your WordPress dashboard. Locate and click the ‘Robots.txt Editor’ tab.

First, toggle the switch to enable a custom robots.txt.

You’ll then see an editor where you can add your custom rules. To block specific AI bots, click the ‘Add Rule’ button. Fill in the fields with the User-agent (the bot’s name) and a Disallow rule.

For instance, to block OpenAI’s bot, you would add:

User-agent: GPTBot
Disallow: /

Add rules for as many bots as you need. I’ve included a list of common AI crawlers at the end of this section.

Once you’ve completed your changes, click the ‘Save Changes’ button.

Method 2: Manually Modify robots.txt via FTP

If you don’t use a plugin, you can manually edit the file. This involves connecting to your site’s root directory using an FTP client or the File Manager in your hosting account.

Locate your robots.txt file in your site’s root folder and download it, ensuring you do not delete it.

Open the file in a plain text editor and append your blocking rules at the end of the file.

For example, to restrict Google’s AI crawler, include:

User-agent: Google-Extended
Disallow: /

Save the file and upload it back to the same root directory, overwriting the existing one.

Common AI Bots to Consider Blocking

Here are some common AI user agents you may want to restrict:

  • GPTBot (OpenAI)
  • Google-Extended (Google AI)
  • anthropic-ai (Anthropic / Claude)
  • CCBot (Common Crawl)

You can create separate blocking rules for each of these in your robots.txt file.
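
For instance, blocking all four of the crawlers listed above would look like this in your robots.txt file:

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: CCBot
Disallow: /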

FAQs About llms.txt and robots.txt in WordPress

Here are some frequently asked questions regarding managing AI crawlers:

  1. Will creating an llms.txt file impact my website’s SEO?
    Creating an llms.txt file won’t influence your traditional SEO rankings. Search engines like Google still rely on your robots.txt file and other SEO indicators to determine what gets indexed and ranked.
  2. Can using an llms.txt file increase my traffic from AI?
    Implementing an llms.txt file isn’t a guarantee of increased traffic from AI tools. It can help direct models like ChatGPT to the content you want them to examine, but there’s no assurance they will use it or link to your site.
  3. What distinguishes llms.txt from robots.txt?
    An llms.txt file serves as a guide for AI models, directing them to the content you want them to see: your most beneficial posts and pages. It aims to enhance your GEO strategy by showcasing what deserves citation. Conversely, a robots.txt file blocks crawlers from accessing specific areas of your site. You use llms.txt to say “look here,” and robots.txt to say “don’t go there.”

Wrapping Up Your Content Strategy for the Future

The landscape of AI and Generative Engine Optimisation is evolving rapidly. I recommend reassessing your strategy every few months.

A bot you decide to block today might become a significant traffic source tomorrow, so remain flexible and ready to adjust. You can always transition from blocking to guiding (or vice-versa) as your business objectives change.

I hope this guide empowers you to make informed decisions about the future of your content in the AI realm. If you found it helpful, consider exploring our other resources on growing and safeguarding your website: