Customize LLM output


Customize what LLMs receive from your site by filtering content within a page using <llms-only> and <llms-ignore> tags, filtering endpoint output with query parameters, excluding pages with noindex, or serving your own custom files.

Filter within a page

Within pages, use <llms-only> and <llms-ignore> tags to control what content is exposed to AI versus human readers on your documentation site. These tags affect all Markdown output intended for AI consumption: LLM endpoints, Copy page, and View as Markdown.

Content for AI only

Use the <llms-only> tag to show content to AI but hide it from human readers on your documentation site. This is useful for:

  • Technical context that’s verbose but helpful for AI, like implementation details or architecture notes
  • Code-related metadata that would clutter the human UI
  • Cross-references that help AI understand relationships between pages

Content for humans only

Use the <llms-ignore> tag to show content to human readers on your documentation site while hiding it from AI. This is useful for:

  • Marketing CTAs or promotional content
  • Navigation hints meant only for human readers
  • Internal comments that should remain only in source files

Example

In the example below, human readers see the sign-up callout but not the scope note, while AI agents receive the scope note but not the callout:

docs/authentication.mdx
## Authentication

<llms-only>
This endpoint requires a bearer token scoped to `read:users`.
</llms-only>

Include your API key in the `Authorization` header.

<llms-ignore>
<Callout intent="info">
Need an API key? [Sign up for free](https://example.com/signup) to get started.
</Callout>
</llms-ignore>

Rendered for human readers, the page shows:

Authentication

Include your API key in the Authorization header.

Need an API key? Sign up for free to get started.

To preview and debug what AI sees for any page, you can append .md or .mdx to the page URL to view its Markdown source.
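This URL convention can be scripted when you want to inspect the AI-facing Markdown for many pages at once. Below is a minimal sketch; the docs domain is hypothetical, and the helper simply appends `.md` to a page path:

```python
from urllib.parse import urlsplit, urlunsplit

def markdown_source_url(page_url: str) -> str:
    """Return the URL of a page's raw Markdown source by appending .md.

    The domain below is a placeholder; substitute your own docs site.
    """
    scheme, netloc, path, query, frag = urlsplit(page_url)
    return urlunsplit((scheme, netloc, path.rstrip("/") + ".md", query, frag))

print(markdown_source_url("https://docs.example.com/authentication"))
# https://docs.example.com/authentication.md
```

Fetching that URL shows exactly what AI agents receive, with `<llms-only>` content included and `<llms-ignore>` content stripped.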

Filter endpoint output

Filter llms.txt and llms-full.txt output with the lang and excludeSpec query parameters to reduce token usage. Parameters can also be combined.

Example
/llms.txt?lang=python
/llms-full.txt?excludeSpec=true
/llms-full.txt?lang=python&excludeSpec=true
lang
'node' | 'python' | 'java' | 'ruby' | 'go' | 'csharp' | 'swift'

Filter SDK code examples to a specific language. Common aliases are also accepted: javascript, typescript, js, ts, py, and golang. Case-insensitive.

excludeSpec
boolean

Exclude OpenAPI and AsyncAPI specification sections.

Exclude whole pages

You can exclude whole pages from LLM endpoints (llms.txt and llms-full.txt) by adding noindex: true to the page’s frontmatter. Pages marked noindex aren’t indexed by search engines but remain visible in your sidebar navigation and can still be accessed directly by URL. To also hide a page from navigation, see Hiding content.

docs/pages/internal-notes.mdx
---
title: Internal notes
noindex: true
---

Serve custom files

To serve your own llms.txt or llms-full.txt instead of the auto-generated versions, point to your files under the agents key in docs.yml:

docs.yml
agents:
  llms-txt: ./path/to/llms.txt
  llms-full-txt: ./path/to/llms-full.txt

Paths are relative to the docs.yml file. The CLI validates that each file exists and uploads it as part of your docs deployment. Your custom files are served at the root-level /llms.txt and /llms-full.txt endpoints. Nested paths (e.g., /api-reference/llms.txt) continue to use the auto-generated output.

You can provide one or both files. Any file you don’t specify falls back to the auto-generated version.
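The existence check the CLI performs can be reproduced locally before deploying. A minimal sketch, assuming the parsed `agents` section is passed in as a plain dict (the keys mirror the config above; the validation function itself is hypothetical, not part of any CLI):

```python
from pathlib import Path

def missing_agent_files(docs_yml_path: str, agents: dict[str, str]) -> list[str]:
    """Return agents-section paths that do not exist on disk.

    Paths resolve relative to the directory containing docs.yml,
    matching how the deployment validates them.
    """
    base = Path(docs_yml_path).parent
    return [p for p in agents.values() if not (base / p).is_file()]
```

An empty return value means both custom files are in place; any listed path would fail validation at deploy time.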