Customize LLM output
Customize what LLMs receive from your site by filtering content within a page using <llms-only> and <llms-ignore> tags, filtering endpoint output with query parameters, excluding pages with noindex, or serving your own custom files.
Filter within a page
Within pages, use <llms-only> and <llms-ignore> tags to control what content is exposed to AI versus human readers on your documentation site.
Content for AI only
<llms-only> content appears only on the LLM-serving endpoints that external agents (like Claude, ChatGPT, or Cursor) fetch directly: /llms.txt, /llms-full.txt, and each page’s .md/.mdx source. It’s hidden from every human-facing surface, including the rendered page, Copy page, the search widget, and Ask Fern.
Use <llms-only> for:
- Technical context that’s verbose but helpful for AI, like implementation details or architecture notes
- Code-related metadata that would clutter the human UI
- Cross-references that help AI understand relationships between pages
Content for humans only
<llms-ignore> content appears on every human-facing surface, including the rendered page, Copy page, the search widget, and Ask Fern (which indexes and can cite it like any other page content). It’s stripped from the LLM-serving endpoints (/llms.txt, /llms-full.txt, and each page’s .md/.mdx source).
Use <llms-ignore> for:
- Marketing CTAs or promotional content
- Navigation hints meant only for human readers
- Internal comments that should remain only in source files
Example
Fern’s own docs quickstart pairs GitHub UI click-through steps with their CLI equivalent:
Use the fern-api/docs-starter repository as a template for your site:
1. Navigate to fern-api/docs-starter and click the Use this template button (found at the top right of the repository page). You must be logged into GitHub.
2. Choose the option to create a new repository and name it fern-docs.
3. Clone your newly created repository and open it in your favorite code editor (e.g., Cursor, VS Code).
On the docs site, human readers see only the numbered UI steps — the <llms-only> block is hidden. LLMs reading this page’s Markdown output see the inverse: no UI steps, only the gh repo create command they can actually run. Each audience gets the path they can act on.
The guiding principle: UI-only elements belong to human readers, and their programmatic equivalents belong to AI agents. Wrap clickable cards and UI walkthroughs in <llms-ignore> so agents skip them, and pair each UI step with an <llms-only> block that gives the CLI or API equivalent.
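As a minimal sketch of what this pattern can look like in a page's MDX source (the steps and the gh command are illustrative, not copied from the quickstart's actual source):

````mdx
<llms-ignore>
1. Navigate to fern-api/docs-starter and click the **Use this template** button.
2. Choose the option to create a new repository and name it `fern-docs`.
3. Clone your newly created repository and open it in your code editor.
</llms-ignore>

<llms-only>
Create your repository from the template with the GitHub CLI:

```bash
gh repo create fern-docs --template fern-api/docs-starter --clone
```
</llms-only>
````

Human readers see only the numbered steps; the rendered page hides the `<llms-only>` block, and the Markdown endpoints strip the `<llms-ignore>` block.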
To preview and debug what AI sees for any page, you can append .md or .mdx to the page URL to view its Markdown source.
Filter endpoint output
Filter llms.txt and llms-full.txt output with the lang and excludeSpec query parameters to reduce token usage. Parameters can also be combined.
- lang: Filter SDK code examples to a specific language. Common aliases are also accepted: javascript, typescript, js, ts, py, and golang. Case-insensitive.
- excludeSpec: Exclude OpenAPI and AsyncAPI specification sections.
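For example, appended to the endpoint URL (the value format for excludeSpec is an assumption; check your generated output to confirm):

```
/llms-full.txt?lang=python
/llms-full.txt?excludeSpec=true
/llms-full.txt?lang=ts&excludeSpec=true
```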
Exclude whole pages
You can exclude whole pages from LLM endpoints (llms.txt and llms-full.txt) by adding noindex: true to the page’s frontmatter. Pages marked noindex aren’t indexed by search engines but remain visible in your sidebar navigation and can still be accessed directly by URL. To also hide a page from navigation, see Hiding content.
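In the page's frontmatter (the title field is shown only for context):

```yaml
---
title: Internal changelog
noindex: true
---
```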
Serve custom files
To serve your own llms.txt or llms-full.txt instead of the auto-generated versions, point to your files under the agents key in docs.yml:
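A sketch of the docs.yml configuration — the exact subkey names under agents are assumptions here, so verify them against the docs.yml reference:

```yaml
agents:
  llms-txt: custom-llms.txt
  llms-full-txt: custom-llms-full.txt
```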
Paths are relative to the docs.yml file. The CLI validates that each file exists and uploads it as part of your docs deployment. Your custom files are served at the root-level /llms.txt and /llms-full.txt endpoints. Nested paths (e.g., /api-reference/llms.txt) continue to use the auto-generated output.
You can provide one or both files. Any file you don’t specify falls back to the auto-generated version.
To control which crawlers reach your site rather than what they receive, serve a custom robots.txt instead.
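For illustration, a robots.txt that blocks one AI crawler while allowing everything else (GPTBot is OpenAI's published crawler user agent; swap in whichever crawlers you want to manage):

```
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
```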