May 4, 2026

Custom robots.txt

You can now serve your own robots.txt at the root of your documentation site by pointing the agents.robots-txt setting at a file in your repo. Use this to opt specific AI crawlers like GPTBot or ClaudeBot in or out, gate sensitive sections from indexing, or signal training and search preferences with the Cloudflare Content Signals Policy. Your file is served verbatim at /robots.txt; after your content, Fern appends a managed block disallowing internal API routes.

docs.yml

agents:
  robots-txt: ./robots.txt

Read the docs
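As a sketch, the robots.txt file you point to could look like the following. The paths and policy choices here are hypothetical, and the Content-Signal line follows Cloudflare's Content Signals Policy syntax:

```text
# Block OpenAI's GPTBot entirely (hypothetical policy choice)
User-agent: GPTBot
Disallow: /

# Allow Anthropic's ClaudeBot, but gate a sensitive section from indexing
User-agent: ClaudeBot
Disallow: /internal-notes/

# All other crawlers: allow search indexing, opt out of AI training
# (Content-Signal syntax per Cloudflare's Content Signals Policy)
User-agent: *
Content-Signal: search=yes, ai-train=no
Disallow:
```

Since the file is served verbatim, whatever rules you write are exactly what crawlers see; Fern's managed block for internal API routes is appended after your content.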

Multi-source docs

Multi-source docs lets each team publish independently to a shared custom domain — for example, docs.nvidia.com spans sub-paths like /nvcf, /brev, and /aiperf, each owned by a different product team and repository.

Set multi-source: true on the instance and reference a global theme for consistent branding across repositories.

docs.yml

global-theme: my-org-theme

instances:
  - url: example.docs.buildwithfern.com/product-a
    custom-domain: docs.example.com/product-a
    multi-source: true

Read the docs
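To sketch the independent-publishing setup described above, a second team's repository could ship its own docs.yml that reuses the same global theme while claiming a different sub-path on the shared domain (the product name and URLs here are hypothetical):

```yaml
# docs.yml in a separate repository, owned by a different product team
global-theme: my-org-theme   # same shared theme for consistent branding

instances:
  - url: example.docs.buildwithfern.com/product-b
    custom-domain: docs.example.com/product-b   # different sub-path, same domain
    multi-source: true
```

Each repository publishes on its own schedule, and the shared theme keeps docs.example.com visually consistent across sub-paths.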