
feat(metadata): support non-standard directives in robots.ts via other #93206

Merged

timneutkens merged 1 commit into canary from fix-robots-non-standard-directives-clean on May 1, 2026

Conversation

@timneutkens
Contributor

What?

Adds an other field to each rule in MetadataRoute.Robots so app/robots.ts can emit non-standard per-user-agent directives (e.g. Seznam Request-Rate, Yandex Clean-param) alongside the standard allow/disallow/crawlDelay fields.

// app/robots.ts
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      { userAgent: '*', allow: '/' },
      {
        userAgent: 'SeznamBot',
        allow: '/',
        other: { 'Request-Rate': '10/1m' },
      },
    ],
  }
}

emits:

User-Agent: *
Allow: /

User-Agent: SeznamBot
Allow: /
Request-Rate: 10/1m
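
Array values expand into one emitted line per entry. A sketch with illustrative Clean-param values (the directive arguments here are examples, not taken from the PR):

// app/robots.ts
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        userAgent: 'Yandex',
        allow: '/',
        // Each array entry becomes its own Clean-param line
        other: { 'Clean-param': ['ref /products', 'utm_source /'] },
      },
    ],
  }
}

emits:

User-Agent: Yandex
Allow: /
Clean-param: ref /products
Clean-param: utm_source /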

Why?

The current resolveRobots serializer writes only the four standard rule fields and silently drops everything else. Users who need non-standard directives (common for Seznam, Yandex, and other crawlers) therefore have to delete robots.ts entirely and hand-maintain a static robots.txt, losing the type-safe, code-driven workflow.

How?

  • Type (packages/next/src/lib/metadata/types/metadata-interface.ts): factored out a shared RobotsRuleBase containing the existing optional fields plus other?: Record<string, string | number | Array<string | number>>. RobotsFile.rules still enforces userAgent as required on the array form (see the type sketch after this list).
  • Serializer (packages/next/src/build/webpack/loaders/metadata/resolve-route-data.ts): after Crawl-delay and before the trailing blank line, walks rule.other and emits Key: value lines, preserving key casing. Array values expand into repeated lines; null/undefined entries are skipped. This keeps non-standard directives scoped to their User-Agent block (see the serializer sketch after this list).
  • Tests: added two resolveRobots cases covering the Request-Rate reproduction and an array-valued Clean-param, plus null/undefined handling. Existing snapshots are unchanged, confirming the change is additive.
  • Docs: new "Non-standard directives" section in docs/01-app/03-api-reference/03-file-conventions/01-metadata/robots.mdx with TS/JS examples and updated Robots type reference.
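
A sketch of the resulting type shape (field names and types are simplified for illustration; only other is new):

// metadata-interface.ts (sketch, not the exact declaration)
type RobotsRuleBase = {
  allow?: string | string[]
  disallow?: string | string[]
  crawlDelay?: number
  // New: non-standard directives, emitted verbatim within the rule's block
  other?: Record<string, string | number | Array<string | number>>
}

// The array form of rules still requires userAgent on each entry
type RobotsRule = RobotsRuleBase & { userAgent: string | string[] }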
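
And a sketch of the serializer step (the function and variable names are illustrative, not the exact resolve-route-data.ts code):

// Runs after Crawl-delay and before the blank line that closes the
// User-Agent block; key casing is preserved exactly as the user wrote it.
function serializeOtherDirectives(
  other?: Record<string, string | number | Array<string | number>>
): string {
  if (!other) return ''
  let content = ''
  for (const [key, value] of Object.entries(other)) {
    if (value == null) continue // skip null/undefined entries
    const values = Array.isArray(value) ? value : [value]
    for (const v of values) {
      if (v == null) continue // skip null/undefined array members
      content += `${key}: ${v}\n` // arrays expand into repeated lines
    }
  }
  return content
}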

Fixes #89521

Adds an `other` field on each rule in `MetadataRoute.Robots` so users can
declare non-standard robots.txt directives (e.g. Seznam `Request-Rate`,
Yandex `Clean-param`) alongside the standard fields. Values are passed
through verbatim and scoped to the rule's User-Agent block. Array values
emit one line per entry.

Fixes #89521
@timneutkens timneutkens merged commit 65340b2 into canary May 1, 2026
181 of 182 checks passed
@timneutkens timneutkens deleted the fix-robots-non-standard-directives-clean branch May 1, 2026 10:11