
Add robots.txt to restrict crawling of version-specific docs #1056

Merged
merged 1 commit into main on Apr 1, 2025

Conversation

josh-wong
Member

@josh-wong josh-wong commented Apr 1, 2025

Description

This PR adds a robots.txt file to restrict search engines from crawling the docs site so that only the latest version of the docs appears in search results. We need this because search results are currently inconsistent: they often surface older or no-longer-maintained versions of the docs, or versions that visitors aren't using, which is confusing.

Related issues and/or PRs

N/A

Changes made

  • Created a robots.txt file that:
    • Disallows crawling of version number folders (/docs/3.14/, /ja-jp/docs/3.14/, etc.) by using wildcards.
    • Allows crawling of the latest versions of docs at /docs/latest/ and /ja-jp/docs/latest/.
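The robots.txt file itself isn't shown in this conversation, but based on the description above, a minimal sketch might look like the following. The version-number patterns (`3.*`) are illustrative assumptions; the actual file may list different or additional paths.

```txt
# Block crawling of version-numbered docs folders (patterns are illustrative)
User-agent: *
Disallow: /docs/3.*/
Disallow: /ja-jp/docs/3.*/

# Keep the latest docs crawlable
Allow: /docs/latest/
Allow: /ja-jp/docs/latest/
```

Note that `Allow` rules and `*` wildcards were originally non-standard extensions, though they are honored by major crawlers such as Googlebot and are now standardized in RFC 9309, with the most specific (longest) matching rule taking precedence.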

Checklist

The following is a best-effort checklist. If any items in this checklist are not applicable to this PR or are dependent on other, unmerged PRs, please still mark the checkboxes after you have read and understood each item.

  • I have updated the side navigation as necessary.
  • I have commented my code, particularly in hard-to-understand areas.
  • I have updated the documentation to reflect the changes.
  • Any remaining open issues linked to this PR are documented and up-to-date (Jira, GitHub, etc.).
  • My changes generate no new warnings.
  • Any dependent changes in other PRs have been merged and published.

Additional notes (optional)

N/A

@josh-wong josh-wong added the enhancement New feature or request label Apr 1, 2025
@josh-wong josh-wong self-assigned this Apr 1, 2025
@josh-wong josh-wong changed the title Add robots.txt to restrict crawling to version-specific docs Add robots.txt to restrict crawling of version-specific docs Apr 1, 2025
@josh-wong josh-wong merged commit de4c552 into main Apr 1, 2025
1 check passed
@josh-wong josh-wong deleted the add-robots.txt-file branch April 1, 2025 11:25