Conversation

@rchlfryn rchlfryn commented Feb 1, 2026

Description

While investigating #865, we saw a large number of 500s from Facebook crawlers in the logs. This likely won't fix all of our issues, but it should help with the verified-crawler ones. In the same log I also noticed failing requests for file paths such as PDFs and images, which is why that rule is added as well.

Key Changes

Updated robots.txt
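
The actual rules aren't shown in this conversation, so the following is only an illustrative sketch of what an update along these lines might look like: steering Facebook's crawler (which identifies itself as `facebookexternalhit`) away from routes that were erroring, and disallowing crawling of static file paths like PDFs and images. All paths below are hypothetical, and the `*`/`$` wildcard patterns are a common crawler extension (supported by Google and others) rather than part of the original robots.txt standard:

```
# Sketch only -- the real rules from this PR are not reproduced here.

# Facebook's verified crawler user agent.
User-agent: facebookexternalhit
Disallow: /some-failing-route/   # illustrative path, not from the PR

# Keep crawlers off static file paths that were returning errors.
User-agent: *
Disallow: /*.pdf$
Disallow: /*.png$
Disallow: /*.jpg$
```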

@github-actions
github-actions bot commented Feb 1, 2026

Preview deployment: https://update-robots.preview.avy-fx.org

@rchlfryn rchlfryn added this pull request to the merge queue Feb 1, 2026
Merged via the queue into main with commit 7ce9994 Feb 1, 2026
8 checks passed
@rchlfryn rchlfryn deleted the update-robots branch February 1, 2026 21:09