
Fix frontend robots.txt #4186

Merged: 3 commits merged into main on Apr 24, 2024

Conversation

@zackkrida (Member) commented Apr 23, 2024

Fixes

Fixes a regression introduced in #4077

Description

This PR returns to using a server middleware for robots.txt in the Openverse frontend. It also adds robots.txt to the frontend .gitignore to help prevent us from accidentally committing a static robots.txt file in the future.

It also adds a Crawl-delay directive to slow down Bing and other search engines when they crawl the pages we do allow them to crawl.
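A minimal sketch of the approach described above, not the actual Openverse code: an Express-style `(req, res)` server middleware that serves different robots.txt bodies per deployment environment. The constant names, rule contents, and delay value here are assumptions for illustration; the real rules live in /frontend/src/server-middleware/robots.js.

```javascript
// Assumed deploy-env constant; the real module imports it from
// ../constants/deploy-env.
const PRODUCTION = "production"

// Illustrative rule sets, not the real ones.
const productionRobots = `# Block search result pages
User-agent: *
Disallow: /search/
Crawl-delay: 10
`

const stagingRobots = `# Block everyone from the staging site
User-agent: *
Disallow: /
`

// Pick the body for the current environment.
function robotsBody(deployEnv) {
  return deployEnv === PRODUCTION ? productionRobots : stagingRobots
}

// The middleware itself: write the text with a plain-text content type.
function robotsMiddleware(_req, res) {
  res.setHeader("Content-Type", "text/plain")
  res.end(robotsBody(process.env.DEPLOYMENT_ENV))
}

module.exports = robotsMiddleware
```

Because the file is generated by code rather than served statically, a stray static robots.txt can no longer shadow it, which is what the .gitignore entry guards against.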

Testing Instructions

Run the frontend locally with `just frontend/run dev` and verify that you see the correct file contents as defined in /frontend/src/server-middleware/robots.js.

Checklist

  • My pull request has a descriptive title (not a vague title like Update index.md).
  • My pull request targets the default branch of the repository (main) or a parent feature branch.
  • My commit messages follow best practices.
  • My code follows the established code style of the repository.
  • I added or updated tests for the changes I made (if applicable).
  • I added or updated documentation (if applicable).
  • I tried running the project locally and verified that there are no visible errors.
  • I ran the DAG documentation generator (if applicable).

Developer Certificate of Origin

Developer Certificate of Origin
Version 1.1

Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
1 Letterman Drive
Suite D4700
San Francisco, CA, 94129

Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.


Developer's Certificate of Origin 1.1

By making a contribution to this project, I certify that:

(a) The contribution was created in whole or in part by me and I
    have the right to submit it under the open source license
    indicated in the file; or

(b) The contribution is based upon previous work that, to the best
    of my knowledge, is covered under an appropriate open source
    license and I have the right under that license to submit that
    work with modifications, whether created in whole or in part
    by me, under the same open source license (unless I am
    permitted to submit under a different license), as indicated
    in the file; or

(c) The contribution was provided directly to me by some other
    person who certified (a), (b) or (c) and I have not modified
    it.

(d) I understand and agree that this project and the contribution
    are public and that a record of the contribution (including all
    personal information I submit with it, including my sign-off) is
    maintained indefinitely and may be redistributed consistent with
    this project or the open source license(s) involved.

@zackkrida zackkrida requested review from a team as code owners April 23, 2024 20:26
@github-actions github-actions bot added the 🧱 stack: frontend Related to the Nuxt frontend label Apr 23, 2024
@openverse-bot openverse-bot added the 🚦 status: awaiting triage Has not been triaged & therefore, not ready for work label Apr 23, 2024
@zackkrida added the 🟧 priority: high (Stalls work on the project or its dependents), 🛠 goal: fix (Bug fix), and 🕹 aspect: interface (Concerns end-users' experience with the software) labels, and removed the 🚦 status: awaiting triage label, Apr 23, 2024
@sarayourfriend (Contributor) left a comment:

Good work figuring out what changed with the traffic!

Suggested change:
- `# Block crawlers from the staging site`
+ `# Block everyone from the staging site`
Contributor:

Ah! I suppose we should have a similar one for the staging API docs, then?

frontend/.gitignore (thread resolved)
@obulat obulat requested review from obulat and removed request for fcoveram April 24, 2024 04:33
@obulat obulat removed the 🚦 status: awaiting triage Has not been triaged & therefore, not ready for work label Apr 24, 2024
@obulat (Contributor) left a comment:

LGTM! I left some non-blocking comments and questions inline.

Note on the testing instructions: you need to run `DEPLOYMENT_ENV=production just frontend/run dev:only` to see the production values; otherwise you'll see the "block everyone" robots.txt :)

@@ -1,5 +1,45 @@
const { LOCAL, PRODUCTION } = require("../constants/deploy-env")

const AI_ROBOTS_CONTENT = `
Contributor:

Would it be easier to maintain a list of user-agent names, and create the block string using it:

uaList.map(ua => `User-agent: ${ua}\nDisallow: /\n`).join("\n")
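For illustration, here is the suggested one-liner with a hypothetical two-entry list (the real AI_ROBOTS_CONTENT names the actual crawler user agents):

```javascript
// Hypothetical user-agent list, for illustration only.
const uaList = ["GPTBot", "CCBot"]

// Build the disallow block from the list, one stanza per user agent.
const blockString = uaList
  .map((ua) => `User-agent: ${ua}\nDisallow: /\n`)
  .join("\n")

// blockString:
// User-agent: GPTBot
// Disallow: /
//
// User-agent: CCBot
// Disallow: /
```

Maintaining a plain array like this keeps the diff small when crawlers are added or removed, compared to editing a hand-written template string.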

@@ -10,13 +50,17 @@ export default function robots(_, res) {
deployEnv === PRODUCTION
? `# Block search result pages
Contributor:

Suggested change:
- ? `# Block search result pages
+ ? `# Block search result pages and single result pages

Disallow: /search/audio/
Disallow: /search/image/
Disallow: /search/
Disallow: /image/
Disallow: /audio/

crawl-delay:
Contributor:

What is the purpose of crawl-delay with no value here? Could you add a comment?

@zackkrida (Member, author) replied:

Typo!
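For reference (not from the PR itself): Crawl-delay takes a value in seconds between successive requests, so an empty `crawl-delay:` like the one flagged above does nothing. A well-formed directive looks like the following, where the agent name and delay value are illustrative:

```
User-agent: bingbot
Crawl-delay: 10
```

Note that Crawl-delay is a non-standard extension: Bing and some other crawlers honor it, while Googlebot ignores it.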

@zackkrida zackkrida merged commit f1477cc into main Apr 24, 2024
41 checks passed
@zackkrida zackkrida deleted the fix-frontend-robots branch April 24, 2024 11:40