
feat: block /debug route from search engine indexing#3815

Merged
MarkusNeusinger merged 5 commits into main from claude/fix-debug-tracking-robots-LQy6E
Jan 12, 2026
Conversation

@MarkusNeusinger (Owner)

Adds Disallow: /debug to robots.txt to prevent search engines
from crawling and indexing the internal debug dashboard.

This follows SEO best practices for internal admin/debug tools.
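The thread does not show the resulting file, but the frontend app/public/robots.txt described here would presumably read along these lines (an assumed sketch, not the actual file contents):

```
User-agent: *
Disallow: /debug
```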

Adds GET /robots.txt endpoint that blocks all crawlers with Disallow: /.
APIs should not be indexed by search engines.

Changes:
- Add robots.txt endpoint in api/routers/seo.py
- Add comprehensive tests (unit, integration, e2e)
- Update docs/reference/seo.md with robots.txt documentation
- Social media bots remain unaffected (they fetch og:images directly)

Follows best practices for API SEO management.
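As a framework-free illustration of the two policies described above (the actual handler lives in api/routers/seo.py, which is not shown in this thread, and render_robots_txt is a hypothetical helper, not project code), a small function that renders a robots.txt body from a list of disallowed path prefixes:

```python
def render_robots_txt(disallowed: list[str]) -> str:
    """Render a robots.txt body that applies to all user agents.

    Pass ["/"] to block every route (the backend API policy in this PR),
    or ["/debug"] to block only the debug dashboard (the frontend policy).
    """
    lines = ["User-agent: *"]
    lines += [f"Disallow: {path}" for path in disallowed]
    return "\n".join(lines) + "\n"

# Backend API: block all crawlers from every route.
API_ROBOTS = render_robots_txt(["/"])       # "User-agent: *\nDisallow: /\n"

# Frontend app: block only the internal debug dashboard.
APP_ROBOTS = render_robots_txt(["/debug"])  # "User-agent: *\nDisallow: /debug\n"
```

Serving such a body with a text/plain content type is all the endpoint needs to do; if the backend is FastAPI (which the TestClient-based tests suggest, though the thread does not confirm it), PlainTextResponse would be the natural response class.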
Copilot AI review requested due to automatic review settings January 12, 2026 17:22
Contributor

Copilot AI left a comment


Pull request overview

This PR adds robots.txt functionality to both the frontend and backend to control search engine crawling. The frontend blocks only the /debug route while the backend API blocks all routes from search engine indexing (following best practices for APIs).

Changes:

  • Added Disallow: /debug to frontend robots.txt to prevent search engine indexing of debug dashboard
  • Added backend API endpoint GET /robots.txt that blocks all API routes from search engines
  • Added comprehensive test coverage (unit, integration, and e2e tests)
  • Updated SEO documentation with detailed robots.txt configuration for both frontend and backend

Reviewed changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 3 comments.

Summary per file:
  app/public/robots.txt: added Disallow: /debug to block the debug route from indexing while allowing all other routes
  api/routers/seo.py: added a new GET /robots.txt endpoint that blocks all API routes (Disallow: /)
  docs/reference/seo.md: added a documentation section for robots.txt configuration, explaining the frontend vs backend differences
  tests/unit/api/test_routers.py: added a unit test for the backend robots.txt endpoint
  tests/integration/api/test_api_endpoints.py: added an integration test for the backend robots.txt endpoint
  tests/e2e/test_api_postgres.py: added an e2e test for the backend robots.txt endpoint against a real PostgreSQL database

Comment thread tests/unit/api/test_routers.py Outdated
Comment thread tests/integration/api/test_api_endpoints.py Outdated
Comment thread tests/e2e/test_api_postgres.py Outdated
@codecov

codecov bot commented Jan 12, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.


Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Copilot AI review requested due to automatic review settings January 12, 2026 18:31
Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 6 out of 6 changed files in this pull request and generated no new comments.

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Copilot AI review requested due to automatic review settings January 12, 2026 19:32
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@MarkusNeusinger MarkusNeusinger merged commit 155f2a1 into main Jan 12, 2026
6 checks passed
@MarkusNeusinger MarkusNeusinger deleted the claude/fix-debug-tracking-robots-LQy6E branch January 12, 2026 19:36
Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 6 out of 6 changed files in this pull request and generated 1 comment.

"""Tests for SEO router."""

def test_robots_txt(self, client: TestClient) -> None:
"""robots.txt should block crawlers from all routes."""

Copilot AI Jan 12, 2026


The test docstring claims "robots.txt should block crawlers from all routes" but this test is only validating the backend API endpoint behavior (which does block all routes with "Disallow: /"). However, the PR title and description state the purpose is to "block /debug route from search engine indexing", which refers to the frontend robots.txt file that only blocks /debug.

This test doesn't validate the frontend robots.txt file at all (which is a static file at app/public/robots.txt). Consider either:

  1. Updating the docstring to clarify this tests the backend API robots.txt endpoint specifically, or
  2. Adding a separate test for the frontend robots.txt file if that's feasible in your test setup.
Suggested change:
- """robots.txt should block crawlers from all routes."""
+ """Backend /robots.txt endpoint should block crawlers from all routes."""


3 participants