
feat(seo): add production robots.txt with AI crawler visibility#144

Merged
marcel-veselka merged 3 commits into main from fix/82-robots-txt-ai-crawlers
Mar 17, 2026
Conversation

@marcel-veselka
Member

Summary

  • Replace staging Disallow: / robots.txt with full production version allowing all crawlers
  • Add explicit Allow: / rules for major AI crawlers: GPTBot, ClaudeBot, Google-Extended, PerplexityBot, Applebot-Extended, Amazonbot, YouBot, cohere-ai
  • Include Sitemap: https://wopee.io/sitemap.xml pointer
  • Fix deploy.yml: swap domain in robots.txt instead of deleting it during website-prod deploy
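The resulting production robots.txt might look roughly like the following sketch. The crawler names and the sitemap URL come from the summary above; the exact grouping and ordering of records are illustrative:

```txt
# Default: allow all crawlers
User-agent: *
Allow: /

# Explicit allow rules for AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Applebot-Extended
Allow: /

User-agent: Amazonbot
Allow: /

User-agent: YouBot
Allow: /

User-agent: cohere-ai
Allow: /

Sitemap: https://wopee.io/sitemap.xml
```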

Why

Closes part of #82 (AI search setup). The old robots.txt blocked all crawlers (staging default). AI crawlers need explicit allow rules to index the site in AI-powered search engines (ChatGPT, Perplexity, Claude, etc.).

Test plan

  • Check https://wopee.io/robots.txt after deploy — should show Allow rules, not Disallow: /
  • Verify Sitemap line points to https://wopee.io/sitemap.xml

- Replace staging Disallow: / with full production robots.txt allowing all AI crawlers
- Add explicit rules for GPTBot, ClaudeBot, Google-Extended, PerplexityBot, Applebot-Extended, Amazonbot, YouBot, cohere-ai
- Include Sitemap pointer to https://wopee.io/sitemap.xml
- Fix deploy.yml: swap domain in robots.txt instead of deleting it (website-prod step)
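The deploy.yml fix swaps the domain in robots.txt rather than deleting the file during the website-prod deploy. A minimal sketch of such a step, assuming a hypothetical staging domain `staging.wopee.io` (the real staging domain and file layout may differ):

```shell
#!/bin/sh
set -eu

# Illustrative starting state: robots.txt as produced for the staging build.
# "staging.wopee.io" is an assumed staging domain, not taken from the PR.
printf 'Sitemap: https://staging.wopee.io/sitemap.xml\n' > robots.txt

# Swap the staging domain for the production one instead of removing the file.
# Portable sed usage (no GNU-only -i flag).
sed 's|staging\.wopee\.io|wopee.io|g' robots.txt > robots.txt.tmp
mv robots.txt.tmp robots.txt

cat robots.txt
```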
@marcel-veselka marcel-veselka merged commit 4b155fa into main Mar 17, 2026
1 check passed
@marcel-veselka marcel-veselka deleted the fix/82-robots-txt-ai-crawlers branch March 17, 2026 20:27
