
[14.0][ADD] website_no_crawler_environment: initial commit #891

Conversation

AshishHirapara

By default, /robots.txt on an Odoo installation with the website module installed allows indexing by web crawlers.
This module overrides the view that generates /robots.txt and disallows indexing based on the server environment.

<xpath expr="t" position="replace">
    <t t-translation="off">User-agent: *
<t t-if="server_env in ('preprod', 'stage')">Disallow: /</t>
    </t>
</xpath>
Contributor


This way you're limiting users to only the preprod and stage environments. Better to generalize it, with test as the default.
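The suggested generalization can be sketched in plain Python (the `server_env` name and the environment sets below are illustrative assumptions, not the module's actual configuration): disallow crawling unless the environment is explicitly production, so any new or unset environment is safe by default.

```python
# Sketch of the reviewer's suggestion: block crawlers in every
# environment except those explicitly known to be production.
# The whitelist and default value are assumptions for illustration.
PRODUCTION_ENVS = {"prod", "production"}

def robots_txt(server_env: str = "test") -> str:
    """Render robots.txt content based on the server environment.

    Defaults to "test" so an unconfigured environment is never indexed.
    """
    lines = ["User-agent: *"]
    if server_env not in PRODUCTION_ENVS:
        lines.append("Disallow: /")
    return "\n".join(lines)
```

With this shape, adding a new non-production environment requires no template change at all; only production names ever need listing.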

@GSLabIt
Contributor

GSLabIt commented May 16, 2022

@AshishHirapara Better to move to https://github.com/OCA/server-env repo

Comment on lines +8 to +10
Sitemap:
<t t-esc="url_root" />
sitemap.xml
Member


where is sitemap.xml?

What are the advantages of this module over using X-Robots-Tag?

Author


@Tardo I think the main advantage of this module is that users don't need knowledge of developer-specific mechanisms like the X-Robots-Tag header; many users don't even know how to set X-Robots-Tag within Odoo, so the module will be useful for them.
Also, this module overrides the view that generates /robots.txt and disallows indexing based on the server environment, so it is easy to use when users run multiple environments and only want to restrict some of them from crawlers. That is the main purpose of this module.

And if you think this can be done in a better way with X-Robots-Tag, then I am open to your suggestions. Let me know.
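For comparison, the X-Robots-Tag alternative mentioned above works at the HTTP response layer rather than through robots.txt. A minimal, framework-agnostic sketch (the helper name and the environment set are hypothetical, not Odoo API):

```python
def apply_x_robots_tag(headers: list, server_env: str) -> list:
    """Append an X-Robots-Tag header on non-production environments.

    Unlike a robots.txt Disallow (which only discourages crawling),
    X-Robots-Tag tells search engines not to index a page they have
    already fetched. The environment names here are assumptions.
    """
    if server_env not in {"prod", "production"}:
        headers.append(("X-Robots-Tag", "noindex, nofollow"))
    return headers
```

In a real deployment this header is typically added by the reverse proxy (e.g. nginx) or by a response hook, which is why it is often considered the more robust option for keeping staging sites out of search indexes.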
Thanks.

@github-actions

There hasn't been any activity on this pull request in the past 4 months, so it has been marked as stale and it will be closed automatically if no further activity occurs in the next 30 days.
If you want this PR to never become stale, please ask a PSC member to apply the "no stale" label.

@github-actions github-actions bot added the stale PR/Issue without recent activity, it'll be soon closed automatically. label Mar 26, 2023
@github-actions github-actions bot closed this Apr 30, 2023