
Robots.txt auto generation on install #1222

Closed
macik opened this issue Jul 2, 2013 · 9 comments

@macik
Member

macik commented Jul 2, 2013

Since the Sitemap directive was added to robots.txt (see 7523f28), we have the ability to auto-create robots.txt with the actual site URL during install (if write permission is granted). That way no additional hand-made changes to robots.txt are required (except uncommenting the Sitemap line).

@seditio
Member

seditio commented Jul 2, 2013

Good point. But a more important thing is the Host directive.

@Dayver
Member

Dayver commented Jul 2, 2013

But it's only supported by Yandex.

@macik
Member Author

macik commented Jul 2, 2013

Yes, it's a Yandex-specific feature, not supported by other search engines.

@macik
Member Author

macik commented Jul 2, 2013

But we could use some variant like this:

# Default Cotonti exclusions
User-agent: *
Disallow: /datas
Disallow: /images
Disallow: /js
Disallow: /lang
Disallow: /lib
Disallow: /themes
Disallow: /system

# Yandex related directives
User-agent: Yandex
Disallow: /datas
Disallow: /images
Disallow: /js
Disallow: /lang
Disallow: /lib
Disallow: /themes
Disallow: /system
Host: your-domain.com

# For sitemap.xml autodiscovery. Uncomment if you have one:
# Sitemap: http://your-domain.com/sitemap.xml

@trustmaster
Member

This requires setting proper CHMOD permissions on the robots.txt file, which is undesirable for two reasons: it adds one more step to the installation process, and it isn't good for security.

@macik
Member Author

macik commented Oct 6, 2013

No special actions needed. Just one line of code helps those people who have a writable root directory by default (e.g. on shared hosting it's a common practice):
@file_put_contents('./robots.txt', '# Sitemap: ' . $cfg['mainurl'] . "\n", FILE_APPEND);

@Kilandor
Member

Given the limited scope of things that need to modify robots.txt, I would instead propose doing it simply on installing the sitemap plugin. The attempt to modify robots.txt can be made during the plugin's installation, with the plugin handling it itself. It could also check the file and warn about the modification if need be.
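A minimal sketch of that approach, assuming a hypothetical `cot_robots_sitemap_line()` helper and Cotonti's `$cfg['mainurl']` setting; the surrounding plumbing is illustrative, not the actual plugin code:

```php
<?php
// Hypothetical sketch of what the sitemap plugin's install handler could do.
// The function name and warning message are assumptions for illustration,
// not actual Cotonti API.

$cfg = array('mainurl' => 'http://example.com');  // placeholder for the real config

// Build the Sitemap directive for robots.txt from the configured site URL
function cot_robots_sitemap_line($mainurl)
{
    return 'Sitemap: ' . rtrim($mainurl, '/') . "/sitemap.xml\n";
}

$robots = './robots.txt';

if (is_writable($robots) || (!file_exists($robots) && is_writable('.')))
{
    // Append (or create) rather than overwrite any hand-edited rules
    file_put_contents($robots, cot_robots_sitemap_line($cfg['mainurl']), FILE_APPEND);
}
else
{
    // Check and warn instead of failing silently, as suggested above
    echo "Warning: robots.txt is not writable; please add the Sitemap line manually.\n";
}
```

Appending with `FILE_APPEND` rather than rewriting the file keeps any rules the admin has already added by hand.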

@macik
Member Author

macik commented Jan 12, 2014

Good point, thanks.

Alex300 added a commit that referenced this issue Jan 6, 2015
@Alex300
Member

Alex300 commented Jan 6, 2015

Added automatic creation of the Sitemap and Host directives.

@Alex300 Alex300 closed this as completed Jan 6, 2015
6 participants