
Adding a robots.txt to stop search engines from crawling yui.github.com/yui2/ #1

Closed
wants to merge 2 commits

2 participants

@triptych

We don't want search engines to crawl outdated docs. After we turn off docs on YDN we want the YUI3 docs to rise to the top of the results. This robots.txt file helps prevent old YUI 2.x docs from showing up in search results.
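For illustration, a minimal sketch of what such a rule could look like. This is not the file from this PR; the path and scope are assumptions. Note that crawlers only honor robots.txt at the domain root, so a rule covering yui.github.io/yui2/ would need to live in the root site's robots.txt:

```
# Hypothetical robots.txt — a sketch, not the actual file added in this PR.
# Must be served from the domain root (e.g. yui.github.io/robots.txt) to
# cover the /yui2/ project page; a robots.txt inside /yui2/ is ignored.
User-agent: *
Disallow: /yui2/
```

Disallowing a path asks well-behaved crawlers not to fetch it, which over time drops those pages from search results.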

@triptych

/cc @ericf Need this to prevent yui.github.io/yui2 from being crawled by search engines

@ericf
YUI Library member
ericf commented Jul 8, 2013

I don't think this is a good idea. If people are still using YUI 2, then we want them to be able to find the documentation. I'd prefer that we solve this problem by placing a banner across the top of all the archived docs HTML pages.
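A banner like the one suggested could look something like the following. This is a hypothetical sketch, not the actual markup from the archived-docs work; the wording, styling, and link target are assumptions:

```html
<!-- Hypothetical deprecation banner, sketched from the suggestion above;
     not the markup that shipped with the archived docs. -->
<div style="background:#fcf8e3; border-bottom:1px solid #faebcc;
            padding:10px; text-align:center;">
  You are viewing archived YUI 2 documentation. YUI 2 is no longer
  maintained; see the <a href="https://yuilibrary.com/">current YUI
  documentation</a> instead.
</div>
```

This keeps the old docs findable for existing YUI 2 users while steering readers toward the current docs.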

@triptych
triptych commented Jul 8, 2013

Ok, btw the banner across the top of the archived docs will be ready to go soon. yui/yui2#14

@triptych triptych closed this Jul 8, 2013