robots.txt edits #3884
Conversation
Current coverage is 85.83%

@@           master   #3884   diff @@
=====================================
  Files         144     144
  Lines        8566    8566
  Methods         0       0
  Messages        0       0
  Branches     1136    1136
=====================================
  Hits         7353    7353
  Misses        977     977
  Partials      236     236
Disallow: /*docs*$vote
Disallow: /*docs.json
Disallow: /*preview-wiki-content
Disallow: /*docs/ckeditor_config.js
Disallow: /*feed*
This rule needs to be more precise, as it will also block articles like https://developer.mozilla.org/en-US/Firefox/Releases/2/Adding_feed_readers_to_Firefox
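For illustration (the feed endpoint path below is hypothetical), a plain substring wildcard like this catches both kinds of URL:

```
# intended: feed endpoints, e.g. /en-US/docs/feeds/...        (hypothetical path)
# also blocked: /en-US/Firefox/Releases/2/Adding_feed_readers_to_Firefox
Disallow: /*feed*
```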
good 👀
Thanks for putting it in alphabetical order! A few rules need to be made more precise.
Disallow: /*search*
Disallow: /skins
Disallow: /*type=feed
Disallow: /*users*signin
Need to block everything under users according to the spreadsheet.
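As a sketch (the exact rule should come from the spreadsheet), blocking everything under users would mean widening the pattern rather than targeting one endpoint:

```
# narrow: only blocks sign-in URLs
Disallow: /*users*signin
# broad: blocks every URL containing "users"
Disallow: /*users*
```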
@stephaniehobson thanks for the review; should be ready for re-review
This works well for our regular documents and is an improvement over what we have, so I'm going to merge. 👍 When I compared some of the test URLs to the file, though, I discovered that all the …

@dchukhin can you do a bit of research to see if the end-of-URL matching syntax I saw mentioned on one blog is well supported and, if so, submit another PR using it? Example: …

Thanks.
I have looked around, and based on here and here it seems that some crawlers (Google, and a few other major ones) support using a $ to signify the end of a URL.
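As a rough sketch of those Google-style matching semantics (with `*` as a wildcard and a trailing `$` anchoring the end of the URL), assuming a simplified regex translation rather than any real crawler's implementation:

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Approximate Google-style robots.txt matching: '*' matches any
    run of characters and a trailing '$' anchors the pattern to the
    end of the URL path. A sketch, not a full implementation."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        # turn the escaped trailing '$' back into a regex end anchor
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

# The broad rule also matches an ordinary article whose slug contains "feed":
print(rule_matches("/*feed*", "/en-US/Firefox/Releases/2/Adding_feed_readers_to_Firefox"))  # True
# An end-anchored variant only matches URLs that actually end in "feed":
print(rule_matches("/*feed$", "/en-US/Firefox/Releases/2/Adding_feed_readers_to_Firefox"))  # False
```

This shows why the end anchor helps: it lets a rule target feed endpoints without sweeping up article slugs that merely contain the word.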
@dchukhin Thanks for doing that research. Google is what we're most concerned about. I've done a little more thinking on this, and I think we can block zones by making the current rules less specific. Could you please change the rules looking for …?
https://tree.taiga.io/project/viya-mdn-durable-team/task/59
This pull request follows up on the request made in https://docs.google.com/spreadsheets/d/1X-YLmIg8vVvDWlShLF-361Cz2KcUBnlaPXBHLMaYEaQ/edit#gid=0 to add several more lines to robots.txt.
@stephaniehobson