Quick feature request. I like the fact that bwp-google-xml-sitemaps adds multisite sitemaps to robots.txt when no physical file is present, but at the same time I need to throttle Bing.
A simple text box for a snippet to append to all virtual robots.txt files would go a long way.
OK - in my case I have about 200 sites running in a subdomain multisite setup. I rely on the inclusion of sitemapindex.xml in the robots.txt of the "base" site (no subdomain). One problem I constantly face is BingBot running amok: at regular intervals, bingbot decides to send 5-10 requests per second to every single domain, which in turn makes my server grind to a halt. There are two solutions to that problem. The first is blocking bingbot, at least temporarily, but that is obviously not attractive. The better solution is to include this in every generated robots.txt:
User-agent: bingbot
Crawl-delay: 5
I can do that by saving the auto-generated robots.txt and editing it manually. But that means I have to remember to do it each time a new site is added, and that happens almost daily.
So the best solution would be some way to define extra lines that get appended to the generated robots.txt.
One way to do this would be a text box in the plugin's configuration. Alternatively, the ability to define a file path whose contents are appended to robots.txt, or a hardcoded filename that is always appended IF it exists (/path/to/site/root/robots.include).
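Until something like that lands in the plugin, a workaround is possible with WordPress core alone: the virtual robots.txt is produced by `do_robots()`, which runs its output through the `robots_txt` filter. A minimal sketch (not part of bwp-google-xml-sitemaps; dropped into an mu-plugin so it applies to every site in the network):

```php
<?php
/**
 * Append extra directives to the virtual robots.txt on every site.
 * Sketch only - assumes WordPress core's 'robots_txt' filter, which
 * do_robots() applies to the generated output.
 */
add_filter( 'robots_txt', function ( $output, $public ) {
    // Throttle bingbot network-wide; Crawl-delay is honored by Bing.
    $output .= "\nUser-agent: bingbot\nCrawl-delay: 5\n";
    return $output;
}, 10, 2 );
```

Because mu-plugins load on every site automatically, this covers new sub-sites as they are created, with no per-site editing.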