Allow, at least, the following to be configured per-archive:

- additional URLs to disallow
- crawl delay
Our repository has some additional URLs that shouldn't be harvested by robots, and at the moment the only way to update robots.txt is to replace the dynamically generated one with a static file in the static folder.
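For illustration, here is a rough sketch of the kind of per-archive robots.txt we would like to be able to produce; the disallowed paths and the crawl-delay value are purely hypothetical examples, not settings from our repository:

```
# Hypothetical per-archive robots.txt with extra disallowed URLs and a crawl delay
User-agent: *
Disallow: /cgi/search      # example: keep robots out of search result pages
Disallow: /private/        # example: an archive-specific restricted area
Crawl-delay: 10            # example: ask crawlers to wait 10 seconds between requests
```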