The robots.txt file

Search engines, when looking for sites to show in search results, will first look for the file /robots.txt. If this file is found and contains lines that apply to them, they will do as instructed. A very basic robots.txt follows as an example:

    # go away
    User-agent: *
    Disallow: /

This tells every search engine that cares (User-agent: *) not to index the site (Disallow everything from / downward).

If you have installed Koha to /usr/local/koha3, then this file would be placed in the directory /usr/local/koha3/opac/htdocs/. This should prevent search engines from periodically crawling every biblio record, and every view of each record, on your Koha install.
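The placement described above can be sketched as a short shell snippet. This is a hedged example, not an official Koha script: the OPAC_DOCROOT variable is an assumption standing in for your OPAC document root (the text uses /usr/local/koha3/opac/htdocs/), and the snippet defaults to a temporary directory so it can be run safely for demonstration.

```shell
# Sketch: install a blanket-deny robots.txt into the OPAC document root.
# OPAC_DOCROOT is a placeholder; point it at your real htdocs directory,
# e.g. OPAC_DOCROOT=/usr/local/koha3/opac/htdocs (may require root).
OPAC_DOCROOT="${OPAC_DOCROOT:-$(mktemp -d)/opac/htdocs}"
mkdir -p "$OPAC_DOCROOT"

# Write the minimal robots.txt from the example above.
cat > "$OPAC_DOCROOT/robots.txt" <<'EOF'
# go away
User-agent: *
Disallow: /
EOF

echo "robots.txt written to $OPAC_DOCROOT/robots.txt"
```

After writing the file, you can confirm it is being served by requesting http://your-opac-host/robots.txt in a browser; well-behaved crawlers will fetch that same URL before indexing.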