The robots.txt file.

Search engines, when crawling sites to show in search results, will first
request the file /robots.txt. If this file is found and contains lines that
apply to them, they will do as instructed. A very basic robots.txt follows as
an example:

# go away
User-agent: *
Disallow: /

This tells every search engine that honours the file (User-agent: *) not to
index the site (Disallow everything under /).
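A quick way to check how a compliant crawler will read these rules is Python's
standard-library robots.txt parser. This is only an illustrative sketch; the
OPAC URL in it is a hypothetical example path, not taken from your install:

```python
from urllib.robotparser import RobotFileParser

# Parse the example robots.txt shown above.
rp = RobotFileParser()
rp.parse([
    "# go away",
    "User-agent: *",
    "Disallow: /",
])

# With "Disallow: /" applied to every user agent, no path may be fetched.
print(rp.can_fetch("Googlebot", "/cgi-bin/koha/opac-detail.pl?biblionumber=1"))  # False
print(rp.can_fetch("*", "/"))  # False
```

Both checks print False, confirming that every path on the site is off-limits
to every crawler that respects robots.txt.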

If you have installed Koha to /usr/local/koha3, then this file should be placed
in the directory /usr/local/koha3/opac/htdocs/. This prevents search engines
from periodically crawling every biblio record, and every view of each record,
on your Koha install.