__NAME__ purpose
specify user agents that will NOT be classified as crawler robots (search engines)
__NAME__ see also
RobotUA, RobotIP
__NAME__ synopsis
<arg choice='plain' rep='repeat'><replaceable>useragent_string</replaceable></arg>
__NAME__ description
The &conf-NotRobotUA; directive defines a list of user agent strings; a client
whose user agent matches one of them will
<emphasis role='bold'>never</emphasis> be classified as a crawler robot
(search engine) visiting the site.
This directive has priority over &conf-RobotUA;.
If the user agent matches &conf-NotRobotUA;, then the check for
&conf-RobotUA; is not performed and the client is not treated
as an unattended &glos-robot;.
__NAME__ notes
For more details regarding web spiders/bots and &IC;, see the
&glos-robot; glossary entry.
For more details regarding user sessions, see the &glos-session;
glossary entry.
__NAME__ example: Defining __FILENAME__
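A minimal configuration sketch, assuming &conf-NotRobotUA; accepts the same
comma-separated wildcard list syntax as &conf-RobotUA;; the user agent
patterns shown are illustrative only:

<programlisting>
# Clients matching these patterns are never treated as robots,
# even if they would also match a RobotUA pattern.
NotRobotUA  Lynx*, Mozilla*
</programlisting>

With this in place, a browser identifying itself as, for example,
<literal>Lynx/2.8</literal> skips the &conf-RobotUA; check entirely and
receives a normal user &glos-session;.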