Commits on Oct 3, 2012
  1. Bump all packages that use perl, or depend on a p5-* package, or are called p5-*.

    wiz authored
    I hope that's all of them.
Commits on Apr 22, 2012
  1. Update to 6.02:

    wiz authored
    2012-02-18 WWW-RobotRules 6.02
    
    Restore perl-5.8.1 compatibility.
Commits on Aug 12, 2011
  1. No longer CONFLICTS with p5-libwww>=6.x.

    obache authored
    Bump PKGREVISION.
Commits on Jul 14, 2011
  1. add CONFLICTS (thanks wiz@ for the reminder)

    spz authored
    add a missing LICENSE and fix a few COMMENT
Commits on Jul 10, 2011
  1. The Perl 5 module WWW::RobotRules parses /robots.txt files as specified in "A Standard for Robot Exclusion", at http://www.robotstxt.org/wc/norobots.html

    spz authored
    Webmasters can use the /robots.txt file to forbid conforming robots
    from accessing parts of their web site.
    
    The parsed files are kept in a WWW::RobotRules object, and this object
    provides methods to check if access to a given URL is prohibited.
    The same WWW::RobotRules object can be used for one or more parsed
    /robots.txt files on any number of hosts.
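    The workflow described here (parse a /robots.txt file, then ask whether a given URL may be fetched) is the robot-exclusion standard itself, not anything specific to Perl. As an illustration of the protocol, here is a minimal sketch using Python's standard-library urllib.robotparser, which implements the same standard; the user-agent string and rules are made up for the example and this is not the WWW::RobotRules API:

    ```python
    from urllib.robotparser import RobotFileParser

    # Parse a robots.txt body directly (normally fetched from a host).
    # The rules below are a hypothetical example.
    rp = RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /private/",
    ])

    # Query whether a conforming robot may access each URL.
    print(rp.can_fetch("ExampleBot/1.0", "http://example.com/public/page.html"))   # allowed
    print(rp.can_fetch("ExampleBot/1.0", "http://example.com/private/page.html"))  # disallowed
    ```

    As with a WWW::RobotRules object, one parser instance holds the parsed rules and answers repeated allow/deny queries against them.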