webrobots

This is a library to help write robots.txt compliant web robots.

Usage

require 'webrobots'
require 'uri'
require 'net/http'

robots = WebRobots.new('MyBot/1.0')

uri = URI('http://digg.com/news/24hr')
if robots.disallowed?(uri)
  STDERR.puts "Access disallowed: #{uri}"
  exit 1
end
body = Net::HTTP.get(uri)
# ...
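
WebRobots provides a few query methods beyond disallowed?. Below is a minimal sketch of how a polite crawler might combine them; the method names allowed?, crawl_delay, and sitemaps are assumed to be available in the gem's current API (verify against your installed version), and the example.org URLs are placeholders.

require 'webrobots'
require 'uri'
require 'net/http'

robots = WebRobots.new('MyBot/1.0')

uri = URI('http://www.example.org/some/page')

# allowed? is the positive counterpart of disallowed?
if robots.allowed?(uri)
  # Honor a Crawl-delay directive for this site, if one is given.
  delay = robots.crawl_delay(uri)
  sleep delay if delay && delay > 0
  body = Net::HTTP.get(uri)
  # ...
end

# Sitemap URLs declared in the site's robots.txt, if any.
robots.sitemaps('http://www.example.org/').each do |sitemap|
  puts sitemap
end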

Requirements

  • Ruby 1.8.7 or 1.9.2+

Contributing to webrobots

  • Check out the latest master to make sure the feature hasn't been implemented or the bug hasn't been fixed yet

  • Check out the issue tracker to make sure someone hasn't already requested and/or contributed it

  • Fork the project

  • Start a feature/bugfix branch

  • Commit and push until you are happy with your contribution

  • Make sure to add tests for it. This is important so I don't break it in a future version unintentionally.

  • Please try not to mess with the Rakefile, version, or history. If you want your own version, or it is otherwise necessary, that is fine, but please isolate the change to its own commit so I can cherry-pick around it.

Copyright

Copyright © 2010-2016 Akinori MUSHA. See LICENSE.txt for further details.