webrobots

This is a library to help write robots.txt compliant web robots.

Usage

require 'webrobots'
require 'uri'
require 'net/http'

robots = WebRobots.new('MyBot/1.0')

uri = URI('http://digg.com/news/24hr')
if robots.disallowed?(uri)
  STDERR.puts "Access disallowed: #{uri}"
  exit 1
end
body = Net::HTTP.get(uri)
# ...
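
A single WebRobots instance can be reused across many URLs. The sketch below extends the example above into a small fetch loop that also consults crawl_delay, which the gem documents for reading a site's Crawl-delay directive; the example.com URLs are placeholders, and you should verify the exact method behavior against the rdoc of your installed version.

require 'webrobots'
require 'uri'
require 'net/http'

robots = WebRobots.new('MyBot/1.0')

urls = %w[
  http://example.com/
  http://example.com/private/page
]

urls.each do |url|
  uri = URI(url)

  # Skip URLs that the site's robots.txt disallows for this user agent.
  next unless robots.allowed?(uri)

  # Honor a Crawl-delay directive if the site declares one.
  # (Assumed from the gem's documented API; a nil or zero value means no delay.)
  delay = robots.crawl_delay(uri)
  sleep delay if delay && delay > 0

  body = Net::HTTP.get(uri)
  # ... process body ...
end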

Requirements

  • Ruby 1.8.7 or 1.9.2+

Contributing to webrobots

  • Check out the latest master to make sure the feature hasn't been implemented or the bug hasn't been fixed yet

  • Check out the issue tracker to make sure someone hasn't already requested and/or contributed it

  • Fork the project

  • Start a feature/bugfix branch

  • Commit and push until you are happy with your contribution

  • Make sure to add tests for it. This is important so I don't break it in a future version unintentionally.

  • Please try not to mess with the Rakefile, version, or history. If you want to have your own version, or it is otherwise necessary, that is fine, but please isolate the change to its own commit so I can cherry-pick around it.

Copyright

Copyright © 2010, 2011, 2012, 2013 Akinori MUSHA. See LICENSE.txt for further details.
