webrobots

This is a Ruby library to help write robots.txt-compliant web robots.

Usage

require 'webrobots'
require 'uri'
require 'net/http'

# Create a parser that identifies itself with your robot's User-Agent.
robots = WebRobots.new('MyBot/1.0')

uri = URI('http://digg.com/news/24hr')
# robots.txt for the site is fetched and parsed on first use, then cached.
if robots.disallowed?(uri)
  STDERR.puts "Access disallowed: #{uri}"
  exit 1
end
body = Net::HTTP.get(uri)
# ...
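
Beyond the allow/disallow check, robots.txt can also carry Crawl-delay and Sitemap directives. Here is a minimal sketch of honoring them, assuming the gem's crawl_delay and sitemaps accessors; the URL below is illustrative, not part of the original example:

require 'webrobots'

robots = WebRobots.new('MyBot/1.0')
url = 'http://example.com/'  # illustrative URL (assumption)

# Sleep for the site's Crawl-delay, if robots.txt declares one.
delay = robots.crawl_delay(url)
sleep(delay) if delay && delay > 0

# Print any Sitemap URLs declared in robots.txt.
robots.sitemaps(url).each do |sitemap|
  puts "Sitemap: #{sitemap}"
end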

Requirements

  • Ruby 1.8.7 or 1.9.2+

Contributing to webrobots

  • Check out the latest master to make sure the feature hasn’t been implemented or the bug hasn’t been fixed yet

  • Check out the issue tracker to make sure someone hasn’t already requested it and/or contributed it

  • Fork the project

  • Start a feature/bugfix branch

  • Commit and push until you are happy with your contribution

  • Make sure to add tests for it. This is important so I don’t break it in a future version unintentionally.

  • Please try not to mess with the Rakefile, version, or history. If you want to have your own version, or if such a change is otherwise necessary, that is fine, but please isolate it in its own commit so I can cherry-pick around it.

Copyright © 2010-2016 Akinori MUSHA. See LICENSE.txt for further details.
