Roboto

Roboto is a Rails Engine that gives you the ability to specify environment-specific robots.txt files in your Rails 4.2+ application.

Don't let crawlers access your staging environment; having non-production content indexed is bad for SEO.

Installing

You can add it to your Gemfile with:

gem 'roboto'
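
and then install it with Bundler:

#> bundle install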

Next, you need to run the generator:

#> rails generate roboto:install
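
The generator sets Roboto up to serve the file matching the current Rails environment, following a config/robots/&lt;environment&gt;.txt naming convention. For a typical app with a staging environment, the layout might look something like this (the exact set of files depends on your app):

  config/robots/
  ├── production.txt
  └── staging.txt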

If you already have a robots.txt, it will be kept for your production environment as config/robots/production.txt.

You can now specify environment-specific robots.txt files in config/robots/. By default, a robots.txt that disallows crawlers from accessing your site is provided for all of your environments.
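
For reference, the standard disallow-all robots.txt (presumably what the default non-production files contain) looks like this:

  User-agent: *
  Disallow: /

while a production file that allows crawling can leave Disallow empty:

  User-agent: *
  Disallow: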

Contributing

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Added some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create a new Pull Request