



A simple gem to parse X-Robots-Tag HTTP headers according to Google's X-Robots-Tag HTTP header specification.



Installation

Add this line to your application's Gemfile:

gem 'robots_tag_parser', git: ''

And then execute:

$ bundle


Usage

Basic examples

Get rules applying to all user agents:

headers = { 'X-Robots-Tag' => ['noindex,noarchive', 'googlebot: nofollow'] }

RobotsTagParser.get_rules(headers: headers)
=> ['noindex', 'noarchive']

Get rules applying to a specific user agent (the generic rules are included):

headers = { 'X-Robots-Tag' => ['noindex,noarchive', 'googlebot: nofollow'] }

RobotsTagParser.get_rules(headers: headers, user_agent: 'googlebot')
=> ['noindex', 'noarchive', 'nofollow']
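The merging behaviour shown above can be sketched as plain Ruby. This is a hypothetical, self-contained illustration of the logic the examples imply, not the gem's actual internals; the method name `robots_tag_rules` is invented for this sketch.

```ruby
# Illustrative sketch only: merge generic X-Robots-Tag rules with rules
# scoped to a specific user agent. The real gem's implementation may differ.
def robots_tag_rules(headers, user_agent: nil)
  Array(headers['X-Robots-Tag']).flat_map do |value|
    # A header value may be scoped to one user agent, e.g. "googlebot: nofollow".
    # "unavailable_after: <date>" also contains a colon but is a directive,
    # not an agent scope, so it is excluded from the match.
    if value =~ /\A([a-z0-9_-]+):\s*(.+)\z/i && $1.downcase != 'unavailable_after'
      agent, rules = $1.downcase, $2
      # Scoped rules only apply when the requested user agent matches.
      next [] unless user_agent && agent == user_agent.downcase
      rules.split(',').map(&:strip)
    else
      # Unscoped rules apply to all user agents.
      value.split(',').map(&:strip)
    end
  end
end

headers = { 'X-Robots-Tag' => ['noindex,noarchive', 'googlebot: nofollow'] }
robots_tag_rules(headers)                           # ['noindex', 'noarchive']
robots_tag_rules(headers, user_agent: 'googlebot')  # ['noindex', 'noarchive', 'nofollow']
```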


Development

After checking out the repo, run bin/setup to install dependencies. Then run rake spec to run the tests. You can also run bin/console for an interactive prompt that lets you experiment.

To install this gem onto your local machine, run bundle exec rake install. To release a new version, update the version number in version.rb and then run bundle exec rake release, which will create a git tag for the version, push git commits and tags, and push the .gem file to rubygems.org.


Contributing

Bug reports and pull requests are welcome on GitHub.


License

The gem is available as open source under the terms of the CC0 License.


Supported rules

  • all - There are no restrictions for indexing or serving.
  • none - Equivalent to noindex and nofollow.
  • noindex - Do not show this page in search results and do not show a "Cached" link in search results.
  • nofollow - Do not follow the links on this page.
  • noarchive - Do not show a "Cached" link in search results.
  • nosnippet - Do not show a snippet in the search results for this page.
  • notranslate - Do not offer translation of this page in search results.
  • noimageindex - Do not index images on this page.
  • unavailable_after - Do not show this page in search results after the specified date/time.
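Unlike the other rules, unavailable_after carries a date/time payload. A minimal sketch of how a consumer might evaluate that directive, assuming a parseable date format; the helper name `still_indexable?` is invented for this illustration and is not part of the gem's API.

```ruby
require 'time'

# Hypothetical helper: given an "unavailable_after: <date>" directive,
# decide whether the page should still appear in results at a given moment.
def still_indexable?(directive, now: Time.now)
  # Strip the directive name, leaving only the date/time payload.
  date = directive.sub(/\Aunavailable_after:\s*/i, '')
  now < Time.parse(date)
end

still_indexable?('unavailable_after: 25 Jun 2045 15:00:00 GMT',
                 now: Time.utc(2025, 1, 1))  # true: the cutoff is in the future
```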