# django-simple-robots


Most web applications shouldn't be indexed by Google. This app just provides a view that serves a "deny all" robots.txt.
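A "deny all" robots.txt is conventionally just the following (the exact file shipped by the app may differ slightly):

```
User-agent: *
Disallow: /
```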

In some cases, you do want your app to be indexed, but only in your production environment (not in any staging environments). For this case, you can set `ROBOTS_ALLOW_HOST`. If the incoming hostname matches this setting, an "allow all" robots.txt will be served. Otherwise, the "deny all" version will be served.
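Conceptually, the behaviour is something like the following sketch (this is an illustration of the host check, not the library's actual implementation):

```python
# Illustrative sketch only -- not the code shipped by django-simple-robots.
from django.conf import settings
from django.http import HttpResponse

ALLOW_ALL = "User-agent: *\nDisallow:\n"
DENY_ALL = "User-agent: *\nDisallow: /\n"

def robots_txt(request):
    # Serve "allow all" only when the request's host matches ROBOTS_ALLOW_HOST.
    allow_host = getattr(settings, "ROBOTS_ALLOW_HOST", None)
    if allow_host and request.get_host() == allow_host:
        body = ALLOW_ALL
    else:
        body = DENY_ALL
    return HttpResponse(body, content_type="text/plain")
```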

Tested against Django 1.8, 1.9, 1.10, and 1.11 on Python 2.7, 3.4, and 3.6.

## Installation

Install from PyPI:

```
pip install django-simple-robots
```

In your root urlconf, add an entry as follows:

```python
from django.conf.urls import url
from simple_robots.views import serve_robots

urlpatterns = [
    url(r'^robots\.txt$', serve_robots),
    # ..... other stuff
]
```

Optionally, set `ROBOTS_ALLOW_HOST` in your `settings.py`:

```python
ROBOTS_ALLOW_HOST = "myproductionurl.com"
```

That's it!
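If you want to sanity-check the behaviour, a small test along these lines should work. The hostnames and URL path are the placeholders from the examples above, and it assumes the "deny all" file uses the standard `Disallow: /` directive:

```python
from django.test import TestCase, override_settings

@override_settings(ROBOTS_ALLOW_HOST="myproductionurl.com", ALLOWED_HOSTS=["*"])
class RobotsTxtTests(TestCase):
    def test_production_host_is_allowed(self):
        # Matching host: should get the "allow all" robots.txt.
        response = self.client.get("/robots.txt", HTTP_HOST="myproductionurl.com")
        self.assertNotIn(b"Disallow: /", response.content)

    def test_other_hosts_are_denied(self):
        # Any other host: should get the "deny all" robots.txt.
        response = self.client.get("/robots.txt", HTTP_HOST="staging.example.com")
        self.assertIn(b"Disallow: /", response.content)
```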

## Code of conduct

For guidelines regarding the code of conduct when contributing to this repository, please review https://www.dabapps.com/open-source/code-of-conduct/