Merge pull request #7909 from dasdachs/add-robots-txt

Removed robots.txt and added an extension to layout.html

bsipocz committed Oct 17, 2018
1 parent d576e39 commit 71bfbaf
Showing 3 changed files with 29 additions and 11 deletions.
25 changes: 25 additions & 0 deletions docs/_templates/layout.html
@@ -0,0 +1,25 @@
{# This extension of 'layout.html' prevents documentation for previous
versions of Astropy from being indexed by bots, e.g. Googlebot or Bingbot,
by inserting a robots meta tag into pages that are not built from the
stable or latest branch.

It assumes that the documentation is built by and hosted on readthedocs.org:
1. Readthedocs.org has a global robots.txt and no option for a custom one.
2. The readthedocs app passes additional variables to the template context,
one of them being `version_slug`. This variable is a string computed from
the tag or branch names selected to be built. It can be 'latest',
'stable', or a unique stringified version number.

For more information, please refer to:
https://github.com/astropy/astropy/pull/7874
http://www.robotstxt.org/meta.html
https://github.com/rtfd/readthedocs.org/blob/master/readthedocs/builds/version_slug.py
#}

{% extends "!layout.html" %}
{%- block extrahead %}
{% if version_slug not in to_be_indexed %}
<meta name="robots" content="noindex, nofollow">
{% endif %}
{{ super() }}
{% endblock %}
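The template's condition reduces to a simple membership test on `version_slug`. A plain-Python sketch of the same decision (`should_noindex` is a hypothetical helper for illustration, not part of this commit):

```python
def should_noindex(version_slug, to_be_indexed=('stable', 'latest')):
    """Return True when a page should carry a noindex robots meta tag.

    Mirrors the Jinja condition `{% if version_slug not in to_be_indexed %}`:
    only the 'stable' and 'latest' builds stay indexable.
    """
    return version_slug not in to_be_indexed


# Old version builds get the noindex tag; current docs stay visible.
assert should_noindex('v1.2')
assert not should_noindex('stable')
assert not should_noindex('latest')
```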
9 changes: 4 additions & 5 deletions docs/conf.py
@@ -170,6 +170,10 @@
# Output file base name for HTML help builder.
htmlhelp_basename = project + 'doc'

# A dictionary of values to pass into the template engine's context for all pages.
html_context = {
'to_be_indexed': ['stable', 'latest']
}
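Sphinx merges the keys of `html_context` into the context of every template render, which is how the `to_be_indexed` list defined above becomes visible to the `layout.html` block alongside the `version_slug` that Read the Docs injects. A minimal sketch of that merge, using hypothetical names rather than Sphinx internals:

```python
# conf.py value, as added in this commit.
html_context = {'to_be_indexed': ['stable', 'latest']}


def build_template_context(page_context, extra_context):
    """Merge conf.py's html_context into the per-page template context.

    Later keys win, mirroring how html_context values end up
    available inside every rendered template.
    """
    merged = dict(page_context)
    merged.update(extra_context)
    return merged


# Read the Docs supplies version_slug per build; 'v1.2.3' is a made-up example.
context = build_template_context({'version_slug': 'v1.2.3'}, html_context)
```

With both keys present in one context, the template's membership test can run on any page of any build.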

# -- Options for LaTeX output --------------------------------------------------

@@ -252,8 +256,3 @@ def setup(app):
'to this.')

linkcheck_anchors = False

# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
html_extra_path = ['robots.txt']
6 changes: 0 additions & 6 deletions docs/robots.txt

This file was deleted.
