Dev and stage environments should not be listed on search engines.
Addons has a good way of doing this:
Sounds good! Rik, can you whip this into shape for Playdoh? The code itself should probably live in funfactory, and playdoh itself should just point to it. You might also not have to render a full-blown view like AMO does:
@kumar303 can you advise where in playdoh/funfactory Anthony should put this? Thanks!
Oh yeah, that should definitely be in playdoh. AMO checks the services domain so I think we can strip that out and just check ENGAGE_ROBOTS. Developers can set ENGAGE_ROBOTS=True on the public site and keep it False everywhere else.
Something like this?
diff --git a/project/settings/base.py b/project/settings/base.py
index 34c388e..cce6f85 100644
@@ -45,6 +45,11 @@ TEMPLATE_CONTEXT_PROCESSORS = list(TEMPLATE_CONTEXT_PROCESSORS) + [
+# Should robots.txt allow web crawlers? If False (the default), robots.txt
+# disallows everything; if True, it allows everything. Set it to True only
+# on the public production site.
+# Also see http://www.google.com/support/webmasters/bin/answer.py?answer=93710
+ENGAGE_ROBOTS = False
ANON_ALWAYS = True
lambda r: HttpResponse(
    "User-agent: *\n%s: /" % ('Allow' if settings.ENGAGE_ROBOTS else 'Disallow'),
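One precedence detail worth noting here: in Python, `%` binds tighter than a conditional expression, so the `'Allow'`/`'Disallow'` choice must be parenthesized or the whole formatted string becomes the "true" branch. A quick plain-Python demonstration:

```python
# Without parentheses, '%' formats first, and the conditional then picks
# between the formatted string and the bare word 'Disallow'.
engage = False
wrong = "User-agent: *\n%s: /" % 'Allow' if engage else 'Disallow'
right = "User-agent: *\n%s: /" % ('Allow' if engage else 'Disallow')
assert wrong == 'Disallow'                    # entire response collapses
assert right == "User-agent: *\nDisallow: /"  # intended robots.txt body
```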
That looks good to me. It should also be set to True in settings/local.py-dist with a comment explaining it. The reason is that IT likes the committed version of local.py-dist to be as close to production as possible. This change might break the funfactory installer, because it applies patch to the settings files. You can check by installing a fresh app after applying the change locally.
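The settings/local.py-dist addition might look like this (a sketch; the exact comment wording is up to whoever lands the patch):

```python
# settings/local.py-dist
# IT keeps this committed template as close to production as possible, so
# default to True here. Set ENGAGE_ROBOTS = False on dev and stage servers
# so search engines don't index them.
ENGAGE_ROBOTS = True
```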