
Any way we can add a robots.txt to avoid Graylog instances getting indexed on Google? #1151

Closed
neilferreira opened this issue Mar 5, 2015 · 2 comments

neilferreira commented Mar 5, 2015

As the title suggests, is there any way we can do this? I'm running Graylog on port 80 on my server. I guess I could switch to another port or put it behind a reverse proxy that serves up a robots.txt, but that seems like overkill!
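For anyone on a release without a bundled robots.txt, the reverse-proxy workaround mentioned above is the usual stopgap. A minimal sketch, assuming nginx sits in front of the Graylog web interface; the hostname, ports, and upstream address are placeholders, not values from this issue:

```nginx
# Hypothetical nginx vhost in front of Graylog (all names/ports illustrative).
server {
    listen 80;
    server_name graylog.example.com;

    # Serve a disallow-all robots.txt directly from the proxy,
    # so crawlers never reach the Graylog UI for it.
    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow: /\n";
    }

    # Everything else is proxied to the Graylog web interface
    # (assumed here to listen on 127.0.0.1:9000).
    location / {
        proxy_pass http://127.0.0.1:9000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```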

joschi pushed a commit that referenced this issue Mar 5, 2015
joschi closed this in ea1a4f1 Mar 5, 2015

joschi (Contributor) commented Mar 5, 2015

It's not possible to add a robots.txt retroactively to an existing graylog-web-interface release, but a default robots.txt will be included in upcoming releases.

joschi self-assigned this Mar 5, 2015
joschi added this to the 1.0.1 milestone Mar 5, 2015

joschi (Contributor) commented Mar 5, 2015

By the way, you probably shouldn't expose your Graylog web interface to the Internet at all, and certainly not without an encrypted transport (HTTPS).
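If the interface does have to be reachable from outside, TLS termination can live in the same reverse proxy as the robots.txt workaround above. A minimal sketch, again assuming nginx; the certificate paths, hostname, and upstream port are placeholders:

```nginx
# Hypothetical HTTPS termination for the Graylog web interface
# (certificate paths and upstream address are illustrative).
server {
    listen 443 ssl;
    server_name graylog.example.com;

    ssl_certificate     /etc/nginx/ssl/graylog.example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/graylog.example.com.key;

    location / {
        proxy_pass http://127.0.0.1:9000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```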
