Any way we can add a robots.txt to avoid Graylog instances getting indexed on Google? #1151

Closed
neilferreira opened this Issue Mar 5, 2015 · 2 comments


neilferreira commented Mar 5, 2015

As the title suggests, is there any way we can do this? I'm running Graylog on port 80 on my server. I guess I could switch to another port or put it behind a reverse proxy that serves a robots.txt, but that seems like overkill!
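For illustration, a minimal sketch of the reverse-proxy workaround mentioned above, assuming nginx in front of the web interface; the server name and backend address (127.0.0.1:9000) are placeholders, not part of the original report:

```nginx
# Hypothetical nginx front end: answer crawlers with a deny-all robots.txt
# and proxy everything else to the Graylog web interface.
server {
    listen 80;
    server_name graylog.example.com;

    # Serve robots.txt directly from the proxy
    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow: /\n";
    }

    # Pass all other requests through to graylog-web-interface
    location / {
        proxy_pass http://127.0.0.1:9000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```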

joschi added a commit that referenced this issue Mar 5, 2015

joschi closed this in ea1a4f1 Mar 5, 2015

joschi (Contributor) commented Mar 5, 2015

It's not possible to add a robots.txt retroactively to an existing graylog-web-interface release, but a default one will be included in upcoming releases.
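For reference, a disallow-all robots.txt (the standard way to ask well-behaved crawlers not to index anything) is just two lines; whether the bundled default matches this exactly is not shown in the thread:

```
User-agent: *
Disallow: /
```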

joschi self-assigned this Mar 5, 2015

joschi added this to the 1.0.1 milestone Mar 5, 2015

joschi (Contributor) commented Mar 5, 2015

By the way, you probably shouldn't expose your Graylog web interface to the Internet at all, or at least not without an encrypted transport (HTTPS).
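A minimal sketch of TLS termination in front of the web interface, again assuming nginx; the certificate paths, server name, and backend address are hypothetical:

```nginx
# Hypothetical HTTPS termination in front of graylog-web-interface
server {
    listen 443 ssl;
    server_name graylog.example.com;

    ssl_certificate     /etc/nginx/ssl/graylog.example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/graylog.example.com.key;

    location / {
        proxy_pass http://127.0.0.1:9000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```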

edmundoa added a commit that referenced this issue Mar 5, 2015
