
Conversation

@lazyguru
Contributor

Not sure if others will find this useful or not, but it helps me out as I tend to leave ngrok running for long periods of time and sometimes get hit by a web crawler (or two).

This will add location entries for /robots.txt so that it always returns:

User-Agent: *
Disallow: /

without needing to remember to add a default robots.txt.
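For reference, a hardcoded entry along these lines would do it. This is only a sketch of the idea, not the exact diff from this PR; the location block and its placement inside the Valet Nginx config are assumed:

    # Sketch only: always answer /robots.txt with a deny-all policy,
    # regardless of whether the project ships its own robots.txt.
    location = /robots.txt {
        default_type text/plain;
        return 200 "User-Agent: *\nDisallow: /\n";
    }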

@molovo

molovo commented Sep 12, 2018

If we go the hardcoding route, it would probably be better to set a header on every request, rather than relying on robots.txt:
X-Robots-Tag: noindex, nofollow, nosnippet, noarchive
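
A minimal sketch of how that could look at the server level (assumed placement, not part of this PR; note that add_header only covers error responses when given the always flag, and is not inherited into location blocks that declare their own add_header directives):

    # Sketch: emit the tag on every response, including error pages.
    add_header X-Robots-Tag 'noindex, nofollow, nosnippet, noarchive' always;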

@drbyte
Contributor

drbyte commented Oct 4, 2019

As far as protecting Ngrok sessions, the following combines both proposals:

server {
    listen 127.0.0.1:60;
    server_name foo.test www.foo.test *.foo.test;
    root /;
    charset utf-8;
    client_max_body_size 128M;
+    add_header X-Robots-Tag 'noindex, nofollow, nosnippet, noarchive';

    location /41c270e4-5535-4daa-b23e-c269744c2f45/ {
        internal;
        alias /;
        try_files $uri $uri/;
    }

applied in here (the Valet Nginx stub):

server {
    listen 127.0.0.1:60;
    server_name VALET_SITE www.VALET_SITE *.VALET_SITE;
    root /;
    charset utf-8;
    client_max_body_size 128M;

    location /VALET_STATIC_PREFIX/ {
        internal;
        alias /;
        try_files $uri $uri/;
    }

drbyte added a commit to drbyte/valet that referenced this pull request Nov 30, 2019
Replaces and closes laravel#575 ... for reasons described there.
@lazyguru deleted the hardcode-robots-txt branch December 2, 2019 03:21