
Removed robots.txt

I/O Docs are crawler friendly, so let them be crawled. If you'd rather
not be crawled, then throw a standard all-blocking robots.txt file in
the ./public/ directory.
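
For reference, the standard all-blocking robots.txt (the same two lines removed in this commit) is:

    User-agent: *
    Disallow: /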
commit 29e1c6ff5a9b1d555f9727b9ea9326fccbada382 (parent: 9a9d038)
@mansilladev authored
Showing with 0 additions and 2 deletions.
  1. +0 −2  public/robots.txt
public/robots.txt
@@ -1,2 +0,0 @@
-User-agent: *
-Disallow: /