Removed robots.txt

I/O Docs are crawler friendly, so let them be crawled. If you'd rather
not be crawled, then throw a standard all-blocking robots.txt file in
the ./public/ directory.
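
For reference, the standard all-blocking robots.txt mentioned above consists of exactly the two lines this commit removes (see the diff below):

    User-agent: *
    Disallow: /

Placing that file in ./public/ tells every crawler to stay out of the entire site.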
1 parent 9a9d038 · commit 29e1c6ff5a9b1d555f9727b9ea9326fccbada382 · @mansilladev committed Aug 5, 2011
Showing with 0 additions and 2 deletions.
  1. +0 −2 public/robots.txt
public/robots.txt
@@ -1,2 +0,0 @@
-User-agent: *
-Disallow: /
