It would be good to provide a `/robots.txt` for (search engine) crawlers. Currently `https://bin.bloerg.net/robots.txt` returns a 404. At least `/burn/:id` URLs should be blocked, I think.
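
A minimal `robots.txt` for this could look like the sketch below — assuming burn-after-reading pastes are all served under the `/burn/` path prefix, so a single `Disallow` rule covers every `/burn/:id` URL:

```
User-agent: *
Disallow: /burn/
```

Note that `robots.txt` only asks well-behaved crawlers to stay away; it does not prevent access, so it protects burn links from being consumed by search-engine bots but not from arbitrary clients.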