No robots.txt #111

@FliegendeWurst

Description

It would be good to provide a /robots.txt for (search engine) crawlers. Currently https://bin.bloerg.net/robots.txt returns a 404.

At least /burn/:id URLs should be blocked, I think.
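A minimal `robots.txt` along those lines might look like the following sketch. It assumes the paste URLs follow the `/burn/:id` pattern mentioned above; since `Disallow` rules are prefix matches per RFC 9309, blocking the `/burn/` prefix covers all burn-after-reading IDs without needing wildcards:

```
User-agent: *
Disallow: /burn/
```

Served at `https://bin.bloerg.net/robots.txt`, this would ask all compliant crawlers to skip burn-after-reading links while leaving the rest of the site crawlable.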
