Middleware for snicco/http-routing to disallow search engines

This middleware for the snicco/http-routing component allows you to discourage search engines from indexing the current request path by setting the X-Robots-Tag header.
composer require snicco/no-robots-middleware
This middleware can be added globally, in a group or on a per-route basis. Choose what works best for you.
use Snicco\Middleware\NoRobots\NoRobots;
// Disallows robots entirely (noindex, noarchive, nofollow)
$configurator->get('route1', '/route1', SomeController::class)
->middleware(NoRobots::class);
// The noarchive directive is not added because the second argument is false
// (argument order: noindex, noarchive, nofollow).
$configurator->get('route2', '/route2', SomeController::class)
    ->middleware(NoRobots::class . ':true,false,true');
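To illustrate how the three boolean flags relate to the header the client receives, here is a minimal sketch. This is not the actual middleware source; the function name and the assumed flag order (noindex, noarchive, nofollow) are illustrative only.

```php
<?php

// Illustrative sketch only: maps three boolean flags to an
// X-Robots-Tag header value, mirroring the examples above.
function xRobotsTagValue(bool $noindex, bool $noarchive, bool $nofollow): string
{
    $directives = [];
    if ($noindex) {
        $directives[] = 'noindex';
    }
    if ($noarchive) {
        $directives[] = 'noarchive';
    }
    if ($nofollow) {
        $directives[] = 'nofollow';
    }

    return implode(', ', $directives);
}

// Default behavior disallows robots entirely:
echo xRobotsTagValue(true, true, true) . "\n";  // noindex, noarchive, nofollow

// ':true,false,true' omits only the noarchive directive:
echo xRobotsTagValue(true, false, true) . "\n"; // noindex, nofollow
```

The resulting string would be sent as the value of the X-Robots-Tag response header.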
This repository is a read-only split of the development repo of the Snicco project. To contribute, please report issues and submit pull requests in the Snicco monorepo. If you discover a security vulnerability, please follow our disclosure procedure.