Add robots.txt entries for provider's crawlers
Google recommends that sites serving its ads set an empty robots.txt Disallow rule for its ad crawler's user agent, Mediapartners-Google. To spare users from having to add these entries themselves, we now add them automatically: each provider has a $crawler_user_agent parameter, and the generated Disallow entries are filterable.
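As a sketch of the result, the entry generated for a provider whose $crawler_user_agent is Mediapartners-Google would look like the following robots.txt fragment (an empty Disallow grants the crawler full access, which is what Google recommends for its ad crawler):

```
User-agent: Mediapartners-Google
Disallow:
```

Other providers would each contribute an analogous block keyed on their own $crawler_user_agent value.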
Showing with 36 additions and 0 deletions.