Speculation on how sub-googlebots work, based on public documentation #2
If we pass in a comma-separated list of two user agents (more specific first), then the first should be the specific user agent, and the second should be the general user agent whose ruleset the crawler obeys if there are no rules specifically targeting the first user agent.
Expose the tuple functionality in the robots interface, and call it from the wrapper when two comma-separated user agents are passed in the string. Add test cases.
One of the ways the open source project lists for compiling and testing is Bazel, which creates output directories that should be ignored by git, since we never want to check them in.
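Bazel creates its output as `bazel-*` symlinks in the workspace root (e.g. `bazel-bin`, `bazel-out`), so a `.gitignore` entry along these lines would cover them:

```gitignore
# Bazel output symlinks created in the workspace root
/bazel-*
```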