- added composer.json file (needed for packagist.org)
- moved all code to the src/RobotsTxtParser folder
- moved tests to the tests/RobotsTxtParser folder
- fixed code style to conform to PSR
- improved RobotsTxtValidator (getRelativeUrl function)
Fixed a parsing error that occurred when the last allow/disallow directive in a block is empty and the next directive is User-agent:

```
User-agent: *
Disallow:

User-agent: Linguee
Disallow: /api/showcase
```

The URL "/api/showcase" should be assigned (as Disallow) only to the User-agent Linguee, not to all agents.
See PR #54 for more information.
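A minimal sketch of how this behaviour could be checked; the constructor signature and the getRules()/isUrlAllow() method names below are assumptions based on the class names mentioned in these notes, so consult the library's README for the real API:

```php
<?php
require 'vendor/autoload.php';

// Class and namespace names assumed from the src/RobotsTxtParser layout above.
use RobotsTxtParser\RobotsTxtParser;
use RobotsTxtParser\RobotsTxtValidator;

$robotsTxt = <<<TXT
User-agent: *
Disallow:

User-agent: Linguee
Disallow: /api/showcase
TXT;

$parser    = new RobotsTxtParser($robotsTxt);
$validator = new RobotsTxtValidator($parser->getRules());

// "/api/showcase" should remain allowed for the generic agent ...
var_dump($validator->isUrlAllow('/api/showcase', '*'));       // expected: true
// ... and be disallowed only for Linguee.
var_dump($validator->isUrlAllow('/api/showcase', 'Linguee')); // expected: false
```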
- switched README.md to English only (#45)
- added more PHPUnit tests for RobotsTxtParser and RobotsTxtValidator (#44, #49)
- fixed a bug in RobotsTxtParser when processing a list of user-agents (#47):
```
User-Agent: ahrefs
User-Agent: SurdotlyBot
Disallow: /
```

  The path "/" is now disallowed for both user-agents (previously Disallow: / was applied only to the last agent, SurdotlyBot); a sketch follows after this list.
- fixed a bug in RobotsTxtValidator when handling the '+' character in allow/disallow directives (#50)
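As a rough check of the user-agent grouping fix (#47), under the same assumed API as in the sketch above:

```php
<?php
// Sketch only: class and method names are assumed, not confirmed by these notes.
$robotsTxt = <<<TXT
User-Agent: ahrefs
User-Agent: SurdotlyBot
Disallow: /
TXT;

$parser    = new RobotsTxtParser($robotsTxt);
$validator = new RobotsTxtValidator($parser->getRules());

// After the fix, "/" should be disallowed for both grouped agents,
// not only for the last one listed (SurdotlyBot).
var_dump($validator->isUrlAllow('/', 'ahrefs'));      // expected: false
var_dump($validator->isUrlAllow('/', 'SurdotlyBot')); // expected: false
```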
Main note: the RobotsTxtParser class was refactored to significantly improve the performance of parsing large robots.txt files (100-1000 times faster) thanks to an improved parsing algorithm.
- refactored RobotsTxtParser
- kept full backward compatibility with the previous version
- all PHPUnit tests pass
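One way to sanity-check the speed-up on your own data is to time repeated parses of a large robots.txt before and after upgrading; the file path, iteration count, and getRules() call below are arbitrary or assumed:

```php
<?php
require 'vendor/autoload.php';

// Rough timing harness, not a rigorous benchmark; adjust the path and count.
$content = file_get_contents('large-robots.txt');

$iterations = 10;
$start = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    $parser = new \RobotsTxtParser\RobotsTxtParser($content); // namespace assumed as above
    $parser->getRules();                                      // assumed accessor
}
$elapsed = microtime(true) - $start;

printf("%d parses took %.3f s (%.3f s per parse)\n", $iterations, $elapsed, $elapsed / $iterations);
```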