Add ability to exclude pages by content #199

Open
GoogleCodeExporter opened this issue Mar 17, 2015 · 0 comments

Comments

@GoogleCodeExporter

Hi. This is a feature request.

-X is not enough in some cases. I am testing a site that always responds 200
OK to any request. Skipfish is usually quite good at recognizing pseudo-404 pages,
but in this case it fails: it treats some pages as real when they are actually
just a 404 stub served with code 200. For some reason it also doesn't crawl links
even on the main page (possibly its signature matches one of the pseudo-404
signatures skipfish found earlier, but I'm not sure).

So it would be nice to have the ability to supply a regex for excluding such pages
by content (as arachni does). I'm not sure about the performance impact, but slower
is still better than nothing.
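As a rough illustration of what the requested filter might do (skipfish itself has no such option today, and the function names here are hypothetical), the idea is to compile user-supplied patterns once and drop any fetched page whose body matches one, regardless of its HTTP status code:

```python
import re

def make_content_filter(patterns):
    """Compile user-supplied regexes into a predicate that flags
    pages whose body matches any pattern (e.g. a soft-404 stub
    that the server returns with status 200)."""
    compiled = [re.compile(p) for p in patterns]

    def should_exclude(body):
        # Exclude the page if any pattern matches anywhere in the body.
        return any(rx.search(body) for rx in compiled)

    return should_exclude

# Example: treat pages carrying the site's custom 404 marker as non-pages.
exclude = make_content_filter([r"Page not found", r"error\s*404"])

print(exclude("<html><body>Page not found</body></html>"))  # True
print(exclude("<html><body>Welcome!</body></html>"))        # False
```

A crawler would apply `should_exclude` to each response body before adding the page's links to the crawl queue, which is roughly how arachni's content-based scope exclusion behaves.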

Original issue reported on code.google.com by maxxa...@gmail.com on 29 Nov 2013 at 10:14
