Shodan search fix #15267
Conversation
Thanks for your pull request! Before this pull request can be merged, it must pass the checks of our automated linting tools. We use Rubocop and msftidy to ensure the quality of our code. This can be run from the root directory of Metasploit:
You can automate most of these changes with the
Please update your branch after these have been made, and reach out if you have any problems.
This is a replacement for #15229.
I was able to reproduce the original issue by setting the `QUERY` to `html:tchat` and `MAXPAGES` to `20`. After that I was able to confirm that this fixes it by ensuring the maximum number of pages that are processed is the lesser of the total number of pages and the user-specified maximum.

I left a couple of comments that would help simplify some of the code. I think consistently using a page index (starting from 0) would clean things up quite a bit. The `shodan_query` function can just add one to it because Shodan starts its numbering at 1.
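The page clamping described above can be sketched as follows. This is a minimal illustration, not the module's actual code: the `effective_pages` helper, the `RESULTS_PER_PAGE` constant, and the argument names are my own, based on the fact that Shodan returns up to 100 results per page.

```ruby
# Shodan returns up to 100 results per page, so the number of pages
# actually available is ceil(total / 100). Process no more than the
# lesser of that and the user-specified MAXPAGES.
RESULTS_PER_PAGE = 100

def effective_pages(total_results, user_maxpages)
  total_pages = (total_results.to_f / RESULTS_PER_PAGE).ceil
  [total_pages, user_maxpages].min
end
```

For example, a query with 150 total results and `MAXPAGES` set to 20 should process only 2 pages, since only 2 pages of results actually exist.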
```ruby
p = 1
if results[0]['total'] > 100
  page = 2
  while p < maxpage
    results[p] = shodan_query(apikey, query, page)
    if results[p]['matches'].nil?
      next
    else
      p += 1
      page += 1
    end
  end
end
```
Since at this point `maxpages` has been set to the lower value between the user-specified option and the total pages, this could be safely reduced to:
⬇️ Suggested change
```diff
-p = 1
-if results[0]['total'] > 100
-  page = 2
-  while p < maxpage
-    results[p] = shodan_query(apikey, query, page)
-    if results[p]['matches'].nil?
-      next
-    else
-      p += 1
-      page += 1
-    end
-  end
-end
+maxpages.times do |page|
+  page_results = shodan_query(apikey, query, page + 1)
+  break if page_results['matches'].nil?
+
+  results[page] = page_results
+end
```
Drop the `page + 1` in the `shodan_query` API call if you implemented the other feedback to treat the parameter as a page index.
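To illustrate that convention with a hypothetical helper (not part of the module): callers would use a 0-based page index everywhere, and only the query side would convert to Shodan's 1-based page parameter.

```ruby
# Hypothetical helper: convert a 0-based page index into the 1-based
# page parameter that Shodan's numbering expects. With this convention,
# only one place in the code performs the +1 conversion.
def shodan_page_param(page_index)
  page_index + 1
end
```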
```ruby
while page < maxpage
  results[page]['matches'].each do |host|
```
Instead of defining and then manually incrementing `page` here, this could just operate on `results.each`. With the other changes I suggested, there won't be any page results where `page_results['matches']` is nil.

That would make this something like:

```ruby
results.each do |page_results|
```

At that point you don't even need `page`: you don't have to define it and you don't have to increment it.
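A minimal sketch of that iteration style, using fabricated sample data; the `ip_str` field and the host-handling body are placeholders, not the module's real processing:

```ruby
# Fabricated sample pages in the shape the quoted code expects:
# each page is a hash with a 'matches' array of host hashes.
results = [
  { 'matches' => [{ 'ip_str' => '192.0.2.1' }] },
  { 'matches' => [{ 'ip_str' => '192.0.2.2' }, { 'ip_str' => '192.0.2.3' }] }
]

hosts = []
results.each do |page_results|
  page_results['matches'].each do |host|
    hosts << host['ip_str'] # placeholder for the module's real host handling
  end
end
```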
Got it, I'm going to work on that.
Getting the results seems a bit random. If the first query returns an error, the run ends; we cannot do otherwise, since the error could be a genuine one that is unresolvable by simply retrying. But if the first result comes back OK, there is no reason not to retry each subsequent page indefinitely; I notice they often come back with the error "error".
I did modify almost everything you pointed out, except for the looping style of the queries on pages > 1.
You're right, it only needs to fetch additional pages when there are more than 100 results, and the way I had proposed would fetch the first page again. It's fine as is now, thanks for implementing the other suggestion! I'll give this another test now and approve the unit tests to run again.
I cleaned up the commit history by squashing everything down into one commit. Once the unit tests pass again (which they should because they passed previously and I didn't change any of the code), I'll get this merged in. Thanks for finding and then fixing this bug! 🎉
Release Notes
Fixed a bug that was present within the Shodan search module, where certain queries would cause an exception to be raised while processing the results.
Here it is as a branch.
Fixes are:
- It will retry to get pages (other than the 1st) when the search yields an error. When not doing so, the results array can have a blank page, and the call to `results[page]['matches'].each` will at that point produce a fatal error.
- The first call to `shodan_query` would set `results[1]` instead of `results[0]`; this has no impact but is not right.
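The retry behavior described above could be sketched like this, under the assumption that page fetching is abstracted behind a block; `collect_pages` is an illustrative name, not the module's code. A failing first page ends the run, while any later page is simply re-requested until it succeeds, so the results array never contains a blank page:

```ruby
# Collect up to maxpages pages of results. The block stands in for
# shodan_query and receives a 0-based page index.
def collect_pages(maxpages)
  results = []
  page = 0
  while page < maxpages
    page_results = yield(page)
    if page_results['matches'].nil?
      return results if page.zero? # a failing first query ends the run

      next # retry any later page, since its error is often transient
    end
    results[page] = page_results
    page += 1 # only advance once the page actually yielded matches
  end
  results
end
```

Note that, as the comment above describes, a later page whose error never resolves would be retried indefinitely with this approach.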