Mongodb reporting error #9
Yes, that's the MongoDB document limit. If a result crosses the 16MB limit, Mongo cannot save it. The next step Cuckoo takes is to see if it can delete some key and then attempt the save again, but no luck here, so that particular analysis will not be saved into Mongo. If JSON reporting was enabled, you should still have report.json inside of storage/analysis//reports, but it won't be displayed in the UI. This happens sometimes when you have a lot of reporting data that exceeds the Mongo document size limit. The compress results option was introduced to counter this. If you have pulled the latest from Kevin's repo, you can try enabling compressresults in reporting.conf, restart Cuckoo, and try that sample again. Let us know how that goes :)
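The fallback described above can be sketched roughly as follows. This is illustrative, not CAPE's actual code: the function names are made up, and json.dumps is used as a cheap stand-in for BSON encoding so the example has no external dependencies.

```python
import json

MONGO_LIMIT = 16 * 1024 * 1024  # MongoDB's hard per-document cap (16MB)

def shrink_report(report):
    """Delete the largest top-level keys until the report fits under the cap.

    Mirrors the behaviour described above: measure the encoded size, and
    if it exceeds the limit, drop the biggest offender and try again.
    """
    while report and len(json.dumps(report)) > MONGO_LIMIT:
        biggest = max(report, key=lambda k: len(json.dumps(report[k])))
        del report[biggest]
    return report
```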
This already has compressresults enabled. I'm just curious why the delete failed. Could the delete failure be handled more gracefully, so that just the offending results key is omitted from the results instead of the whole report failing?
Hi enzok, I agree this failure should be handled more gracefully. I'll try and work out a way to do this - if you can share a sample hash please do. |
I modified mongodb.py with this code to remedy the issue (starting at ~ line 182):
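(The snippet originally posted here did not survive extraction. The following is a hedged reconstruction of the kind of guard it likely contained, based on the TypeError shown later in this thread: when the oversized parent value is a list rather than a dict, a child key cannot be deleted by name, so the whole parent entry is dropped instead. Function and variable names are illustrative.)

```python
import logging

log = logging.getLogger(__name__)

def delete_oversized(report, parent_key, child_key):
    """Remove an oversized sub-result without assuming dicts all the way down."""
    parent = report.get(parent_key)
    if isinstance(parent, dict) and child_key in parent:
        # Normal case: parent is a dict, delete just the offending child.
        del parent[child_key]
        log.warning("results[%r][%r] deleted due to >16MB size", parent_key, child_key)
    elif parent is not None:
        # Parent is a list (e.g. procdump) or other non-dict: deleting by
        # string key would raise TypeError, so drop the parent wholesale.
        del report[parent_key]
        log.warning("results[%r] deleted due to >16MB size", parent_key)
```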
Correct me if I'm wrong, but I don't believe that procdump results are being compressed. I think when there are too many yara strings, the results grow too large. |
Ah yes, I will look at adding compression to procdump output too, as well as implementing the fix you have kindly posted above. Thanks for your help. |
I have now pushed this fix and enabled compression for procdump. Please let me know if this fixes (or alleviates) this issue. |
Thank you. |
Will compressing the report results affect elasticsearch db (search only)? I noticed I'm now getting serialization errors when storing data into elasticsearch. |
Hmm possibly - I vaguely recall seeing problems previously with Elasticsearch and compression. Any chance you could provide some more details to help me try and narrow it down? |
It appears that the compressed data doesn't serialize. I added the following code (after import json, at ~ line 137) to the elasticsearchdb.py reporting module and it solved the issue:
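(The patch referenced here was also lost from the page. A sketch of one way to restore serializability, assuming compressresults stores zlib-compressed JSON blobs, which Elasticsearch's JSON serializer then rejects: inflate each blob back into a plain object before indexing. The helper name is hypothetical.)

```python
import json
import zlib

def decompress_field(value):
    """If value is a zlib-compressed JSON blob, return the decoded object.

    Anything that isn't bytes, or doesn't decompress/parse cleanly, is
    returned untouched so ordinary result fields pass straight through.
    """
    if isinstance(value, bytes):
        try:
            return json.loads(zlib.decompress(value))
        except (zlib.error, ValueError):
            return value
    return value
```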
I would rather do it this way: elasticsearchdb.py, compressresults.py. This way compressresults will be run after the Elasticsearch reporting has completed.
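The ordering mechanism referred to here can be sketched as below: Cuckoo-style reporting modules carry an integer order attribute and the plugin runner calls them in ascending order, so giving CompressResults a higher order than the Elasticsearch module makes compression run after indexing. Class names and the runner helper are illustrative, not the actual module definitions.

```python
class Report(object):
    order = 1  # default priority; lower values run first

class ElasticSearchDB(Report):
    order = 1  # index the raw, uncompressed results

class CompressResults(Report):
    order = 2  # compress only after Elasticsearch has reported

def run_order(modules):
    """Return module class names in the order the runner would invoke them."""
    return [m.__name__ for m in sorted(modules, key=lambda m: m.order)]
```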
That works for me. I completely forgot about being able to set the order. |
Ah fantastic - thanks both for finding and fixing this. I will make this change now. |
Not sure what's going on here, any ideas?
2017-06-21 15:21:20,505 [modules.reporting.mongodb] WARNING: results['procdump']['yara'] deleted due to >16MB size (29MB)
2017-06-21 15:21:20,506 [lib.cuckoo.core.plugins] ERROR: Failed to run the reporting module "MongoDB":
Traceback (most recent call last):
File "/opt/cuckoo/lib/cuckoo/core/plugins.py", line 631, in process
current.run(self.results)
File "/opt/cuckoo/modules/reporting/mongodb.py", line 202, in run
del report[parent_key][child_key]
TypeError: list indices must be integers, not str