stdout during event_callback cannot be used #13

Closed
ghost opened this issue Feb 28, 2014 · 5 comments
ghost commented Feb 28, 2014

I have tried to run a background scan with:

from libnmap.process import NmapProcess

arg = '--send-ip -PE -PS21,22,23,25,53,80,110,111,139,443,445,5357 -PA80 -PP'
# callback(nmapproc) is my function that inspects the running scan
nmapscanner = NmapProcess(targets='10.1.0.80-100', options=arg,
                          event_callback=callback)
nmapscanner.run_background()

My callback function simply tries to parse nmapscanner.stdout, but that is impossible: the output is not well-formed XML.
I have tried both ElementTree and your NmapParser, and they raise the following errors:
Wrong XML structure: cannot parse data (NmapParser)
no element found: line 4, column 0 (ElementTree)

So there is no way to access the data discovered during the scan (for example, to know which hosts have been found alive before the scan ends); the only way to get at it, stdout, is unusable (or usable only with advanced regexes).

I'm not sure whether this is really an issue, but python-nmap allows access to the data during the scan, so I guess there should be a method to do that here too.

Thanks for your work, it's a great lib.
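For example, here is a minimal reproduction with the stdlib alone (my own illustration, not libnmap code) of why the mid-scan stdout cannot be parsed: the root <nmaprun> element is never closed, so the document is not well-formed.

```python
import xml.etree.ElementTree as ET

# A truncated <nmaprun> document, as seen in stdout mid-scan:
# the root element is still open, so this is not well-formed XML.
partial = (
    '<?xml version="1.0"?>\n'
    '<nmaprun scanner="nmap" args="nmap -sP 10.1.0.80-100">\n'
    '<host><status state="up"/>'
    '<address addr="10.1.0.80" addrtype="ipv4"/></host>\n'
)

try:
    ET.fromstring(partial)
    parsed = True
except ET.ParseError as err:
    parsed = False
    print(err)  # e.g. "no element found: line 4, column 0"

assert parsed is False
```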

@savon-noir (Owner) commented Mar 16, 2014

Yep, I didn't implement a partial XML block parser. This could be achieved with a pull-dom type parser, but I haven't done it. The basic idea is already there, but I only implemented it partially, to collect the percentage done and no other events (see def __process_event in process.py).

If you feel like implementing this, do not hesitate: fork and issue a pull request. I would be happy to merge such a feature, but I will probably not have the time to do this myself for the moment.
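A rough sketch of that pull-style idea, using the stdlib's xml.etree.ElementTree.XMLPullParser (just an illustration, not libnmap code): feed stdout chunks as they arrive and react to each fully closed <host> element, without ever needing the document to be complete.

```python
import xml.etree.ElementTree as ET

# Incremental parsing: feed stdout chunks as they arrive and pull
# an event for every element that has been fully closed so far.
pull = ET.XMLPullParser(events=("end",))

chunks = [
    '<nmaprun scanner="nmap">',
    '<host><status state="up"/><address addr="10.1.0.80" addrtype="ipv4"/>',
    '</host><host><status state="down"/>'
    '<address addr="10.1.0.81" addrtype="ipv4"/></host>',
]

up_hosts = []
for chunk in chunks:
    pull.feed(chunk)
    for _event, elem in pull.read_events():
        if elem.tag == "host" and elem.find("status").get("state") == "up":
            up_hosts.append(elem.find("address").get("addr"))

print(up_hosts)  # ['10.1.0.80']
```

Note that the document is never closed with </nmaprun>, yet every completed <host> is still delivered.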


ghost commented Mar 16, 2014

Hi. In fact, the workaround I use is to hard-append a closing </nmaprun> tag at the end of the partial XML returned; this works for me. I'm not sure this is the solution you expected ^^.
Thanks for your work.

| David SORIA | Security Engineer
| ITrust France | IT Security Services & SaaS Editor
| Email: d.soria@itrust.fr | Fixe 05.67.34.67.87
| Bâtiment ACTYS/1 55 l'Occitane 31670 LABEGE
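Concretely, the workaround looks like this with the stdlib's ElementTree (my own sketch; the tag name follows from nmap's XML root element, <nmaprun>):

```python
import xml.etree.ElementTree as ET

# Partial output captured mid-scan: the root <nmaprun> is still open.
partial = (
    '<nmaprun scanner="nmap">'
    '<host><status state="up"/>'
    '<address addr="10.1.0.80" addrtype="ipv4"/></host>'
)

# The workaround: hard-append the missing closing tag so the
# document becomes well-formed, then parse it as usual.
root = ET.fromstring(partial + "</nmaprun>")

alive = [h.find("address").get("addr")
         for h in root.iter("host")
         if h.find("status").get("state") == "up"]
print(alive)  # ['10.1.0.80']
```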


@savon-noir (Owner) commented Apr 6, 2014

Hey,

I'll add an attribute, "incomplete", to the parse() method, which will basically enable you to parse an interrupted or not-yet-finished scan.

I've tested your raw-style method of appending a closing </nmaprun> tag at the end of the stream, and it seems to work (although the best would be to add another XML-stream-based parser engine, but I know, I'll never do this :p).

If, from your experience, you have improved the proposed method, do not hesitate to issue a pull request or provide me with some comments :)

Feedback is always good.

Cheers!

Ronald

@savon-noir (Owner)

Added in version 0.4.5 in parser.py; see commit 12e9ad3.


ghost commented Apr 7, 2014

Hi,

I have not improved this method; it was sufficient for all my test cases.
But in fact, I had to switch to python-nmap because of CPU consumption.

You are using Python threads, but this is not as performant as multiprocessing: sometimes it takes 90% of one processor, whereas with python-nmap it is less than 1%.

So you should probably consider using multiprocessing.

Thanks for your reactivity.

Good job.
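For what it's worth, a minimal sketch of the multiprocessing suggestion (scan_worker is a hypothetical stand-in for the real scan-supervision loop, not libnmap code): the supervising loop runs in a separate process and hands results back over a queue.

```python
import multiprocessing as mp

def scan_worker(targets, queue):
    # Hypothetical placeholder for the real scan-supervision loop:
    # it would launch nmap, watch its stdout, and push events/results.
    queue.put({"targets": targets, "done": True})

if __name__ == "__main__":
    queue = mp.Queue()
    proc = mp.Process(target=scan_worker, args=("10.1.0.80-100", queue))
    proc.start()
    result = queue.get()   # blocks until the worker reports back
    proc.join()
    print(result["done"])  # True
```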

