Too many open files #46

Closed
digininja opened this Issue May 21, 2014 · 23 comments

digininja commented May 21, 2014

Got this while it was creating the report after scanning 46 URLs from an nmap XML file.

[*] ERROR: Web page possibly blank or SSL error!
Traceback (most recent call last):
File "./EyeWitness.py", line 1549, in
IOError: [Errno 24] Too many open files: '/home/robin/reports/xx/screenshots_named/report.html'

Then I tried to run it again and got this at URL 37:

Attempting to capture: http://xxx.com:80 (37/46)

(process:7534): GLib-ERROR **: Creating pipes for GWakeup: Too many open files

Trace/breakpoint trap

ChrisTruncer commented May 21, 2014

I've run into your second error when I've provided thousands of URLs. What I think is happening is that Ghost sometimes fails to properly close the network connections to the websites it's screenshotting, so with a large enough list it hits the file descriptor limit, which then kills the EyeWitness process. Unfortunately, I can't reproduce it consistently on my end: sometimes, given thousands of URLs, it works perfectly; other times I hit that same GWakeup error. I have an open issue with the developer of Ghost, but it's been open for a while and I haven't had a response.
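To illustrate what "properly closing" would mean: here's a rough sketch of per-URL teardown, assuming Ghost.py's usual Ghost()/open()/capture_to()/exit() API (this is just illustrative, not EyeWitness's actual code):

    from ghost import Ghost

    def capture(url, out_path):
        # One Ghost instance per URL, so Qt/WebKit resources get
        # torn down even when a page hangs or errors out.
        ghost = Ghost(wait_timeout=10)
        try:
            ghost.open(url)
            ghost.capture_to(out_path)
        finally:
            # Without an explicit exit(), leaked sockets accumulate
            # until the process hits the file descriptor limit.
            ghost.exit()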

I'm actually working on porting EyeWitness to Ruby. It's mostly a thought exercise for me, but I'm also going to use a different library (Selenium) for the screenshots themselves, so I'm hoping it won't run into the same issues.

ChrisTruncer commented May 21, 2014

One thing I did think was odd is that you hit that error after only 37 URLs; I've never seen it with so few sites. I'll see if I can get it to happen on my end.

digininja commented May 21, 2014

I can reproduce it consistently. What do you want me to do to debug it?

ChrisTruncer commented May 21, 2014

Is it always a single URL that it's failing on? While it's running, before it crashes (and as close to the crash as you can get), can you run lsof -p against the process and check the number of open file descriptors?
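If timing lsof by hand is awkward, a small watcher like this logs the count every second until the process exits (a Linux-only sketch: each entry in /proc/<pid>/fd is one open descriptor, which is roughly what lsof -p counts):

    import os
    import sys
    import time

    pid = sys.argv[1]  # PID of the running EyeWitness process
    while True:
        try:
            count = len(os.listdir('/proc/%s/fd' % pid))
        except OSError:
            break  # process has exited
        print('open fds: %d' % count)
        time.sleep(1)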

digininja commented May 21, 2014

Always on number 37.

Grabbing the lsof dump now; will email it over.

ChrisTruncer commented May 21, 2014

Ok, thanks. So if you remove that URL from the list, does it run without issue? I just added the --createtargets flag to have EyeWitness dump URLs from Nessus or Nmap XML files. You can do that with:
./EyeWitness.py -f nmap.xml --createtargets

It'll create a file (target_servers.txt) containing all the web servers from the Nmap file, which should make it easy to remove that one URL and see whether it still fails or runs to completion.
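(For the curious: pulling web servers out of Nmap XML boils down to something like the sketch below, using only the stdlib. The real flag handles more cases; the port-to-scheme guess here is a simplification for illustration.)

    import xml.etree.ElementTree as ET

    def web_targets(nmap_xml):
        urls = []
        for host in ET.parse(nmap_xml).getroot().findall('host'):
            addr = host.find('address').get('addr')
            for port in host.findall('.//port'):
                state = port.find('state')
                if state is None or state.get('state') != 'open':
                    continue
                portid = port.get('portid')
                # Naive guess: treat 443 as HTTPS, everything else as HTTP.
                scheme = 'https' if portid == '443' else 'http'
                urls.append('%s://%s:%s' % (scheme, addr, portid))
        return urls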

ChrisTruncer commented May 21, 2014

Had to update that last comment since GitHub didn't like the "<>" tags.

digininja commented May 21, 2014

Trying that now, and opening a new issue as well :)

ChrisTruncer commented May 21, 2014

Ha, right as I was typing that, I saw that one coming...

digininja commented May 21, 2014

Still crashes on number 37, but now with a segfault and no other message.

Going to try with number 37 removed.

digininja commented May 21, 2014

Worked fine after moving number 37 to position 46; now trying it in the middle.

ChrisTruncer commented May 21, 2014

This is really interesting. If it's isolated to a single site, that helps narrow down where the issue lies, but I'd also expect you to hit it by the end of the loop through all the URLs, not tied to that specific position (37).

digininja commented May 21, 2014

Moved it to the last position and it worked; moved it to position 39 and it fails on that URL again with the original Trace/breakpoint error.

The site is up, valid, and working. It's slow, but not by more than a few seconds.

digininja commented May 21, 2014

Moved it to position 41 and got the traceback error from the first ticket.

ChrisTruncer commented May 21, 2014

If you give it that URL in a file by itself, does it still crash? How about with --single as well?

digininja commented May 21, 2014

Works fine with --single.

Have you got the lsof output I mailed you?

ChrisTruncer commented May 21, 2014

Yeah, that definitely looks like a file descriptor leak causing the problem. Does your number of open file descriptors get that high when scanning all the sites besides that one?

digininja commented May 21, 2014

Looks like it; sending you an lsof dump now. With that site removed it runs through fine, but the file descriptor count still seems high.

ChrisTruncer commented May 21, 2014

Yeah, it does still look high; it's really odd that it's so high with so few sites. I'll run some tests here and see if mine gets that high as well. With that one URL excluded, you hopefully shouldn't be hitting the fd limit. I'm pretty sure it's a leak within Ghost, though. I'll try to get in touch with the author again; I've looked through the library's code myself and haven't been able to find or fix the leak.
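In the meantime, if anyone needs a stopgap, raising the process's soft fd limit at startup buys some headroom (standard library only; it doesn't fix the leak, just delays the crash):

    import resource

    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    # Raise the soft limit toward the hard limit; no root needed for
    # this, since only raising the hard limit requires privileges.
    resource.setrlimit(resource.RLIMIT_NOFILE, (min(4096, hard), hard))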

I'll also run some tests on the Ruby POC and see where its file descriptor numbers land.

digininja commented May 21, 2014

Done for now. If you need anything else, let me know and I'll take a look later.

ChrisTruncer commented May 22, 2014

I ran through the list of URLs, and it's not crashing on my end. Oddly enough, when I watch the list of open file descriptors, I'm only at approximately 95 by the end of the URL list, so it's nowhere near that 1000 default limit. This is really intriguing to me.

On another note, when testing with the Ruby POC, at least for grabbing the screenshot and saving it to disk, I'm not going above 10 open file descriptors, so this might be the better way. I'll work through your open issues, then prioritize the Ruby implementation, since it seems to be a little easier on the OS. This is also my first time learning Ruby, so it will be interesting and fun!
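The Selenium pattern is basically one long-lived browser with an explicit quit(), which is probably why the descriptor count stays flat. Sketched in Python here for comparison (the port itself is Ruby, and the target list is a placeholder):

    from selenium import webdriver

    urls = ['http://example.com']  # placeholder target list

    driver = webdriver.Firefox()  # one browser process for the whole run
    try:
        for url in urls:
            driver.get(url)
            # Derive a filesystem-safe name from the URL.
            name = url.replace('://', '_').replace('/', '_')
            driver.save_screenshot(name + '.png')
    finally:
        driver.quit()  # tears down the browser and all of its sockets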

digininja commented May 22, 2014

In case it helps, I'm on Debian Testing running Python 2.7.6

ChrisTruncer commented Jul 19, 2014

I don't believe I'm going to have a patch for this until it's fixed within Ghost, and unfortunately they haven't responded to my open issue for quite some time.

However, I did release the Ruby version, which should not (I believe) have this issue.

I'm going to go ahead and close this until I hear something back from the Ghost devs; then it can be re-opened or recreated.
