
[BUG] Excessive CPU usage #77

Closed
Raywando opened this issue Oct 17, 2020 · 5 comments · Fixed by #96
Labels
bug Something isn't working

Comments

@Raywando

When the tool created multiple recursive jobs, at one point it output a lot of errors, CPU usage hit 100% on my machine, and the process was eventually killed.

I even tried lowering the thread count from 50 to 20, but as the recursive jobs increased, it didn't seem to matter.

[screenshot: EXhr0WOfEJ]

In my opinion, the best way to avoid this is to add an argument that sets the maximum number of jobs running at once, and to queue any new jobs beyond that limit.
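As an illustration of the bounded-queue idea (this is just a sketch using xargs, not how feroxbuster itself is implemented), you can cap the number of concurrently running jobs and let the rest wait their turn:

```shell
# Hypothetical example: pretend each path is a recursive scan job.
# xargs -P 4 runs at most 4 jobs at a time; remaining jobs queue up
# and start only as running ones finish.
printf '%s\n' /admin /api /img /js /css /static \
  | xargs -P 4 -I{} sh -c 'echo "scanning {}"'
```

The same principle (a fixed-size worker pool fed from a queue) is what an in-tool job limit would provide.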

Other than that, your tool is awesome, thank you for the efforts!

@Raywando Raywando added the bug Something isn't working label Oct 17, 2020
@epi052
Owner

epi052 commented Oct 17, 2020

@Raywando Good morning, and thank you for the report! It looks like this is a target I won't be able to test against to reproduce (judging by your blocking of the domain), so I'd like to ask a few questions:

  1. What OS are you using?
  2. What is the output of ulimit -Sn and ulimit -Hn on your system?
  3. How many directories were being scanned when you took the screenshot above?
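For reference, the limits in question can be checked with the standard ulimit shell builtin; -Sn reports the soft limit and -Hn the hard limit on open file descriptors:

```shell
# Inspect the per-process open-file limits for the current shell.
ulimit -Sn   # soft limit (the one actually enforced)
ulimit -Hn   # hard limit (ceiling the soft limit may be raised to)
```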

Looking at your screenshot, I see No file descriptors available, which is why I'd like to see your system's open file limit via the ulimit commands. That particular problem can be solved by increasing the number of open files your OS allows. On my Kali install, the default was 1024, and I know some macOS installs use 256 😕. There are a few options for increasing the number of open files, but I'll show one here and link to more (Linux assumed).

Edit /etc/security/limits.conf to include the two lines below: * applies to all users, hard and soft are the hard and soft limit types, and nofile is the number-of-open-files setting.

/etc/security/limits.conf
-------------------------
...
*        soft nofile 4096
*        hard nofile 8192
...

I'm not arguing that you saw 100% CPU usage. I also agree that limiting the number of recursive calls with a queue is likely the correct approach to resolving that particular situation. However, I do think you have two things going on at once, and would like to see what happens when we eliminate one of them.

If you're willing and able, I'd love to hear what happens when you increase your open file limit and rescan. Thanks again!

@epi052 epi052 changed the title [BUG] Excessive CPU usage [BUG] Excessive CPU usage / no file descriptors available Oct 17, 2020
@epi052
Owner

epi052 commented Oct 17, 2020

I modified the title of the issue to drive folks here that have the no file descriptors available error message, since a solution to that is provided. I'll open an issue to update the readme with a section on open files here soon.

@Raywando
Author

Raywando commented Oct 19, 2020

Hi @epi052, Thanks for the response.

Here is the information you asked for
[screenshot: WindowsTerminal_JRElf5wczt]

My machine is running Linux (Ubuntu 18.04).

I only got these errors when my CPU usage was at 100%, so I don't think there is anything else going on. As for the domain, I can send it to you privately if you want so you can test it yourself, since it triggers a lot of recursion (maybe on Twitter; my handle is @Raywando).

Thanks again for the efforts!

@epi052
Owner

epi052 commented Oct 19, 2020

@Raywando

I'd recommend raising your soft limit to something like 4096 by running ulimit -n 4096. CPU usage and open file exhaustion are two separate problems. After setting my limit to 4096 (also on Ubuntu), I've personally had roughly 12 directories (default -t 50) being scanned concurrently without hitting the open file limit. The exact file limit you need will vary based on your setup and the scan being performed, but 4096 is pretty reasonable in my opinion.
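For example, a quick sketch of raising the soft limit for the current shell session only (the new value must not exceed the hard limit reported by ulimit -Hn; the limits.conf edit above is what makes the change persistent):

```shell
# Raise the soft open-file limit for this shell session.
ulimit -n 4096
ulimit -Sn   # should now report 4096
```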

I have been watching the CPU usage during testing and still agree that it could use some tuning.

Try raising your soft limit and let me know how it goes. Thanks again!

@epi052 epi052 changed the title [BUG] Excessive CPU usage / no file descriptors available [BUG] Excessive CPU usage Oct 24, 2020
@epi052
Owner

epi052 commented Oct 24, 2020

Removed no file descriptors available from the title, as the README has been updated with a section for that problem specifically.
