
When scanning a large number of websites, the speed slows down #1291

Open
vvsopk opened this issue Mar 21, 2023 · 1 comment
Labels
question Further information is requested

Comments


vvsopk commented Mar 21, 2023

When I use this tool, it doesn't write results to the output file in real time, even though the file's modification time keeps updating. I suspect the tool stores the scan results in memory, which causes the scanning speed to slow down over time. Is it possible that I'm not using the correct commands?
[screenshot]
it is slow
[screenshot]
this is my command
[screenshot]

@vvsopk vvsopk added the question Further information is requested label Mar 21, 2023

Prady18 commented Mar 23, 2023

@vvsopk
Without knowing more about the tool you are using and the specific commands you are running, it's difficult to say for sure what might be causing the issue you are experiencing. However, it is possible that the tool is storing the scanning results in memory, which could slow down the scanning speed over time and delay the output to the file.

One way to mitigate this issue is to use a command that flushes the output to the file after a certain amount of data has been processed. For example, in Python, you can use the flush() method to write data to a file immediately without waiting for the buffer to fill up. If you are using a different programming language or tool, there may be similar commands or settings that can help you achieve the same result.
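To illustrate the idea, here is a minimal Python sketch (not dirsearch's actual code; the result strings and file name are placeholders) that writes each result to disk as it arrives instead of holding everything in memory:

```python
def write_results(results, path):
    """Append each result line to `path` and flush immediately,
    so the file is updated in real time instead of whenever the
    buffer happens to fill up."""
    with open(path, "a") as f:
        for line in results:
            f.write(line + "\n")
            f.flush()  # push buffered data to the OS right away

# Placeholder results standing in for a scanner's output stream.
write_results(["/admin 200", "/login 301"], "report.txt")
```

Because each line is flushed as soon as it is written, a crash or interrupted scan still leaves all completed results on disk.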

Alternatively, you could try reducing the amount of data that the tool processes at one time, which may help to prevent the memory from becoming overwhelmed. This could be done by adjusting the tool's settings or by breaking up the input data into smaller chunks.
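A simple way to break the input into smaller batches, sketched in Python (the URL list here is a made-up example):

```python
def chunked(items, size):
    """Yield successive fixed-size chunks of `items`."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Hypothetical target list; in practice this would come from your URL file.
urls = [f"https://site{i}.example" for i in range(10)]
batches = list(chunked(urls, 4))
# Each batch can then be scanned separately, with results written out
# before the next batch starts, keeping memory usage bounded.
```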

Overall, it's difficult to say exactly what might be causing the issue without more information, but hopefully these suggestions will help you to optimize the tool's performance and improve the output speed.

It looks like you may have accidentally cut off the end of your command there. However, assuming you are using the dirsearch.py tool to perform a directory search on a specific URL (-u option), there are a few things you can try to improve the performance:

1. Increase the number of threads: dirsearch.py limits the number of concurrent threads by default. You can raise this with the -t option, which specifies the number of threads to use. For example, -t 10 runs 10 threads simultaneously.

2. Use a smaller wordlist: The size of your wordlist has a significant impact on scanning speed. If your wordlist is very large, consider using a smaller one that is more focused on the target domain or application.

3. Use a faster internet connection: If you are scanning a remote target, your connection speed also affects scanning speed. Make sure you have a fast and stable connection to the target.

4. Use a more powerful computer: Depending on the size of your wordlist and the complexity of the target application, dirsearch.py may require significant computing power. Consider upgrading your hardware if you are experiencing performance issues.

5. Use the --no-color option: If you are running dirsearch.py in a terminal, rendering colored text can slow down the output. You can disable colored output with the --no-color option to speed up the scan.
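Putting those suggestions together, a command along these lines might help (the target URL and wordlist path are placeholders; check your dirsearch version's --help output to confirm the flags it supports):

```shell
python3 dirsearch.py -u https://target.example \
    -w /path/to/small-wordlist.txt \
    -t 10 \
    --no-color \
    -o results.txt
```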

Try this and see if it helps.
