too many open files? #983
Are there any zombie processes? If not, check the current ulimit and play around with it.
Hello, I have the same problem. After a few days I see this in the logs:

As soon as this happens, all requests fail until I reboot the container. I do not see any zombie processes.
I'm also having this problem.
I updated my open files limit with
Will give feedback in a few days if the problem happens again.
I still have the same problem: `Error: Error solving the challenge. [Errno 24] Too many open files`

No zombie processes:

```
root@prowlarr:~# ps -aux | grep Z
USER         PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
root      650961  0.0  0.1   3324  1536 pts/1    S+   16:34   0:00 grep Z
```

No limits hit:

```
root@prowlarr:~# ulimit -Hn
1048576
root@prowlarr:~# ulimit -Sn
1024
root@prowlarr:~# cat /proc/sys/fs/file-max
9223372036854775807
```

The only way to recover is to restart the service I created with:

Any idea how I can get more info to debug?
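As a side note, the same limits can be inspected (and the soft limit raised) from inside the Python process itself with the standard library; a minimal sketch, not FlareSolverr code:

```python
import resource

# Same numbers that `ulimit -Sn` / `ulimit -Hn` report for the shell.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")

# A process may raise its own soft limit up to the hard limit without
# root; this only buys time if descriptors are genuinely leaking.
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
```

Raising the limit does not fix a leak, but it can stretch the time to failure enough to observe what is accumulating.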
See if v3.3.11 makes a difference.
To fix, just add

in the file

This is because Python stores the process metadata even after it terminates.
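To illustrate the point about process metadata (a standalone sketch, not the actual FlareSolverr or undetected_chromedriver code): a child process that has exited keeps its kernel entry as a zombie until the parent reaps it with wait().

```python
import subprocess
import time

def proc_state(pid: int) -> str:
    # The field after the closing paren in /proc/<pid>/stat is the
    # process state; 'Z' means zombie (Linux-only).
    with open(f"/proc/{pid}/stat") as f:
        return f.read().rsplit(")", 1)[1].split()[0]

child = subprocess.Popen(["true"])   # exits almost immediately
time.sleep(0.5)                      # give it time to terminate

print("state before wait():", proc_state(child.pid))  # 'Z' - not yet reaped
child.wait()                         # reap it; the kernel entry is freed
print("return code:", child.returncode)
```

Until wait() (or poll()) runs, the exited child still counts against process-table resources, which is why forgetting to reap long-running helpers adds up over days of uptime.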
If those above can test and confirm this is working, I'll add it in the next release.
Which version can we install to test this fix?
Run from source: https://github.com/FlareSolverr/FlareSolverr#from-source-code. Or, if using Docker, edit the file in the container and restart.
For now I have installed 3.3.12, because I'm in a lightweight headless LXC container running the FlareSolverr binary distribution, and I don't want to pollute this container by installing Chrome and all its dependencies just to run the source version of FlareSolverr. I might try in the future in another container. Anyway, if you think the process should be closed at that point, why not do process.close()? Maybe you have a reason; I have not looked at the source code yet.
As I can't reproduce the issue, I can't test whether this resolves it.
I adapted the code as suggested. As this problem only occurs after a few days, I'll give you feedback when relevant (I made the change yesterday).
Easy way to reproduce: use the Windows version. Because of the 512 max-open-files limit on Windows, just send 515 requests.
It's the original undetected_chromedriver package, which rarely updates.
There are a few modifications, but yes, 99.9% of UC is untouched. |
I also changed the code as suggested, on Dec 22, on version 3.3.12. Normally the error occurs 4-5 days after container start and never reaches 1 week of uptime, so I will update you next week if it solves the issue.
Got the too-many-open-files error again, but the time to failure grew from 1 day to 2 days with roughly the same number of requests per day.
After 6 days, not a single failure :). It usually started failing after 2-3 days for me.
Hi team, do you know when the app will be updated? I have a Docker version and I don't quite understand how to change the parameter to overcome this problem ^^' Thanks!
@LaCartouche how is your instance of FlareSolverr working? Did you have any issues?
I just took a second look at my logs. Everything is working, BUT I think Yggtorrent disabled their captcha verification. From the logs: `2023-12-27 12:34:53 INFO Challenge not detected!` If there is no captcha to solve, I don't know whether undetected_chromedriver is used at all (I guess not?), which would explain why I have no issues anymore and invalidate my testing.
Any time the container is redeployed, this would need to be repeated. |
I still have the "too many open files" issue even with the suggested fix. |
Here's the log of my last test, with a container that was just restarted and the suggested fix applied: 335 requests until the "OSError: [Errno 24] Too many open files".
iDope, TorrentQQ, and Torrent[CORE] are all public indexers with Cloudflare challenges. You could add them, configure them with Sonarr/Radarr/whatever you're using, and see if the issue continues.
Added one with a challenge; I will keep you updated.
OK, now 6 days with this fix and still up, with a challenge solved every 30 min.
OK, the issue happened again today, but after 7 days compared to the usual 4-5 days, so around a 50% improvement.

Process Process-336:
OK, I ran lsof in the container and I'm flooded with maybe a thousand lines like this:

```
python 7 flaresolverr 11u IPv4 4261517 0t0 TCP flaresolverr:8191->10.10.7.4:40246 (CLOSE_WAIT)
```
Additional fix: in the file, change

```python
p = Popen([executable, *args], stdin=PIPE, stdout=PIPE, stderr=PIPE, **kwargs)
```

to

```python
p = Popen([executable, *args], stdin=None, stdout=None, stderr=None, **kwargs)
```

This has no impact on the workflow, but it reduces pipe usage, which is what leads to the too-many-open-files error.
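The effect of that change can be demonstrated in isolation; a minimal sketch (Linux, counting this process's descriptors via /proc, not FlareSolverr code):

```python
import os
import subprocess

def open_fd_count() -> int:
    # Count this process's open file descriptors via /proc (Linux).
    return len(os.listdir("/proc/self/fd"))

base = open_fd_count()

# With PIPE, the parent holds three extra descriptors per child
# (stdin write end, stdout/stderr read ends), and they stay open
# even after the child exits, until closed explicitly.
piped = subprocess.Popen(["true"], stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
piped.wait()
print("extra fds with PIPE:", open_fd_count() - base)  # 3, child already dead

for stream in (piped.stdin, piped.stdout, piped.stderr):
    stream.close()

# With None, the child simply inherits the parent's streams:
# no new descriptors are created in the parent at all.
plain = subprocess.Popen(["true"], stdin=None, stdout=None, stderr=None)
plain.wait()
print("extra fds with None:", open_fd_count() - base)  # 0
```

So with one browser launch per request and pipes that are never read or closed, the descriptor count grows by three per solve until the limit is hit.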
You know the drill people, test and let me know ;) |
Tried this new suggestion and started 2000 calls. I still have the same problem. Starting from a fresh container, I had around 50 open files at startup. After running my test, I'm stuck at 5197 open files.
My slow brain realized I had to switch to the python user... It's actually a variation of this line:

```
python 7 169 python flaresolverr 877u IPv4 4359744 0t0 TCP localhost:46236->localhost:54065 (CLOSE_WAIT)
```
OK, I found that the owner of all those files seems to be flaresolverr.py, and this is process 7. Hope it helps to narrow down the issue.
OK, that cut 1/3 of those files and got rid of these:

```
python 7 flaresolverr 13u IPv4 7465067 0t0 TCP localhost:42260->localhost:48683 (CLOSE_WAIT)
```

So now I only have the remaining ones. I think it's killing the detached browser from the outside that creates these.
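For anyone narrowing this down without lsof in the container, the same information can be read from /proc directly; a small sketch (the helper name is mine, not part of FlareSolverr):

```python
import os

def open_fd_targets(pid: int) -> list[str]:
    # Resolve each descriptor in /proc/<pid>/fd to its target:
    # a file path, "socket:[inode]", "pipe:[inode]", etc. (Linux).
    fd_dir = f"/proc/{pid}/fd"
    targets = []
    for fd in os.listdir(fd_dir):
        try:
            targets.append(os.readlink(os.path.join(fd_dir, fd)))
        except FileNotFoundError:
            pass  # fd closed between listdir() and readlink()
    return targets

targets = open_fd_targets(os.getpid())
sockets = [t for t in targets if t.startswith("socket:")]
print(f"{len(targets)} descriptors, {len(sockets)} sockets")
```

The inode numbers inside `socket:[...]` entries can then be cross-referenced with /proc/net/tcp, where connection state 08 is CLOSE_WAIT, to confirm which sockets are stuck.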
Please test proposed fix - ultrafunkamsterdam/undetected-chromedriver#1812. |
It works. I made the change, ran a bunch of solving tests, and I'm not spammed by open files anymore. Thanks!
If someone else can confirm this as well (@LaCartouche maybe?), then I'll add it. |
I can confirm: after modifying the __init__.py file with the two lines suggested, I no longer have any "too many open files" errors. Many thanks.
Have you checked our README?
Have you followed our Troubleshooting?
Is there already an issue for your problem?
Have you checked the discussions?
Environment
Description
After a few days of functioning correctly, I get the error "internal server error" with "Too many open files".
Logged Error Messages
Screenshots
No response