During batch feed, OSError: [Errno 24] Too many open files #389
Comments
This seems to be an OS limitation. I was once running a script on an AWS EC2 instance and had to increase the open-file soft limit (via `ulimit`). If that does not work, try reducing the number of async connections via the `connections` parameter.
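As a sketch of the first suggestion: the process's open-file soft limit can be inspected and raised from within Python using the standard-library `resource` module (Unix-only; this is my illustration, not part of the original comment).

```python
import resource

# Inspect the current open-file limits for this process (Unix-only).
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")

# Raise the soft limit toward the hard limit. Exceeding the hard
# limit requires elevated privileges, so we stay within it; some
# platforms reject certain values, hence the guard.
try:
    new_soft = min(4096, hard) if hard != resource.RLIM_INFINITY else 4096
    resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
except (ValueError, OSError):
    pass  # leave the limit unchanged if the platform refuses
```

Raising the soft limit only works up to the hard limit; beyond that, the shell-level `ulimit -n` (or root privileges) is needed.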
I didn't change ulimit, but tried reducing the number of connections like you suggested and it worked! The default number of connections (100) was fine for a local Docker deployment. Had to reduce it to ingest the same data into Vespa Cloud. Thanks for your help.
Glad it worked. Did you check what the highest value that worked was? I might consider changing the default.
Hi @thigm85, connections set to 100 and 50 didn't work for me, but setting it to 20 worked. I don't know if it's the maximum value that would work, though. Cheers!
Thanks @neo-anderson
I tried
@lesters, should we put this in the docs somewhere and/or change the default?
Yes, great! |
Hi 👋
When I try to ingest data into Vespa Cloud, I get this error -
OSError: [Errno 24] Too many open files
When I select only the first few documents in my dataset, the feed works. If I use the whole dataset, I get that error. I don't see a way to reset the connections or close the files, so pyvespa won't let me upload any more data unless I quit the Python session and start over. Synchronous batch feed works, but it is too slow for my use case.
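One possible workaround sketch (this is a hypothetical helper of mine, not the original code from the issue): feed the dataset in smaller chunks through whatever feed callable is in use, so each async feed only opens a bounded number of sockets at a time.

```python
from typing import Callable, List

def feed_in_chunks(docs: List[dict],
                   feed_fn: Callable[[List[dict]], None],
                   chunk_size: int = 1000) -> int:
    """Feed `docs` in fixed-size chunks so each feed call handles a
    bounded slice of the dataset; returns the number of chunks fed."""
    chunks = 0
    for start in range(0, len(docs), chunk_size):
        feed_fn(docs[start:start + chunk_size])
        chunks += 1
    return chunks

# Hypothetical pyvespa usage (names assumed, not verified here), with the
# reduced connection count from this thread:
#   feed_in_chunks(all_docs,
#                  lambda chunk: app.feed_batch(chunk, asynchronous=True,
#                                               connections=20))
```

Combined with a lower `connections` value, this keeps the number of simultaneously open descriptors well under the OS soft limit.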
Code: