
Raise nofile ulimit for web/proxito containers #161

Closed
wants to merge 2 commits

Conversation

agjohnson
Contributor

I'm hitting "too many open files" rather consistently (see #160 for a first try at resolving this). I am getting some inconsistent results still, so not sure if this solves the problem or not.

I was just able to start the development env without altering the reload mechanism though, so perhaps this just needs to be bumped up a bit.

I'm also not opposed to changing the reload mechanism too though.

@humitos
Member

I don't have a strong opinion here. I've never hit this issue using the regular Docker environment or with the new templates 🤷🏼

It seems it's still uncertain why this is happening and how to reproduce it. So, I also don't know what the best solution is here: increasing this limit or using nodemon for reloads.

@agjohnson
Contributor Author

This one should theoretically have no effect for others; it just raises the limit on the number of open files. I'd probably start here, and perhaps others should try this change out too.

Replacing the reload daemon is a heavier change, and I can't explain why it helped. It might be a nicer overall experience (all the services would restart using the same logic), but it doesn't seem as important to me.
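For context, a per-container nofile bump like the one in this PR can be expressed either on the Docker CLI or in a compose file. The values and service name below are illustrative examples, not the ones from this change:

```shell
# One-off: raise the open-file limit for a single container run.
# 64000:64000 (soft:hard) is an arbitrary example value.
docker run --ulimit nofile=64000:64000 ubuntu:22.04 sh -c 'ulimit -n'

# Equivalent docker-compose syntax, per service ("web" is a placeholder):
#   services:
#     web:
#       ulimits:
#         nofile:
#           soft: 64000
#           hard: 64000
```

The per-container limit is capped by the Docker daemon's own nofile limit, so raising it here only helps if the daemon's limit is already high enough.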

@agjohnson
Contributor Author

This doesn't seem to have solved the issue for me either, I'm still getting "too many open files".

@stsewd
Member

stsewd commented Dec 20, 2022

I have had this problem in the past; I always just raise the limit at the OS level with a command from Stack Overflow :P

@agjohnson
Contributor Author

@stsewd do you remember the command?

It's odd that this PR doesn't fix the issue for me. As far as I know, I am nowhere near the system limit:

% cat /proc/1/limits | grep "open files"
Max open files            1073741816           1073741816           files     
% sysctl fs.file-max
fs.file-max = 9223372036854775807
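One caveat worth checking: /proc/1/limits here is the host's init process, while EMFILE is raised against the soft limit of the process that actually fails, which can be much lower. A quick way to inspect that (plain shell, assuming only a Linux /proc):

```shell
# The soft limit is what EMFILE is checked against; the hard limit is the
# ceiling an unprivileged process can raise its soft limit to.
ulimit -Sn    # soft nofile limit for this shell
ulimit -Hn    # hard nofile limit

# File descriptors currently held by this shell ($$ is its PID):
ls /proc/$$/fd | wc -l
```

Running the same commands via `docker exec` inside the web/proxito containers would show the limits those processes actually get.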

@stsewd
Member

stsewd commented Dec 20, 2022

@agjohnson maybe this is a different error, but I remember using https://unix.stackexchange.com/questions/13751/kernel-inotify-watch-limit-reached#13757 to fix my problem.
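The fix in that Stack Exchange answer raises the kernel's inotify watch limit, which produces a different failure (ENOSPC from inotify_add_watch, "watch limit reached") than EMFILE, matching the caveat above. A sketch of that workaround, with a commonly suggested value rather than one taken from this thread:

```shell
# Raise the inotify watch limit for the current boot (needs root).
sudo sysctl fs.inotify.max_user_watches=524288

# Persist the setting across reboots:
echo 'fs.inotify.max_user_watches=524288' | sudo tee /etc/sysctl.d/90-inotify.conf
sudo sysctl --system
```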

@agjohnson
Contributor Author

Ah, interesting. That seems related, but it's not quite the error I'm seeing.

@agjohnson
Contributor Author

With nodemon I've seen these errors less, but I was still encountering a warning as the webpack container starts (repeated a few hundred times):

webpack_1   | npm WARN tar TAR_ENTRY_ERROR EMFILE: too many open files, close

I can reproduce this after a fresh boot, even without anything like pipewire holding a lot of open files. At least the container doesn't crash completely, so this isn't blocking me.

I'm not using any custom system configuration: Docker is the distribution default, there are no sysctl fixes applied, and the only change I've kept is raising the default limit in /etc/security/limits.conf.
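For anyone wanting to replicate that host-level workaround, a limits.conf fragment might look like the following. The value is illustrative, not the one used above, and the change only takes effect for new login sessions:

```shell
# /etc/security/limits.conf -- raise the per-user nofile limits.
# 65535 is an example value.
#   <domain>  <type>  <item>   <value>
#   *         soft    nofile   65535
#   *         hard    nofile   65535

# Verify after logging back in:
ulimit -Sn   # soft limit
ulimit -Hn   # hard limit
```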

Duplicating this fix in the webpack docker configuration removed these warnings for me, and the container started cleanly. So, I think this doesn't completely fix the issue, but does seem to help.

@humitos
Member

humitos commented Sep 26, 2023

I'm closing this PR since we haven't had this issue anymore. Feel free to re-open, tho.

@humitos humitos closed this Sep 26, 2023