Closed
Labels: bug, p1-important, research
Description
Version information
- DVC version: 0.58.1
- Platform: MacOS 10.14.6
- Method of installation: pip within a conda environment
Description
When pushing a directory of ~100 DVC-tracked files to S3, the dvc process fails with an Errno 24 ("too many open files") error.
It looks like DVC is trying to open more files than the OS allows. Checking the open file handles for the dvc process:
$ lsof -p $DVC_PID | wc -l
412
Looking at the OS limits, a process is limited to 256 open files:
$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) unlimited
open files (-n) 256
pipe size (512 bytes, -p) 1
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 4256
virtual memory (kbytes, -v) unlimited
A workaround is to raise the per-process open-file limit to a larger number (say 4096) by running something like ulimit -n 4096, but I wonder if the ideal solution is for DVC to work within the OS-configured limits by default?
Edit: Updated wording of workaround
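One way DVC could stay within the limit is to query the soft RLIMIT_NOFILE value and bound upload concurrency accordingly. The sketch below is purely illustrative (the headroom of 100, the worker formula, and the upload_one helper are my own assumptions, not DVC's actual code):

```python
import os
import resource
import tempfile
from concurrent.futures import ThreadPoolExecutor

# Query the soft limit on open file descriptors (what `ulimit -n` reports).
soft_limit, _hard_limit = resource.getrlimit(resource.RLIMIT_NOFILE)

# Leave headroom for descriptors the process already holds (sockets, logs, ...).
# The margin of 100 and the divisor are illustrative heuristics.
max_workers = max(1, min(16, (soft_limit - 100) // 4))

def upload_one(path):
    """Hypothetical stand-in for uploading one cache file to S3."""
    with open(path, "rb") as fobj:
        return len(fobj.read())

# Demo: push a handful of temporary files through the bounded pool,
# so at most max_workers files are open for upload at any one time.
paths = []
for _ in range(5):
    fd, path = tempfile.mkstemp()
    os.write(fd, b"x" * 10)
    os.close(fd)
    paths.append(path)

with ThreadPoolExecutor(max_workers=max_workers) as pool:
    sizes = list(pool.map(upload_one, paths))

for path in paths:
    os.remove(path)

print(sizes)
```

Bounding the pool this way avoids the failure mode above without requiring users to raise ulimit themselves.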