
"Too many open files" exception when running from Docker #18022

Closed
fauria opened this issue Apr 27, 2016 · 8 comments

Comments


fauria commented Apr 27, 2016

Elasticsearch version: 2.3.2 (Dockerfile elasticsearch:latest)

JVM version: openjdk version "1.8.0_72-internal"

OS version: Docker image: https://hub.docker.com/_/elasticsearch/

Description of the problem including expected versus actual behavior:
Expected behavior: An elasticsearch docker container successfully started.
Actual behavior: An elasticsearch docker container fails to start.

Steps to reproduce:

  1. Run docker pull elasticsearch
  2. Run docker run -i -t --rm -v /srv/elasticsearch/data:/usr/share/elasticsearch/data -v /srv/elasticsearch/config:/usr/share/elasticsearch/config elasticsearch
  3. A java.nio.file.FileSystemException is raised with message "Too many open files".

Provide logs (if relevant):
http://pastebin.com/raw/kf0LDGxD

@jasontedor (Member) commented:

The problem is that your limit on open file descriptors is too low. There's even a warning in your logs:

[2016-04-27 19:08:27,576][WARN ][env ] [Box IV] max file descriptors [4096] for elasticsearch process likely too low, consider increasing to at least [65536]


fauria commented Apr 28, 2016

I'm running Elasticsearch from the official Docker image: https://hub.docker.com/_/elasticsearch/

On my host environment, both hard and soft limits are greater than 65536:

ulimit -Hn && ulimit -Sn

98304
98304

However inside the container:

docker run -i -t --rm elasticsearch bash
ulimit -Hn && ulimit -Sn

4096
1024
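
These per-container defaults come from the Docker daemon, not the host shell. As an aside (an assumption, not something suggested in this thread), a daemon-level way to raise them for every container is the default-ulimits setting in /etc/docker/daemon.json; the values below follow the 65536 recommendation from the warning log:

```json
{
  "default-ulimits": {
    "nofile": { "Name": "nofile", "Soft": 65536, "Hard": 65536 }
  }
}
```

The daemon must be restarted for this to take effect, and a per-container --ulimit flag still overrides it.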

@jasontedor (Member) commented:

To be clear, the "official" Docker image on Docker Hub is not affiliated with Elastic.


fauria commented Apr 28, 2016

I would never have thought that... Already submitted as docker-library/elasticsearch#102, thank you!


fauria commented Apr 29, 2016

Fixed!

If anyone has the same problem, try running the container with the --ulimit nofile=98304:98304 command line option, i.e.:

docker run -d --ulimit nofile=98304:98304 --name elasticsearch ...
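
The soft:hard pair in --ulimit nofile=98304:98304 maps onto the same soft and hard limits queried with ulimit above. A quick local sketch of that semantics (no Docker needed; the 1024 value is just an illustration):

```shell
# Sketch: --ulimit nofile=SOFT:HARD mirrors the shell's soft/hard limits.
# Lower the soft limit in a subshell and confirm; the hard limit is the
# ceiling that the soft value may not exceed.
bash -c '
  ulimit -Sn 1024                # set the soft limit for this shell only
  echo "soft=$(ulimit -Sn)"      # -> soft=1024
  echo "hard=$(ulimit -Hn)"      # ceiling (host-dependent)
'
```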


hcm007 commented Nov 1, 2017

There is still the same problem when I use docker stack deploy -c in swarm, @fauria

@ralyodio commented:

How do I do this with docker-compose?

@eugenweissbart commented:

Hi @chovy, I know it's probably too late, but the official docs on docker-compose files say it's pretty straightforward:

ulimits:
  nofile:
    soft: 98304
    hard: 98304

This should be done in your desired service's section.
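
Putting the snippet in context, a minimal (hypothetical) docker-compose.yml applying the limits from this thread might look like:

```yaml
services:
  elasticsearch:
    image: elasticsearch
    ulimits:
      nofile:          # max open file descriptors
        soft: 98304
        hard: 98304
```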


5 participants