This repository has been archived by the owner on Sep 17, 2019. It is now read-only.

Elasticsearch container will crash under default Docker for Mac configuration #6

Closed
dustinrue opened this issue Mar 2, 2017 · 16 comments


@dustinrue (Contributor) commented Mar 2, 2017

The default Docker for Mac (and presumably Windows as well) configuration limits Docker to 2GB of memory. The default heap size for Elasticsearch is also 2GB, which means the Elasticsearch container will immediately exit with a potentially confusing "error 137" (it's killed for running out of memory) as soon as it is interacted with, particularly via Cerebro.

The suggestion is to either note in the readme that you must configure Docker for Mac/Windows to allow more than 2GB of memory, or to limit the Elasticsearch heap size with:

environment:
  ES_JAVA_OPTS: "-Xms750m -Xmx750m"

Or add it to the existing config file.
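For reference, a minimal sketch of what that could look like in a compose file. The service name, image tag, and port mapping here are illustrative assumptions, not taken from this repo's actual config:

```yaml
# Illustrative sketch; image tag and ports are assumptions, not this repo's config.
version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:5.2.2
    environment:
      # Cap the JVM heap well below Docker's total memory allowance
      # so the container isn't OOM-killed (exit code 137).
      ES_JAVA_OPTS: "-Xms750m -Xmx750m"
    ports:
      - "9200:9200"
```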

@khorolets

Hi there! I can't confirm; I've just run this compose file on two Macs with Docker limited to 1GB of RAM and haven't seen the mentioned error.

@dustinrue (Contributor, Author) commented Mar 9, 2017

When I allow 1GB of memory for Docker I get a crash loop:

[screenshot: Elasticsearch container in a crash loop, 2017-03-09]

I'm using Docker 17.03.0-ce-mac2 (15657).
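In case it's useful, a quick way to confirm the 137 really is an OOM kill is to ask Docker for the container state (the container name here is illustrative):

```sh
# 137 = 128 + SIGKILL(9); OOMKilled reports whether the kernel OOM killer fired.
docker inspect --format '{{.State.OOMKilled}} exit={{.State.ExitCode}}' elasticsearch
```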

@khorolets

I'll provide screenshots tomorrow morning, in about 12 hours. BTW, I'm not arguing, just saying that I can't reproduce it on two machines. :)

@dustinrue (Contributor, Author)

Completely understand. What we'll find is that one of us has a difference in our setup that will help find the true cause.

@khorolets

As I promised: the same config on the second machine. This one is on Sierra and the other one is on El Capitan. I don't think the issue is about memory.

[screenshot: wp-docker_macos]

@pjrola commented Nov 8, 2017

This post helped a ton. I was running some other things on Docker, like Jenkins and MySQL, in combination with Elasticsearch and Kibana; pretty sure I hit the limit. I upped my memory to 4GB and have no issues now. Thanks!

@123avi commented Jan 31, 2018

@pjrola thank you so much, that really saved me :) I also updated the memory on the Docker engine (Docker menu -> Preferences) and that did the trick!
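For anyone wanting to verify the new limit actually took effect, `docker info` reports the total memory available to the Docker VM (assuming a standard Docker for Mac/Windows install):

```sh
# Prints the total memory (in bytes) available to the Docker VM.
docker info --format '{{.MemTotal}}'
```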

@kiranreddyd

Thanks a bunch! Helped me save so much time!

@dnuttle commented Jan 24, 2019

I have seen an issue where Elasticsearch crashes when I start Logstash. I saw an OutOfMemoryError once and adjusted the heap size (1536m) and the Docker limit (3g). Now when I start Logstash, Elasticsearch just abruptly dies; there is nothing in the log at all. Oddly, I've gotten it to work just a few times (the Logstash conf file is very simple; it just sends stdin to Elasticsearch), so I can't fathom what's happening. I also see that Logstash hogs a lot of CPU while it's starting.
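If the two JVMs are competing for the same Docker allowance, one thing worth trying is capping both heaps explicitly so they fit inside the limit together. A sketch using the 3g/1536m numbers above; the image tags are illustrative, and `LS_JAVA_OPTS` is the Logstash image's counterpart to `ES_JAVA_OPTS`:

```yaml
# Sketch: keep both JVM heaps comfortably inside a 3g Docker allowance.
version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.5.4
    environment:
      ES_JAVA_OPTS: "-Xms1536m -Xmx1536m"
  logstash:
    image: docker.elastic.co/logstash/logstash:6.5.4
    environment:
      LS_JAVA_OPTS: "-Xms512m -Xmx512m"
    depends_on:
      - elasticsearch
```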

@Javetz commented Feb 13, 2019

Update the memory to 4GB in the Advanced settings tab.

@niemyjski

This has not worked for me. I'm constantly getting crashes.

@kadnan commented May 14, 2019

This post helped me too. I am running Cassandra and the other node kept exiting. I increased the RAM and it worked.

@AlekKras

This post helped me a lot with a completely different project. I didn't know about the memory problem. Saved me a ton of time! Thank you, @dustinrue !

@philliphartin

I've just been troubleshooting why an Elasticsearch based project wouldn't work and this was exactly the issue! Thanks for highlighting this.

@mystredesign commented Aug 19, 2019

I upped my memory to 7.5GB (of 16GB) and still get the error... Does it really need this much memory?
I don't think this is resolved...
Update: I changed my swap space to 4GB and memory to 4GB and it seems to be working now... Still

@dustinrue (Contributor, Author) commented Aug 19, 2019

@mystredesign the point is to reduce the amount of memory the JVM is allowed to use relative to the amount of memory Docker is allowed to use. If you reduce the JVM's allowance and still have issues, then it'll come down to the number of docs you are trying to index.
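A quick way to sanity-check that relationship is to compare the heap the JVM actually got against the ceiling Docker enforces. Both commands use standard APIs; the container name is illustrative:

```sh
# Heap ceiling the Elasticsearch JVM is actually running with:
curl -s 'localhost:9200/_cat/nodes?v&h=name,heap.max,heap.percent'
# Memory limit Docker enforces on the container (0 means unlimited):
docker inspect --format '{{.HostConfig.Memory}}' elasticsearch
```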
