Multi-host cluster, Apache Spark Cluster and Production Ready #29
Comments
Thanks @kiwenlau. Regarding your first response, I think setting up Docker Swarm will work; however, my team and I have yet to test this. I will update you when this is done. We will also test Docker volumes and provide our feedback soon. Thanks again.
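For anyone else trying the multi-host route, here is a minimal sketch of wiring the containers together with a Docker Swarm overlay network plus a named volume for HDFS data. The image name `kiwenlau/hadoop:1.0`, the container names, the manager IP, and the data path are assumptions, not something verified against this repository's scripts.

```bash
# On the host chosen as the Swarm manager (10.0.0.1 is a placeholder IP)
docker swarm init --advertise-addr 10.0.0.1

# On each of the other physical hosts, join with the token printed by `docker swarm init`
docker swarm join --token <WORKER-TOKEN> 10.0.0.1:2377

# Create an attachable overlay network so plain `docker run` containers
# on different hosts can resolve each other by name
docker network create --driver overlay --attachable hadoop-net

# Create a named volume so HDFS data survives container restarts
docker volume create hadoop-data

# Start the master and slave containers on the overlay network
# (image name and data path are assumptions; adapt to your build)
docker run -itd --net=hadoop-net --name hadoop-master \
    -v hadoop-data:/root/hdfs kiwenlau/hadoop:1.0
docker run -itd --net=hadoop-net --name hadoop-slave1 \
    -v hadoop-data:/root/hdfs kiwenlau/hadoop:1.0
```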
Any updates, guys? I would like to use HDFS on a multi-host cluster, i.e., Docker running on each host machine with the containers sharing data among themselves...
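As a rough sketch of what multi-host HDFS can look like once an overlay network like the one above is in place: start HDFS from the master container and check that DataNodes on the other physical hosts have registered. The container name `hadoop-master` is a placeholder, and this assumes `HADOOP_HOME` is set in the container environment and that passwordless ssh between containers works across the overlay network.

```bash
# Inside the master container: format HDFS and start the daemons.
# start-dfs.sh will ssh into the slave containers listed in the
# slaves/workers file, so cross-host ssh must work.
docker exec -it hadoop-master bash -c \
    "hdfs namenode -format && \$HADOOP_HOME/sbin/start-dfs.sh"

# Confirm that DataNodes running on the other physical hosts registered
docker exec -it hadoop-master hdfs dfsadmin -report

# Write a file and read it back to verify data is visible cluster-wide
docker exec -it hadoop-master bash -c \
    "echo hello | hdfs dfs -put - /hello.txt && hdfs dfs -cat /hello.txt"
```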
@kiwenlau I don't know why the NodeManager only runs on the slaves with your Docker code. If I reuse parts of your code in my own Dockerfile, it doesn't work (the NodeManager does not start on the slaves)...
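A hedged checklist for the NodeManager problem, not a fix specific to this repository's scripts: the usual causes are the slaves (or workers) file not listing the slave hostnames, `yarn.resourcemanager.hostname` not resolving from the slaves, or `start-yarn.sh` failing to ssh into them. Something like the following inside the containers can narrow it down; paths assume a standard `$HADOOP_HOME` layout, and `hadoop-master` is a placeholder hostname.

```bash
# On a slave container: is the NodeManager JVM actually running?
jps | grep -i nodemanager

# On the master: does the slaves/workers file list every slave hostname?
cat $HADOOP_HOME/etc/hadoop/slaves    # named "workers" in Hadoop 3.x

# Can the slave resolve and reach the ResourceManager hostname
# configured as yarn.resourcemanager.hostname in yarn-site.xml?
grep -A1 resourcemanager.hostname $HADOOP_HOME/etc/hadoop/yarn-site.xml
ping -c1 hadoop-master

# If the daemon started and then died, the reason is usually in its log
tail -n 50 $HADOOP_HOME/logs/yarn-*-nodemanager-*.log
```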
I tried this out on my local machine and it was fantastic: a 5-node cluster. However, I also want to set up a Spark cluster on the Docker images, as well as a multi-host cluster for high availability.
E.g. have a 5-node cluster per physical host across 3 physical hosts and have them communicate with each other.
I would also like to know whether this image is production ready.
Thank you
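On the Spark question: since this image already runs YARN, the lighter-weight option is usually to run Spark on YARN rather than a separate standalone cluster. A minimal sketch, assuming a Spark distribution is unpacked at /usr/local/spark inside the master container and that `HADOOP_CONF_DIR` points at the Hadoop config (both assumptions, not something this image ships):

```bash
# Inside the master container, with Spark unpacked at /usr/local/spark
export SPARK_HOME=/usr/local/spark
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop   # so Spark finds YARN and HDFS

# Submit the bundled SparkPi example to YARN; the executors run as YARN
# containers on the NodeManagers across the slave containers
$SPARK_HOME/bin/spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --class org.apache.spark.examples.SparkPi \
    $SPARK_HOME/examples/jars/spark-examples_*.jar 100
```

Running Spark on YARN this way reuses the existing HDFS and YARN containers, so the multi-host story stays identical to the Hadoop one discussed above rather than needing a second cluster to operate.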