Unable to start new VPC cluster (0.95.1) #372
I'm experiencing an issue on 0.95.1 (also happened on 0.95). This does not occur on 0.94.3.
Then it just hangs in that state for a long time.
After I ^C out of this, `listclusters` shows the cluster as started, with only the master node up and no slave nodes.
Here's the config.
Attempting `sshmaster` also hangs, then eventually comes back with:
System Log from AWS Console again shows the following:
Is there anything else I can do to help figure this out?
Quick update after troubleshooting with Justin on IRC for others who might have this issue:
One workaround is to stop assigning a public IP. The instance will then be reachable from within the VPC but not from outside. To do this, add PUBLIC_IPS=False to the config (there is a command-line equivalent as well).
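As a rough sketch, the workaround above would look something like this in the StarCluster config file (the template name `smallcluster` and the other settings shown are placeholders, not from this issue):

```ini
[cluster smallcluster]
KEYNAME = mykey
CLUSTER_SIZE = 2
NODE_IMAGE_ID = ami-xxxxxxxx
SUBNET_ID = subnet-xxxxxxxx
# Workaround: do not assign a public IP to the instances.
# They will only be reachable from inside the VPC.
PUBLIC_IPS = False
```

With this set, you would need to connect from a host inside the VPC (e.g. a bastion instance) rather than directly over the internet.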
See http://star.mit.edu/cluster/docs/latest/manual/configuration.html#using-the-virtual-private-cloud-vpc for details on VPC including StarCluster's defaults and command line options.