
0 datanode(s) running #10

Open
icissy opened this issue Aug 20, 2017 · 1 comment
icissy commented Aug 20, 2017

I ran docker-compose exec spark-master jar cv0f /code/spark-libs.jar -C /root/spark/jars/ . and got this error:
put: File /user/spark/share/lib/spark-libs.jar.COPYING could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
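For context, the "put:" error comes from the HDFS upload step that follows the jar packaging, roughly as sketched below. The spark-master service name and the paths are taken from the command and the error message above; the exact hdfs dfs invocation used by the repo is an assumption.

    # Package the Spark jars (command from the issue)
    docker-compose exec spark-master jar cv0f /code/spark-libs.jar -C /root/spark/jars/ .

    # Upload to HDFS -- this is the step that fails with "put:" when no datanode is running
    # (target path taken from the error message; the repo's actual command may differ)
    docker-compose exec spark-master hdfs dfs -mkdir -p /user/spark/share/lib
    docker-compose exec spark-master hdfs dfs -put /code/spark-libs.jar /user/spark/share/lib/spark-libs.jar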

ruoyu-chen (Owner) commented

When you see this error, the most likely cause is that HDFS did not start up properly. If this was the first time you brought up the cluster, you may have forgotten to format HDFS; if it was not the first run, restart HDFS. To find the exact cause, check the datanode logs.
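A minimal sketch of those checks, assuming the namenode runs in the spark-master container (as in the command above); the spark-worker1 service name is a hypothetical placeholder for whatever the compose file calls a datanode:

    # See how many datanodes the namenode currently knows about
    docker-compose exec spark-master hdfs dfsadmin -report

    # First run only: format HDFS before the datanodes start
    # (this erases any existing HDFS data)
    docker-compose exec spark-master hdfs namenode -format

    # Not a first run: restart the services so HDFS comes back up
    docker-compose restart

    # Inspect a datanode's log for the underlying cause
    # (replace spark-worker1 with your actual datanode service name)
    docker-compose logs spark-worker1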
