Doubts on Hadoop Support #38
We use HDFS from the Hadoop distribution, so you do not need to update your Hadoop.
… On 20 Feb 2019, at 1:26 PM, Minghao Liu ***@***.***> wrote:
I read the quick start introduction and it says that you are using Hadoop 2.9.2, which is a pretty recent version. Are there any version requirements for Hadoop (HBase, Hive…)? Also, does it support Apache Hadoop only? I currently have a cluster running CDH 5.7 (Hadoop 2.6); do I need to update my Hadoop or switch to vanilla Apache Hadoop?
Looking forward to your reply, thank you very much!
I am sorry we did not list the environment requirements, but if you check the build guide you will find the necessary dependencies.
Here is a simple list, which may not be complete:
# if you run distributed and use HDFS as shared storage
- python
- pip
- TensorFlow
- HDFS library (needed when you use HDFS)
- JDK
- ZooKeeper
# if you run in local mode
- python
- pip
- TensorFlow
# if you want to build from the source code, you may need build tools too; just follow the installation guide
Euler is developed in C++11, so make sure your C++ compiler supports C++11 features.
Thanks.
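The Python side of the list above can be checked with a short script. This is just a hypothetical sketch (the `check_deps` helper and the package names are my own illustration, not part of Euler):

```python
import importlib.util

def check_deps(names):
    """Return the subset of package names that are not importable."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Local-mode requirements from the list above; "tensorflow" assumes
# the standard pip package name.
missing = check_deps(["pip", "tensorflow"])
if missing:
    print("missing packages:", ", ".join(missing))
else:
    print("all Python dependencies found")
```

The JDK, ZooKeeper, and HDFS library are system-level dependencies and would need to be verified separately (e.g., by checking `java -version` on the command line).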
… On 20 Feb 2019, at 1:56 PM, Minghao Liu ***@***.***> wrote:
Thanks a lot! By the way, is there a document that lists all the minimum environment requirements?