
rhdfs fails after Hadoop upgrade #95

Closed
st0ut717 opened this Issue · 5 comments

3 participants

@st0ut717

Hi,

I had rhdfs and rmr working with Hadoop 0.20.203. However, after reading that RHadoop requires 1.0.2, I upgraded my cluster, and now rhdfs fails with the following error.

library(rhdfs)
Loading required package: rJava

HADOOP_HOME=/usr/sbin/
HADOOP_CONF=/etc/hadoop/
Error : .onLoad failed in loadNamespace() for 'rhdfs', details:
call: .jnew("org/apache/hadoop/conf/Configuration")
error: java.lang.ClassNotFoundException
Error: Package/namespace load failed for 'rhdfs'

The Hadoop cluster is up; the PI Estimator and wordcount examples run fine.
Running R CMD javareconf makes no difference.
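For anyone hitting the same error: rhdfs locates the Hadoop client through environment variables that must be set before the package loads, so the JVM can find the Hadoop jars when it instantiates org.apache.hadoop.conf.Configuration. A minimal sketch in R, assuming a conventional /usr/local/hadoop layout (the paths below are placeholders, not the poster's actual install):

```r
# Sketch only: point rhdfs at the hadoop launcher *script*, not at a
# directory. Substitute the real paths from your own installation.
Sys.setenv(HADOOP_CMD  = "/usr/local/hadoop/bin/hadoop")   # the executable
Sys.setenv(HADOOP_HOME = "/usr/local/hadoop")              # install root
Sys.setenv(HADOOP_CONF = "/usr/local/hadoop/conf")         # config dir

library(rhdfs)   # loads rJava and the Hadoop classes
hdfs.init()      # must be called once before any hdfs.* function
```

If HADOOP_CMD points anywhere other than a working hadoop script, the classpath cannot be built and loading the package fails with a ClassNotFoundException like the one above.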

@piccolbo
Owner

Did you really install hadoop under /usr/sbin? Could you enlighten us as to the rationale of such an innovative directory layout? (read: that can not possibly be correct)

@st0ut717

Thank you, Antonio. However, the 1.0.2 install was done via RPM, so the paths were set during the rpm -ivf command. Since wordcount and the PI Estimator work with absolute paths specified as ~bin/hadoop jar $PATH_TO_JARS foo foo foo, Hadoop itself is functional.

If the config has to change to meet rmr and rhdfs requirements, that's fine. I am still in the prototype phase of my project.

@piccolbo piccolbo closed this
@mindcrusher11

I am also facing the same issue:
sh: 1: /usr/local/hadoop/bin: Permission denied
Error in .jnew("org/apache/hadoop/conf/Configuration") :
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.conf.Configuration
In addition: Warning message:
running command '/usr/local/hadoop/bin classpath' had status 126
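The status 126 in that warning is the shell's "found but not executable" code: rhdfs tried to run `$HADOOP_CMD classpath`, but HADOOP_CMD was set to the bin/ directory instead of the hadoop script inside it. A hedged sketch of the fix (paths assumed, adjust to your install):

```shell
# Wrong: a directory cannot be executed, so "classpath" fails with 126
# and rhdfs gets no jars to put on the Java classpath.
#   export HADOOP_CMD=/usr/local/hadoop/bin

# Right: point HADOOP_CMD at the launcher script itself.
export HADOOP_CMD=/usr/local/hadoop/bin/hadoop

# Sanity check: this is the exact command rhdfs runs at load time;
# it should print a long colon-separated list of Hadoop jars.
"$HADOOP_CMD" classpath
```

With HADOOP_CMD corrected (and exported before starting R), the NoClassDefFoundError on org.apache.hadoop.conf.Configuration should go away.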
