
rhdfs fails after Hadoop upgrade #95

Closed
st0ut717 opened this Issue May 7, 2012 · 5 comments


st0ut717 commented May 7, 2012

Hi,

I had rhdfs and rmr working with Hadoop 0.20.203. However, after reading that RHadoop required 1.0.2, I upgraded my cluster, and now rhdfs fails with the following error.

library(rhdfs)
Loading required package: rJava

HADOOP_HOME=/usr/sbin/
HADOOP_CONF=/etc/hadoop/
Error : .onLoad failed in loadNamespace() for 'rhdfs', details:
call: .jnew("org/apache/hadoop/conf/Configuration")
error: java.lang.ClassNotFoundException
Error: Package/namespace load failed for 'rhdfs'

The Hadoop cluster is up; the Pi Estimator and wordcount examples run fine.
Running R CMD javareconf changes nothing.
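A quick way to sanity-check the two paths printed above before loading the package (a sketch; the directory names and the `hadoop-core-*.jar` pattern are assumptions based on a typical Hadoop 1.x layout, adjust to yours):

```shell
#!/bin/sh
# Check that the directories rhdfs reads exist and that a Hadoop core jar
# (which ships org.apache.hadoop.conf.Configuration) is present.
check_hadoop_env() {
  home="$1"; conf="$2"
  [ -d "$home" ] || { echo "HADOOP_HOME '$home' is not a directory"; return 1; }
  [ -d "$conf" ] || { echo "HADOOP_CONF '$conf' is not a directory"; return 1; }
  # Without a hadoop-core jar here, rJava cannot resolve Configuration.
  if ! ls "$home"/hadoop-core-*.jar >/dev/null 2>&1; then
    echo "no hadoop-core jar under '$home'"
    return 1
  fi
  echo "paths look plausible"
}
```

Running it with the values from the error output (`/usr/sbin/` and `/etc/hadoop/`) would show whether the jar is actually where rhdfs is told to look.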

Collaborator

piccolbo commented May 7, 2012

Hi,
have you checked issues #79, #63 and #22 to see if any of that information
applies here? Thanks

Antonio


Collaborator

piccolbo commented May 7, 2012

Did you really install Hadoop under /usr/sbin? Could you enlighten us as to the rationale of such an innovative directory layout? (read: that cannot possibly be correct)

st0ut717 commented May 7, 2012

Thank you Antonio. However, the 1.0.2 install was executed via RPM, so the paths were specified during the rpm -ivf command. Since wordcount and the Pi Estimator work with absolute paths specified in the ~bin/hadoop jar $PATH_TO_JARS foo foo foo invocation, Hadoop itself is functional.

If the config has to change to meet rmr and rhdfs requirements, that's fine. I am still in the prototype phase of my project.

Collaborator

piccolbo commented May 8, 2012

Can you find src/core/org/apache/hadoop/conf/Configuration.java under HADOOP_HOME? If not, that's not the true Hadoop home. I suspect you set HADOOP_HOME to the path of the hadoop executable; those are two different things. Or it could be an incomplete installation, I am not sure yet. We need to find that class first, then point rhdfs to the right spot, and then it should work.
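One way to hunt for that class is to search for the jar that ships it (a sketch; the search root is up to you, and the `hadoop-core*.jar` name assumes a Hadoop 1.x install, where Configuration lives in hadoop-core-&lt;version&gt;.jar):

```shell
#!/bin/sh
# Print the first jar under the given tree whose name suggests it ships
# org.apache.hadoop.conf.Configuration (hadoop-core-<version>.jar on 1.x).
find_hadoop_core() {
  find "$1" -name 'hadoop-core*.jar' 2>/dev/null | head -n 1
}
# The directory containing that jar (with the lib/ jars beside it) is what
# HADOOP_HOME should point at -- not the hadoop launcher script.
```

Once the jar turns up, setting HADOOP_HOME to its directory and reloading rhdfs should get past the ClassNotFoundException.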


@piccolbo piccolbo closed this Jul 18, 2012

I am also facing the same issue:
sh: 1: /usr/local/hadoop/bin: Permission denied
Error in .jnew("org/apache/hadoop/conf/Configuration") :
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.conf.Configuration
In addition: Warning message:
running command '/usr/local/hadoop/bin classpath' had status 126
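The `status 126` in that warning means the shell was told to execute a directory: the command being run ends in `/usr/local/hadoop/bin`, a directory, where an executable (presumably `/usr/local/hadoop/bin/hadoop`) was expected. A small check for that misconfiguration (a sketch; the variable name HADOOP_CMD is taken from rhdfs's environment-variable convention):

```shell
#!/bin/sh
# Status 126 ("Permission denied" on a path that exists) usually means the
# configured command is a directory or a non-executable file.
check_hadoop_cmd() {
  if [ -d "$1" ]; then
    echo "HADOOP_CMD is a directory; point it at the hadoop binary inside it"
    return 1
  fi
  [ -x "$1" ] || { echo "HADOOP_CMD is not executable"; return 1; }
  echo "HADOOP_CMD looks usable"
}
```

If the binary really lives in that bin/ directory, appending `/hadoop` to the value should make the `classpath` subcommand run.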
