When trying to install the rhdfs package, I get an IOException from Java:
demos@workhorse:~$ /usr/bin/R CMD INSTALL rhdfs
Have you ever seen this? How can I fix it?
I am using R 2.14 and CDH3.
Is there some way I can get the full stack trace? The exception seems to be thrown from one of the classes in the JAR file, apparently inside the RJavaTools class. It looks like the invokeMethod function is throwing the exception, but I can't figure out how to resolve it.
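If rhdfs is going through rJava (which it does), a sketch like this should surface the full Java stack trace after the failing call; `.jgetEx` and `$printStackTrace()` are standard rJava, though whether the exception is still pending at that point depends on how rhdfs handles it:

```r
library(rJava)

# After the call that fails, ask rJava for the pending Java exception.
e <- .jgetEx(clear = TRUE)   # returns the Throwable as a jobjRef, or NULL if none is pending
if (!is.null(e)) {
  e$printStackTrace()        # prints the full Java stack trace to stderr
}
```

Running R from a terminal (rather than a GUI) makes it easier to see the stderr output.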
Hi David :)
Unfortunately, all I've been able to find is that it seems to be a classloader issue in RJavaTools.
I have HADOOP_HOME pointing to the root directory of the Hadoop distribution (where the conf, logs, etc. directories are), as is the standard operating procedure. I have HADOOP_CONF set to $HADOOP_HOME/conf. rhdfs seems to see that both are set.
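For reference, my setup before running the install looks roughly like this (the Hadoop path is illustrative; substitute wherever the distribution actually lives):

```shell
# Root of the Hadoop distribution -- path is an example, not my actual location
export HADOOP_HOME=/opt/hadoop-0.20.2-cdh3u1
export HADOOP_CONF=$HADOOP_HOME/conf

/usr/bin/R CMD INSTALL rhdfs
```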
I have hadoop-core-0.20.2-cdh3u1.jar and several other jars at the top level of $HADOOP_HOME, i.e. $HADOOP_HOME/hadoop-core-0.20.2-cdh3u1.jar.
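Since the symptom points at a classloader problem, one thing worth checking is what rJava itself actually has on its classpath; `.jclassPath()` is the standard rJava call for this. A minimal sketch (assuming a plain rJava session, independent of rhdfs):

```r
library(rJava)
.jinit()

# If the hadoop-core jar does not appear here, rhdfs's Java side
# would be running without it, which could explain the exception.
print(.jclassPath())
```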
Thanks. Yes, I already did that. But this is interesting...
On my work laptop (Ubuntu 11.10) it works fine using the exact same Hadoop (I copy the Hadoop directory from machine to machine so the config, version, etc. is identical).
This is just a stab in the dark, but it might be an incompatibility with Ubuntu 10.04 (about to be phased out) or with the version of Java I have on it. I did not try the command with the -e switch, though.
Not Hive, but HBase. Both machines have HBase though. When I get a chance I will check my version of the JRE on both machines. Very strange issue.
Any further resolution on this? I have the same issue on Debian Linux. I do have Hive installed.
I'm kind of at a loss as to what to try to move forward.
I think rJava and Hive have different requirements for JAVA_HOME. I ran into similar troubles.
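If the two JAVA_HOMEs really do disagree, the usual fix is to re-point R's Java configuration at the JVM you want and reinstall rJava; `R CMD javareconf` is the standard mechanism (the JDK path below is only an example):

```shell
# Point at the same JDK Hive uses -- path is illustrative
export JAVA_HOME=/usr/lib/jvm/java-6-sun

# Re-detect Java settings for R, then rebuild rJava against them
sudo R CMD javareconf
R -e 'install.packages("rJava")'
```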