rmr2 issue with hortonworks hadoop release 1.0.3.14 #144
Comments
This is already fixed in 2.0.1 and seems to be of little consequence.

-Antonio

On Fri, Oct 19, 2012 at 4:34 PM, meggenb notifications@github.com wrote:

> Thank you Antonio, we'll be using the current release until we get the 2.0.1 package. -martin eggenberger
I am currently running the rhadoop sample on a single node cluster on hortonworks release 1.0.3.14.
Both the to.dfs and mapreduce methods work well, but the from.dfs method throws an exception.
R Sample:
groups = rbinom(32, n = 50, prob = 0.4)
groups = to.dfs(groups)
result = mapreduce(input = groups, map = function(k,v) keyval(v, 1), reduce = function(k,vv) keyval(k, length(vv)))
from.dfs(result)
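For what it's worth, a minimal sketch of the same job against rmr2's local backend (which bypasses HDFS and Hadoop streaming entirely) can help confirm whether the failure is specific to the DFS layer. `rmr.options(backend = "local")` is the documented switch; exact behavior may vary by rmr/rmr2 version, so treat this as an illustrative sketch rather than a verified workaround for 1.0.3.14:

library(rmr2)

# Use the local backend so no HDFS or streaming jar is involved.
rmr.options(backend = "local")

groups <- rbinom(32, n = 50, prob = 0.4)
gdfs   <- to.dfs(groups)

# Count how many draws produced each value, as in the sample above.
result <- from.dfs(
  mapreduce(input  = gdfs,
            map    = function(k, v) keyval(v, 1),
            reduce = function(k, vv) keyval(k, length(vv))))

# If this prints the $key/$val lists while the hadoop backend fails,
# the problem is in the DFS/streaming path rather than the job itself.
print(result)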
Exception in thread "main" java.io.FileNotFoundException: File does not exist: 3
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:528)
at org.apache.hadoop.streaming.DumpTypedBytes.run(DumpTypedBytes.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.apache.hadoop.streaming.HadoopStreaming.main(HadoopStreaming.java:41)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
$key
[1] 7 8 9 10 11 12 13 14 15 16 17 18
$val
[1] 1 3 2 4 4 12 6 5 7 2 3 1
This may be related to having native compression working on Hadoop.
Thank you
-martin eggenberger