
MapReduce cannot read Swift Object #2

Open
smith1511 opened this issue Aug 22, 2013 · 0 comments
When executing the wordcount MapReduce job I get an exception (below) from the Swift FS component about the connection being closed.

I've verified that I can cat the file successfully:

hadoop fs -cat swift://mycontainer.hp/wordcount.txt20130813015900000

Any idea what I might be missing?
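For reference, this setup depends on the hadoop-openstack Swift driver being wired up in core-site.xml. A minimal sketch of the kind of configuration assumed here — the auth URL, tenant, and credentials are placeholders, not my actual values; only the service name "hp" matches the swift://mycontainer.hp URLs above:

```xml
<!-- core-site.xml sketch: Swift filesystem bindings (placeholder values) -->
<property>
  <name>fs.swift.impl</name>
  <value>org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem</value>
</property>
<property>
  <!-- Keystone endpoint for the "hp" service referenced in swift:// URLs -->
  <name>fs.swift.service.hp.auth.url</name>
  <value>https://AUTH_ENDPOINT/v2.0/tokens</value>
</property>
<property>
  <name>fs.swift.service.hp.tenant</name>
  <value>TENANT</value>
</property>
<property>
  <name>fs.swift.service.hp.username</name>
  <value>USERNAME</value>
</property>
<property>
  <name>fs.swift.service.hp.password</name>
  <value>PASSWORD</value>
</property>
```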

Thanks,
Christian

root@namenode:/opt/hadoop# ./bin/hadoop jar hadoop-examples-1.2.0.jar wordcount swift://mycontainer.hp/wordcount.txt20130813015900000 swift://mycontainer.hp/out-01

13/08/22 12:36:05 INFO input.FileInputFormat: Total input paths to process : 1
13/08/22 12:36:07 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/22 12:36:07 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/22 12:36:07 INFO mapred.JobClient: Running job: job_201308221224_0002
13/08/22 12:36:08 INFO mapred.JobClient: map 0% reduce 0%
13/08/22 12:36:34 INFO mapred.JobClient: Task Id : attempt_201308221224_0002_m_000000_0, Status : FAILED
java.io.IOException: Operation failed as connection is closed to https://region-a.geo-1.objects.hpcloudsvc.com/v1//mycontainer/wordcount.txt20130813015900000
at org.apache.hadoop.fs.swift.http.HttpInputStreamWithRelease.assumeNotReleased(HttpInputStreamWithRelease.java:146)
at org.apache.hadoop.fs.swift.http.HttpInputStreamWithRelease.read(HttpInputStreamWithRelease.java:180)
at org.apache.hadoop.fs.swift.snative.SwiftNativeInputStream.read(SwiftNativeInputStream.java:146)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
at java.io.DataInputStream.read(DataInputStream.java:83)
at org.apache.hadoop.util.LineReader.readDefaultLine(LineReader.java:205)
at org.apache.hadoop.util.LineReader.readLine(LineReader.java:169)
at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:139)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:531)
at org.apache.hadoop.mapreduce.MapContext.nextKeyValue(MapContext.java:67)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
