Hadoop's uber mode causes job failure #280
Comments
@AndreasHoermandinger Thanks - what version of es-hadoop and Pig are you using? Hadoop is 2.5.1, correct?
Yes, correct
@AndreasHoermandinger Hi, I've pushed a dev build with a potential fix to maven (check out the latest 2.0.2.BUILD-SNAPSHOT). Can you please try it out and report back?
@costin Yes, it works. Thank you!
Hadoop 2.5.x introduced a bug where the task attempt id is used in place of the task id. To work around this, the code searches first for the task attempt id and only then falls back to the task id. relates #280
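The fallback described in the commit message can be sketched roughly as follows. This is a simplified, hypothetical model (the class and key names below are illustrative, using a plain `Properties` object in place of a real Hadoop `Configuration`), not the actual es-hadoop code:

```java
import java.util.Properties;

// Simplified sketch of the workaround: on Hadoop 2.5.x the value stored
// for the task id can actually be a task *attempt* id, so the fix looks
// up the attempt id first and only falls back to the plain task id when
// no attempt id is present.
public class TaskIdResolver {
    static String resolveTaskId(Properties conf) {
        String attempt = conf.getProperty("mapreduce.task.attempt.id");
        if (attempt != null && !attempt.isEmpty()) {
            return attempt;  // prefer the attempt id when it exists
        }
        return conf.getProperty("mapreduce.task.id");  // fallback
    }
}
```

The point of the ordering is that a lookup keyed only on the task id fails on 2.5.x, while trying the attempt id first works on both old and new Hadoop versions.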
@costin I tried writing too, using the fixed build, and I get the same error again
@AndreasHoermandinger Can you post the entire stacktrace please? You mentioned the fix worked - can you indicate whether it was with reading or writing?
@costin Sorry, forgot to mention: it worked with reading (https://gist.github.com/AndreasHoermandinger/5a55f356a1c8480fb6e6)
@AndreasHoermandinger Looks like there was a code path that wasn't addressed by the previous fix. I've remedied this and pushed another 2.0.2 build - can you please try it out and report back?
@costin Now writing works too, thank you for the fast fixes
cheers!
Marking as closed
When mapreduce.job.ubertask.enable is set to true, applications that are executed in uber mode fail. There seems to be a problem with the task id:
Here are the full logs of the crashes in mapreduce and pig
https://gist.github.com/AndreasHoermandinger/a89ec7df4334fa2c98e0
As data I used the shakespeare.json file provided in the 10 minute kibana walkthrough: http://www.elasticsearch.org/guide/en/kibana/current/snippets/shakespeare.json
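For reference, the setting that triggers the failure is the standard MapReduce uber-mode switch. A minimal mapred-site.xml fragment that reproduces the configuration described above might look like this (only the property quoted in the issue is shown; placement in mapred-site.xml is the usual convention, not something stated in the report):

```xml
<!-- mapred-site.xml: run small jobs "ubered" inside the
     MRAppMaster JVM instead of launching separate containers. -->
<property>
  <name>mapreduce.job.ubertask.enable</name>
  <value>true</value>
</property>
```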