Spark evaluation NPEs #3970
Under some rare circumstances, Spark evaluation can lead to an NPE here:
I'm not certain of the cause - I suspect it can occur with empty partitions, or when there are fewer partitions than executors (10 objects on 16 workers triggers the NPE, but 10 objects on 8 workers does not). Either way, a defensive null check in that merge function should resolve it.
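To illustrate the idea: Spark's reduce-style operations can pass a null to the merge function when a partition is empty, so each argument should be checked before merging. The sketch below is a minimal, hypothetical stand-in (the class and method names are illustrative, not the actual DL4J `IEvaluationReduceFunction` API), showing the defensive null-check pattern:

```java
import java.util.Arrays;

public class NullSafeReduce {
    // Illustrative merge function in the style of a Spark reduce:
    // an empty partition can contribute null, so treat null as the
    // identity element instead of dereferencing it.
    static double[] merge(double[] left, double[] right) {
        if (left == null) return right;   // empty partition on the left
        if (right == null) return left;   // empty partition on the right
        double[] out = new double[left.length];
        for (int i = 0; i < left.length; i++) {
            out[i] = left[i] + right[i];  // element-wise merge of the stats
        }
        return out;
    }

    public static void main(String[] args) {
        double[] a = {1.0, 2.0};
        // Simulates the result coming from an empty Spark partition:
        double[] b = null;
        System.out.println(Arrays.toString(merge(a, b)));
        System.out.println(Arrays.toString(merge(a, new double[]{2.0, 3.0})));
    }
}
```

With the null checks in place, the empty-partition case degrades to returning the non-null side, and the merge only runs when both inputs are present.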
Hi! I'm experiencing the same issue in the evaluation phase using DL4J 0.9.1 with Spark 2.1:
Is there any workaround I could try to make it work? I tried running it with different numbers of workers, with no success.
@GregaVrbancic the only thing I can suggest: take the code from here
and adapt it in your project to use the fix from this PR (i.e., the modified IEvaluationReduceFunction):