
While submitting Java jobs to Spark Jobserver, it shows JOB Loading Error #1387

Open
rayabhisek22 opened this issue Oct 8, 2021 · 1 comment

Comments


rayabhisek22 commented Oct 8, 2021

Used Spark version
2.4.7
Used Spark Job Server version
0.8.0 (release version)
Deployed mode
EMR

Actual (wrong) behavior
While submitting Java jobs to Spark Jobserver, it shows a JOB loading error.
Steps to reproduce

Logs
"status": "JOB LOADING FAILED",
"result": {
    "message": "com.sample.wordcount.SparkJavaJob cannot be cast to spark.jobserver.api.SparkJobBase",
    "errorClass": "java.lang.ClassCastException",
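For context, the ClassCastException in the log is the standard Java failure mode when a loaded class is cast to an interface it does not implement (here the server casts the job class to spark.jobserver.api.SparkJobBase, the Scala-side trait, while the class implements only the Java-side JSparkJob). A self-contained sketch of that mechanism, using hypothetical stand-in interfaces rather than the real job-server classes:

```java
// Illustration of the failure mode in the log above: casting an object to an
// interface its class does not implement throws ClassCastException at the cast
// site. ScalaSideJob / JavaSideJob / MyJob are hypothetical stand-ins, not the
// real spark-jobserver API.
interface ScalaSideJob {}              // stands in for spark.jobserver.api.SparkJobBase
interface JavaSideJob {}               // stands in for spark.jobserver.japi.JSparkJob

class MyJob implements JavaSideJob {}  // implements only the Java-side interface

public class CastDemo {
    public static void main(String[] args) {
        // What the server effectively gets back from its class loader:
        Object loaded = new MyJob();

        // The check fails because MyJob never implements ScalaSideJob...
        System.out.println(loaded instanceof ScalaSideJob);  // prints: false

        // ...so the cast the server attempts blows up at runtime.
        try {
            ScalaSideJob job = (ScalaSideJob) loaded;
            System.out.println("cast succeeded");
        } catch (ClassCastException e) {
            System.out.println("ClassCastException thrown, as in the job-server log");
        }
    }
}
```

In other words, the error indicates the server resolved the class but expected it to satisfy the Scala job trait, so the mismatch is between the interface the class implements and the one the server casts to (an API/packaging mismatch rather than a bug inside run or verify).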

My code
package com.sample.wordcount;

import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;
import org.apache.spark.api.java.JavaSparkContext;
import spark.jobserver.japi.JSparkJob;
import spark.jobserver.api.JobEnvironment;

public class SparkJavaJob implements JSparkJob {

    @Override
    public Object run(Object sc, JobEnvironment runtime, Config data) {
        return "OK";
    }

    @Override
    public Config verify(Object sc, JobEnvironment runtime, Config config) {
        return ConfigFactory.empty();
    }
}

@murraytodd

I'm seeing a similar issue, but I'm using the most current SJS version, 0.11.1.
I notice that at https://sparkjobserver.jfrog.io/ui/packages there are only some packages for 0.11.0, compiled against Scala 2.12.

Is there any chance that a number of our SJS releases were dropped from the JFrog resolver?
