Spark 1.6.x couldn't work with bigdl in the same pom.xml. #399

Closed
qiuxin2012 opened this issue Jan 20, 2017 · 2 comments

qiuxin2012 commented Jan 20, 2017

The following error occurs when running mvn clean package:

[INFO] Compiling 1 source files to /home/xin/qiuxin2012/SampleMlp/target/classes at 1484906632083
[ERROR] error: error while loading <root>, zip file is empty
[ERROR] error: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
[ERROR]         at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR]         at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[ERROR]         at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
[ERROR]         at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
[ERROR]         at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
[ERROR]         at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
[ERROR]         at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
[ERROR]         at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
[ERROR]         at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
[ERROR]         at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
[ERROR]         at scala.tools.nsc.Main$.doCompile(Main.scala:79)
[ERROR]         at scala.tools.nsc.Driver.process(Driver.scala:54)
[ERROR]         at scala.tools.nsc.Driver.main(Driver.scala:67)
[ERROR]         at scala.tools.nsc.Main.main(Main.scala)
[ERROR]         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR]         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[ERROR]         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR]         at java.lang.reflect.Method.invoke(Method.java:498)
[ERROR]         at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
[ERROR]         at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
[ERROR] 

To reproduce this error, check out the SampleMlp project and change the Spark version from 1.5.1 to 1.6.3, as sketched below.
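For reference, here is a rough sketch of the kind of pom.xml the reproduction uses; the property name, Scala suffix, and BigDL coordinates below are placeholders and may not match SampleMlp exactly:

    <!-- Illustrative only: property name and BigDL coordinates are placeholders. -->
    <properties>
      <!-- Changing 1.5.1 to 1.6.3 here reproduces the failure. -->
      <spark.version>1.6.3</spark.version>
    </properties>

    <dependencies>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
      </dependency>
      <dependency>
        <!-- Placeholder coordinates for the locally built BigDL jar. -->
        <groupId>com.intel.analytics.bigdl</groupId>
        <artifactId>bigdl</artifactId>
        <version>0.1.0-SNAPSHOT</version>
      </dependency>
    </dependencies>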

qiuxin2012 changed the title from "Spark 1.6.x couldn't work with bigdl." to "Spark 1.6.x couldn't work with bigdl in the same pom.xml." Jan 20, 2017
jason-dai added this to the Sprint-2017-01-31 milestone Jan 22, 2017

qiuxin2012 commented Jan 22, 2017

Changing BigDL/pom.xml's spark.version from <spark.version>1.5.1</spark.version> to <spark.version>1.6.3</spark.version> also produces an error:

[INFO] Compiling 230 Scala sources and 1 Java source to /home/xin/IntelAnalytics/BigDL2/dl/target/classes...
[ERROR] error while loading <root>, zip file is empty
[INFO] ------------------------------------------------------------------------
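For clarity, the change above is just the one spark.version property in the <properties> block of BigDL/pom.xml (other properties omitted here):

    <properties>
      <!-- was 1.5.1; bumping it to 1.6.3 triggers the same "zip file is empty" failure -->
      <spark.version>1.6.3</spark.version>
    </properties>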

qiuxin2012 commented

After more testing, the problem turned out to be a bad central mirror in my Maven configuration. The build works after switching to the Aliyun central mirror and cleaning my ~/.m2/repository.
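For anyone hitting the same symptom, this is roughly the mirror entry I mean in ~/.m2/settings.xml; treat the URL as illustrative and check Aliyun's current repository address before using it:

    <!-- ~/.m2/settings.xml: route requests for Maven Central through the Aliyun mirror. -->
    <settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
      <mirrors>
        <mirror>
          <id>aliyun-central</id>
          <mirrorOf>central</mirrorOf>
          <name>Aliyun mirror of Maven Central</name>
          <url>https://maven.aliyun.com/repository/public</url>
        </mirror>
      </mirrors>
    </settings>

After that, delete ~/.m2/repository (or at least the corrupted artifacts under it) and rerun mvn clean package so the jars are downloaded again from the new mirror.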

Le-Zheng pushed a commit to Le-Zheng/BigDL that referenced this issue Oct 20, 2021