Compiling the Scala source files contained within the mleap R package failed #26

Open
lgongmsft opened this issue Apr 24, 2019 · 0 comments · May be fixed by #27

Comments

@lgongmsft
Contributor

I synced to the latest master and tried to compile the Scala source files contained within the mleap R package using the configure.R script, but got the following error:

lgong@ubuntu:~/mleapr$ Rscript configure.R
Warning message:
In normalizePath("internal/mleap-spark") :
  path[1]="internal/mleap-spark": No such file or directory
==> using scalac 2.11.8
==> building against Spark 2.0.0
==> building 'mleap-2.0-2.11.jar' ...
==> '/home/lgong/scala/scala-2.11.8/bin/scalac' -optimise -deprecation '/home/lgong/mleapr/java/spark-2.0.0/main.scala'
/home/lgong/mleapr/java/spark-2.0.0/main.scala:3: error: object mleap is not a member of package org.apache.spark.ml
import org.apache.spark.ml.mleap.SparkUtil
                           ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:4: error: not found: object ml
import ml.combust.mleap.spark.SparkSupport._
       ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:5: error: not found: object resource
import resource._
       ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:6: error: not found: object ml
import ml.combust.bundle.BundleFile
       ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:7: error: object bundle is not a member of package org.apache.spark.ml
import org.apache.spark.ml.bundle.SparkBundleContext
                           ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:16: error: not found: value SparkUtil
    val pipeline = SparkUtil.createPipelineModel(transformers.toArray)
                   ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:17: error: not found: value SparkBundleContext
    implicit val sbc = SparkBundleContext().withDataset(dataset)
                       ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:18: error: not found: value managed
    for(bf <- managed(BundleFile("jar:" + path))) {
              ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:18: error: not found: value BundleFile
    for(bf <- managed(BundleFile("jar:" + path))) {
                      ^
9 errors found
Error in spark_compile(jar_name = jar_name, spark_home = spark_home, filter = filter,  :
  ==> failed to compile Scala source files
Calls: <Anonymous> -> spark_compile
Execution halted
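All nine errors are "not found" / "is not a member" failures on MLeap and scala-arm imports, which suggests the dependency jars were not on scalac's classpath when configure.R invoked it. As a rough illustration (not the actual fix in #27), a workaround would be to collect the dependency jars and pass them explicitly via -classpath. The libs directory name below is a hypothetical location, not part of the real package layout:

```shell
# Hedged sketch: build a colon-separated classpath from whatever dependency
# jars are present, then show the scalac invocation that would use it.
# LIBS is an assumed location for the MLeap/scala-arm jars, for illustration.
LIBS="${LIBS:-$HOME/mleapr/libs}"
CP=""
for jar in "$LIBS"/*.jar; do
  # Skip the unexpanded glob if no jars exist at that path.
  if [ -e "$jar" ]; then CP="$CP:$jar"; fi
done
CP="${CP#:}"   # drop the leading colon, if any

# Print (rather than run) the command, since the jars may not be present.
echo "scalac -optimise -deprecation -classpath '$CP' /home/lgong/mleapr/java/spark-2.0.0/main.scala"
```

If the classpath is correct, the imports of ml.combust.mleap.spark.SparkSupport, org.apache.spark.ml.mleap.SparkUtil, and resource (scala-arm) should resolve.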