I synced to the latest master and tried to compile the Scala source files contained in the mleap R package with the configure.R script, but got the following error:
lgong@ubuntu:~/mleapr$ Rscript configure.R
Warning message:
In normalizePath("internal/mleap-spark") :
path[1]="internal/mleap-spark": No such file or directory
==> using scalac 2.11.8
==> building against Spark 2.0.0
==> building 'mleap-2.0-2.11.jar' ...
==> '/home/lgong/scala/scala-2.11.8/bin/scalac' -optimise -deprecation '/home/lgong/mleapr/java/spark-2.0.0/main.scala'
/home/lgong/mleapr/java/spark-2.0.0/main.scala:3: error: object mleap is not a member of package org.apache.spark.ml
import org.apache.spark.ml.mleap.SparkUtil
^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:4: error: not found: object ml
import ml.combust.mleap.spark.SparkSupport._
^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:5: error: not found: object resource
import resource._
^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:6: error: not found: object ml
import ml.combust.bundle.BundleFile
^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:7: error: object bundle is not a member of package org.apache.spark.ml
import org.apache.spark.ml.bundle.SparkBundleContext
^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:16: error: not found: value SparkUtil
val pipeline = SparkUtil.createPipelineModel(transformers.toArray)
^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:17: error: not found: value SparkBundleContext
implicit val sbc = SparkBundleContext().withDataset(dataset)
^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:18: error: not found: value managed
for(bf <- managed(BundleFile("jar:" + path))) {
^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:18: error: not found: value BundleFile
for(bf <- managed(BundleFile("jar:" + path))) {
^
9 errors found
Error in spark_compile(jar_name = jar_name, spark_home = spark_home, filter = filter, :
==> failed to compile Scala source files
Calls: <Anonymous> -> spark_compile
Execution halted
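For context, the snippets quoted in the compiler output suggest main.scala follows the standard MLeap pipeline-export pattern. A minimal sketch of that code is below; the object name, method signature, and the save call inside the loop are assumptions (they do not appear in the errors), while the imports and the quoted lines come straight from the log. Note that the scalac command shown above passes no -classpath, and the normalizePath warning indicates internal/mleap-spark is missing, which is consistent with the "not found: object ml" errors: the MLeap jars do not appear to be on the compile classpath.

```scala
// Sketch of the export logic main.scala appears to contain, reconstructed
// from the compiler output above. Names such as exportPipeline, transformers,
// dataset, and path are assumptions for illustration.
import org.apache.spark.ml.Transformer
import org.apache.spark.ml.mleap.SparkUtil
import org.apache.spark.ml.bundle.SparkBundleContext
import ml.combust.bundle.BundleFile
import ml.combust.mleap.spark.SparkSupport._
import org.apache.spark.sql.DataFrame
import resource._

object Main {
  def exportPipeline(transformers: Seq[Transformer], dataset: DataFrame, path: String): Unit = {
    // Combine the fitted transformers into a single PipelineModel (quoted in the log)
    val pipeline = SparkUtil.createPipelineModel(transformers.toArray)
    // The bundle context captures the schema of the transformed dataset (quoted in the log)
    implicit val sbc = SparkBundleContext().withDataset(dataset)
    // Write the pipeline to an MLeap bundle; the save call is the documented
    // MLeap pattern and is assumed here, as it is not shown in the errors
    for (bf <- managed(BundleFile("jar:" + path))) {
      pipeline.writeBundle.save(bf)(sbc).get
    }
  }
}
```

Compiling this requires the MLeap Spark dependencies (mleap-spark, bundle-ml, scala-arm) on the scalac classpath, which the invocation in the log does not provide.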