When I try a simple example that imports spark.implicits._, I get the following error during the sbt build:
My code:
import io.hydrosphere.mist.lib.{MistJob, SQLSupport}
import org.apache.spark.sql._

object SimpleContext extends MistJob with SQLSupport {

  /** Contains implementation of spark job with ordinary [[org.apache.spark.SparkContext]]
    * Abstract method must be overridden
    *
    * based on https://github.com/Hydrospheredata/mist/blob/master/examples/src/main/scala/SimpleContext.scala
    *
    * @param digits user parameter digits
    * @return result of the job
    */
  import session.implicits._

  def execute(digits: Seq[Int]): Map[String, Any] = {
    val mydf = digits.toSeq.toDF("number")
    Map("result" -> mydf.map(x => x.getInt(0) * 3).collect())
  }
}
Error:
[info] Loading project definition from /home/user/mist/apache-spark-restAPI-example/project
[info] Set current project to sparkMist (in build file:/home/user/mist/apache-spark-restAPI-example/)
[info] Compiling 1 Scala source to /home/user/mist/apache-spark-restAPI-example/target/scala-2.11/classes...
[error] /home/user/mist/apache-spark-restAPI-example/src/main/scala/SimpleContext.scala:13: stable identifier required, but SimpleContext.this.session.implicits found.
[error] import session.implicits._
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
Please help me understand how I can import spark.implicits._.
Thanks!
Currently, session inside MistJob is defined as a function (a def), but it should be a value (a val); Scala only allows importing implicits from a stable identifier. We will fix this later.
As a workaround, assign session to a local val and import the implicits from that val instead.
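A minimal sketch of that workaround, assuming the same MistJob/SQLSupport API as in the snippet above (the val name spark is arbitrary):

```scala
import io.hydrosphere.mist.lib.{MistJob, SQLSupport}
import org.apache.spark.sql._

object SimpleContext extends MistJob with SQLSupport {

  def execute(digits: Seq[Int]): Map[String, Any] = {
    // `session` is a def, so it is not a stable identifier and cannot be
    // imported from directly. Pinning it to a val makes it stable.
    val spark = session
    import spark.implicits._

    val mydf = digits.toDF("number")
    Map("result" -> mydf.map(x => x.getInt(0) * 3).collect())
  }
}
```

Because the import now refers to the stable val spark rather than the def session, the "stable identifier required" error goes away.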
Spark Version: 2.0.2
Scala Version: 2.11.8