
"main" java.lang.NoSuchMethodError: org.apache.spark.rdd.ShuffledRDD #16

Closed
mindcrusher11 opened this issue Sep 5, 2016 · 4 comments
Closed

Comments

@mindcrusher11

Whenever I try to run KNN regression directly via a main method, I get the error "main" java.lang.NoSuchMethodError: org.apache.spark.rdd.ShuffledRDD, but when I run the test cases it works fine. From the articles I have read, most suggestions point to mismatched versions, so I tried adding a main class to the actual code itself after cloning.

@mindcrusher11 (Author)

name := "SparkKnn"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0",
  "org.apache.spark" %% "spark-sql" % "1.6.0",
  "org.apache.spark" %% "spark-mllib" % "1.6.0",
  "io.snappydata" % "snappy-spark-mllib_2.10" % "1.6.2-1",
  "saurfang" % "spark-knn" % "0.1.0"
)

spDependencies += "saurfang/spark-knn:0.1.0"

sparkComponents +="mllib"

sparkVersion := "1.6.0"

This is my sbt configuration.
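A NoSuchMethodError at runtime almost always means the classes on the JVM classpath were built from a different Spark version than the one the code was compiled against. This build mixes stock Spark 1.6.0 artifacts with io.snappydata's snappy-spark-mllib fork at 1.6.2-1, which ships its own copies of Spark classes, and it also declares spark-knn twice (once in libraryDependencies and once in spDependencies). A minimal sketch of a cleaner build.sbt, assuming the sbt-spark-package plugin is on the build and the snappydata fork is not actually needed:

name := "SparkKnn"

version := "1.0"

scalaVersion := "2.10.5"

// Let sbt-spark-package pin every Spark artifact to one version and
// mark it "provided", so spark-submit supplies the runtime jars.
sparkVersion := "1.6.0"

sparkComponents ++= Seq("sql", "mllib")

// Declare spark-knn once, via its spark-packages coordinates.
spDependencies += "saurfang/spark-knn:0.1.0"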

@mindcrusher11
Copy link
Author

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.util.MLUtils
import org.apache.spark.ml.regression.KNNRegression

object SparkKnn {

  def main(args: Array[String]): Unit = {

    //val conf = new SparkConf().set("spark.serializer", "org.apache.spark.serializer.KryoSerializer").setMaster("local[4]")
    val conf = configure("Popularity")
    val sc = new SparkContext(conf)

    val sqlContext = new org.apache.spark.sql.SQLContext(sc)

    // this is used to implicitly convert an RDD to a DataFrame
    import sqlContext.implicits._

    // load LibSVM data as RDD[LabeledPoint] and convert it to a DataFrame
    val training = MLUtils.loadLibSVMFile(sc, "/home/gaur/svmdata")
      .toDF().select("features", "label")

    training.show()

    // top tree built on a 100-point sample; predictions use the 3 nearest neighbours
    val knn = new KNNRegression().setTopTreeSize(500 / 5).setK(3)

    val knnModel = knn.fit(training)

    //val predicted = knnModel.transform(training)
    //predicted.show()
  }

  def configure(appName: String = "Sparkling Water Demo"): SparkConf = {
    val conf = new SparkConf().setAppName(appName)
    // default to local[*] unless spark.master is set in the environment
    conf.setIfMissing("spark.master", sys.env.getOrElse("spark.master", "local[*]"))
    conf
  }
}
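One way to confirm what the directly-launched JVM actually loads is to print the runtime Spark version and the jar that ShuffledRDD is resolved from. This is a diagnostic sketch, not part of the original program, and it assumes it runs right after the SparkContext is created:

// Diagnostic sketch: if the printed location is a snappydata jar rather
// than spark-core 1.6.0, the mixed classpath explains the NoSuchMethodError.
println(s"Runtime Spark version: ${sc.version}")
println(classOf[org.apache.spark.rdd.ShuffledRDD[_, _, _]]
  .getProtectionDomain.getCodeSource.getLocation)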

@mindcrusher11 (Author)

Up to training.show() it works fine, so it may be a version issue, but which version should I use? I checked in the project folder and the spark-core 1.6.0 dependency is being used, so I changed mine to 1.6.0, but I am still getting the same issue.

@mindcrusher11 (Author)

With the spark-submit jar option it works fine.
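That is consistent with a classpath mismatch: spark-submit puts the installed Spark distribution's jars on the classpath, so the job runs against one coherent set of 1.6.0 binaries, while launching main directly from the IDE or sbt uses whatever mixture of jars the build resolved. A sketch of the invocation; the jar path is the default sbt package output and the package coordinates follow the spark-packages convention, both assumptions:

spark-submit \
  --class SparkKnn \
  --master "local[*]" \
  --packages saurfang:spark-knn:0.1.0 \
  target/scala-2.10/sparkknn_2.10-1.0.jar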
