Spark breaks down when running points.show() #74

Open

momir81 opened this issue Aug 23, 2016 · 3 comments

Comments

@momir81
momir81 commented Aug 23, 2016

Hello,
I am trying to run the test command points.show() from the introduction page, but Spark stops working and exits spark-shell.
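For reference, this is the introduction-page snippet being run (a minimal version; the same code appears in the spark-shell transcript further down this thread):

import magellan.Point
import org.apache.spark.sql.magellan.dsl.expressions._

// Run inside spark-shell, where sc and the DataFrame implicits are pre-imported.
// Build a DataFrame with one Magellan point column; show() is the action that fails.
val points = sc.parallelize(Seq((-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0)))
  .toDF("x", "y")
  .select(point($"x", $"y").as("point"))
points.show()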
Below is the whole log:
16/08/23 13:26:20 INFO spark.SparkContext: Starting job: show at <console>:37
16/08/23 13:26:20 INFO scheduler.DAGScheduler: Got job 0 (show at <console>:37) with 1 output partitions
16/08/23 13:26:20 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (show at <console>:37)
16/08/23 13:26:20 INFO scheduler.DAGScheduler: Parents of final stage: List()
16/08/23 13:26:20 INFO scheduler.DAGScheduler: Missing parents: List()
16/08/23 13:26:20 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[4] at show at <console>:37), which has no missing parents
16/08/23 13:26:20 INFO storage.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 5.5 KB, free 5.5 KB)
16/08/23 13:26:20 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 3.0 KB, free 8.6 KB)
16/08/23 13:26:20 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:52704 (size: 3.0 KB, free: 517.4 MB)
16/08/23 13:26:20 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1006
16/08/23 13:26:20 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[4] at show at <console>:37)
16/08/23 13:26:20 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/08/23 13:26:20 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0,PROCESS_LOCAL, 2634 bytes)
16/08/23 13:26:20 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
16/08/23 13:26:20 INFO executor.Executor: Fetching http://172.31.42.1:56863/jars/org.codehaus.jackson_jackson-core-asl-1.9.12.jar with timestamp 1471958708517
16/08/23 13:26:20 INFO util.Utils: Fetching http://172.31.42.1:56863/jars/org.codehaus.jackson_jackson-core-asl-1.9.12.jar to /tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/userFiles-b35893ec-53c1-4d25-b0f8-c1fd86430e74/fetchFileTemp2531640319641615074.tmp
16/08/23 13:26:21 INFO executor.Executor: Adding file:/tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/userFiles-b35893ec-53c1-4d25-b0f8-c1fd86430e74/org.codehaus.jackson_jackson-core-asl-1.9.12.jar to class loader
16/08/23 13:26:21 INFO executor.Executor: Fetching http://172.31.42.1:56863/jars/com.esri.geometry_esri-geometry-api-1.2.1.jar with timestamp 1471958708510
16/08/23 13:26:21 INFO util.Utils: Fetching http://172.31.42.1:56863/jars/com.esri.geometry_esri-geometry-api-1.2.1.jar to /tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/userFiles-b35893ec-53c1-4d25-b0f8-c1fd86430e74/fetchFileTemp7329359299396583743.tmp
16/08/23 13:26:21 INFO executor.Executor: Adding file:/tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/userFiles-b35893ec-53c1-4d25-b0f8-c1fd86430e74/com.esri.geometry_esri-geometry-api-1.2.1.jar to class loader
16/08/23 13:26:21 INFO executor.Executor: Fetching http://172.31.42.1:56863/jars/commons-io_commons-io-2.4.jar with timestamp 1471958708509
16/08/23 13:26:21 INFO util.Utils: Fetching http://172.31.42.1:56863/jars/commons-io_commons-io-2.4.jar to /tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/userFiles-b35893ec-53c1-4d25-b0f8-c1fd86430e74/fetchFileTemp5867381645425744231.tmp
16/08/23 13:26:21 INFO executor.Executor: Adding file:/tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/userFiles-b35893ec-53c1-4d25-b0f8-c1fd86430e74/commons-io_commons-io-2.4.jar to class loader
16/08/23 13:26:21 INFO executor.Executor: Fetching http://172.31.42.1:56863/jars/harsha2010_magellan-1.0.3-s_2.10.jar with timestamp 1471958708508
16/08/23 13:26:21 INFO util.Utils: Fetching http://172.31.42.1:56863/jars/harsha2010_magellan-1.0.3-s_2.10.jar to /tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/userFiles-b35893ec-53c1-4d25-b0f8-c1fd86430e74/fetchFileTemp2704526544117072793.tmp
16/08/23 13:26:21 INFO executor.Executor: Adding file:/tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/userFiles-b35893ec-53c1-4d25-b0f8-c1fd86430e74/harsha2010_magellan-1.0.3-s_2.10.jar to class loader
16/08/23 13:26:21 INFO executor.Executor: Fetching http://172.31.42.1:56863/jars/org.json_json-20090211.jar with timestamp 1471958708516
16/08/23 13:26:21 INFO util.Utils: Fetching http://172.31.42.1:56863/jars/org.json_json-20090211.jar to /tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/userFiles-b35893ec-53c1-4d25-b0f8-c1fd86430e74/fetchFileTemp8186701820175920799.tmp
16/08/23 13:26:21 INFO executor.Executor: Adding file:/tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/userFiles-b35893ec-53c1-4d25-b0f8-c1fd86430e74/org.json_json-20090211.jar to class loader
16/08/23 13:26:21 ERROR executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.AbstractMethodError: org.apache.spark.sql.catalyst.expressions.Expression.genCode(Lorg/apache/spark/sql/catalyst/expressions/codegen/CodeGenContext;Lorg/apache/spark/sql/catalyst/expressions/codegen/GeneratedExpressionCode;)Ljava/lang/String;
at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$gen$2.apply(Expression.scala:104)
at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$gen$2.apply(Expression.scala:100)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.sql.catalyst.expressions.Expression.gen(Expression.scala:100)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenContext$$anonfun$generateExpressions$1.apply(CodeGenerator.scala:459)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenContext$$anonfun$generateExpressions$1.apply(CodeGenerator.scala:459)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenContext.generateExpressions(CodeGenerator.scala:459)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.createCode(GenerateUnsafeProjection.scala:281)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:324)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.generate(GenerateUnsafeProjection.scala:313)
at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:151)
at org.apache.spark.sql.execution.Project$$anonfun$1.apply(basicOperators.scala:47)
at org.apache.spark.sql.execution.Project$$anonfun$1.apply(basicOperators.scala:46)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
16/08/23 13:26:21 ERROR util.SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-0,5,main]
java.lang.AbstractMethodError: org.apache.spark.sql.catalyst.expressions.Expression.genCode(Lorg/apache/spark/sql/catalyst/expressions/codegen/CodeGenContext;Lorg/apache/spark/sql/catalyst/expressions/codegen/GeneratedExpressionCode;)Ljava/lang/String;
at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$gen$2.apply(Expression.scala:104)
at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$gen$2.apply(Expression.scala:100)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.sql.catalyst.expressions.Expression.gen(Expression.scala:100)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenContext$$anonfun$generateExpressions$1.apply(CodeGenerator.scala:459)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenContext$$anonfun$generateExpressions$1.apply(CodeGenerator.scala:459)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenContext.generateExpressions(CodeGenerator.scala:459)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.createCode(GenerateUnsafeProjection.scala:281)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:324)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.generate(GenerateUnsafeProjection.scala:313)
at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:151)
at org.apache.spark.sql.execution.Project$$anonfun$1.apply(basicOperators.scala:47)
at org.apache.spark.sql.execution.Project$$anonfun$1.apply(basicOperators.scala:46)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
16/08/23 13:26:21 INFO spark.SparkContext: Invoking stop() from shutdown hook
16/08/23 13:26:21 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.AbstractMethodError: org.apache.spark.sql.catalyst.expressions.Expression.genCode(Lorg/apache/spark/sql/catalyst/expressions/codegen/CodeGenContext;Lorg/apache/spark/sql/catalyst/expressions/codegen/GeneratedExpressionCode;)Ljava/lang/String;
at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$gen$2.apply(Expression.scala:104)
at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$gen$2.apply(Expression.scala:100)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.sql.catalyst.expressions.Expression.gen(Expression.scala:100)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenContext$$anonfun$generateExpressions$1.apply(CodeGenerator.scala:459)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenContext$$anonfun$generateExpressions$1.apply(CodeGenerator.scala:459)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenContext.generateExpressions(CodeGenerator.scala:459)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.createCode(GenerateUnsafeProjection.scala:281)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:324)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.generate(GenerateUnsafeProjection.scala:313)
at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:151)
at org.apache.spark.sql.execution.Project$$anonfun$1.apply(basicOperators.scala:47)
at org.apache.spark.sql.execution.Project$$anonfun$1.apply(basicOperators.scala:46)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/08/23 13:26:21 ERROR scheduler.TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/08/23 13:26:21 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/08/23 13:26:21 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/08/23 13:26:21 INFO scheduler.TaskSchedulerImpl: Cancelling stage 0
16/08/23 13:26:21 INFO scheduler.DAGScheduler: ResultStage 0 (show at <console>:37) failed in 0.763 s
16/08/23 13:26:21 INFO scheduler.DAGScheduler: Job 0 failed: show at <console>:37, took 0.929117 s
16/08/23 13:26:21 INFO ui.SparkUI: Stopped Spark web UI at http://172.31.42.1:4040
16/08/23 13:26:21 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/08/23 13:26:21 INFO storage.MemoryStore: MemoryStore cleared
16/08/23 13:26:21 INFO storage.BlockManager: BlockManager stopped
16/08/23 13:26:21 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
16/08/23 13:26:21 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/08/23 13:26:21 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/08/23 13:26:21 INFO spark.SparkContext: Successfully stopped SparkContext
16/08/23 13:26:21 INFO util.ShutdownHookManager: Shutdown hook called
16/08/23 13:26:21 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-ffcbe845-e900-4c61-99a8-e90314b2c77c
16/08/23 13:26:21 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/08/23 13:26:21 ERROR util.ShutdownHookManager: Exception while deleting Spark temp dir: /tmp/spark-ffcbe845-e900-4c61-99a8-e90314b2c77c
java.io.IOException: Failed to delete: /tmp/spark-ffcbe845-e900-4c61-99a8-e90314b2c77c
at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:928)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:267)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1765)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
at scala.util.Try$.apply(Try.scala:161)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:218)
at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
16/08/23 13:26:21 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9
16/08/23 13:26:21 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-794081a7-8d13-48ba-ae14-105ca6e41fd6
16/08/23 13:26:21 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-4c5e63ff-09dc-4f32-afec-350e0c244bb9/httpd-adbe66d4-9643-4729-b9cb-6325cd5bb25a
16/08/23 13:26:21 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.

Can someone please give me a hint about what the problem could be?

Thanks!
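The java.lang.AbstractMethodError on Expression.genCode is the classic symptom of a binary mismatch: the Magellan jar was compiled against a different version of Spark's Catalyst Expression API than the Spark running the shell, so the codegen hook resolves to an abstract method at runtime. The usual fix is to pair the artifact with the Spark/Scala line it was built for. A sketch of launching a matching shell (the Spark 1.x / Scala 2.10 pairing and the spark-packages coordinate are assumptions based on the artifact name):

./bin/spark-shell --packages harsha2010:magellan:1.0.3-s_2.10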

@jawedwards

I also get this. Is there any insight that can be shed on this issue?

@edanshalom

I see the same issue here when trying to show() points in a DataFrame.
When using an RDD there's no problem, so maybe you should try using the RDD and exporting the results when you're done; a sketch of that workaround follows. I'm very new to Magellan, so I don't know whether using only the RDD is sufficient.
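A minimal sketch of that RDD-based workaround, assuming Magellan's Point(x, y) factory as used in its README:

import magellan.Point

// Build the same three points on a plain RDD; collecting an RDD of Points
// never touches the Catalyst code generation that DataFrame show() exercises.
val pointsRdd = sc.parallelize(Seq((-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0)))
  .map { case (x, y) => Point(x, y) }
pointsRdd.collect().foreach(println)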

I saw some people reporting issues with this specific release of Magellan (1.0.3-s_2.10).
When I tried older releases of Magellan, I got a different exception: java.lang.AbstractMethodError.
Here is my log. Any ideas? Thanks.

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_45)
Type in expressions to have them evaluated.
Type :help for more information.

scala> import magellan.{Point, Polygon}
import magellan.{Point, Polygon}

scala> import org.apache.spark.sql.magellan.dsl.expressions._
import org.apache.spark.sql.magellan.dsl.expressions._

scala> import org.apache.spark.sql.types._
import org.apache.spark.sql.types._

scala> val points = sc.parallelize(Seq((-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0))).toDF("x", "y").select(point($"x", $"y").as("point"))
points: org.apache.spark.sql.DataFrame = [point: point]

scala>

scala> points.show()
java.lang.AbstractMethodError
at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$genCode$2.apply(Expression.scala:104)
at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$genCode$2.apply(Expression.scala:101)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.catalyst.expressions.Expression.genCode(Expression.scala:101)
at org.apache.spark.sql.execution.ProjectExec$$anonfun$5.apply(basicPhysicalOperators.scala:57)
at org.apache.spark.sql.execution.ProjectExec$$anonfun$5.apply(basicPhysicalOperators.scala:57)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at org.apache.spark.sql.execution.ProjectExec.doConsume(basicPhysicalOperators.scala:57)
at org.apache.spark.sql.execution.CodegenSupport$class.consume(WholeStageCodegenExec.scala:153)
at org.apache.spark.sql.execution.InputAdapter.consume(WholeStageCodegenExec.scala:218)
at org.apache.spark.sql.execution.InputAdapter.doProduce(WholeStageCodegenExec.scala:244)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:83)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:136)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:133)
at org.apache.spark.sql.execution.CodegenSupport$class.produce(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.InputAdapter.produce(WholeStageCodegenExec.scala:218)
at org.apache.spark.sql.execution.ProjectExec.doProduce(basicPhysicalOperators.scala:40)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:83)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:136)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:133)
at org.apache.spark.sql.execution.CodegenSupport$class.produce(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.ProjectExec.produce(basicPhysicalOperators.scala:30)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doCodeGen(WholeStageCodegenExec.scala:309)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:347)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:136)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:133)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:240)
at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:323)
at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:39)
at org.apache.spark.sql.Dataset$$anonfun$org$apache$spark$sql$Dataset$$execute$1$1.apply(Dataset.scala:2183)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2532)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$execute$1(Dataset.scala:2182)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collect(Dataset.scala:2189)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1925)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1924)
at org.apache.spark.sql.Dataset.withTypedCallback(Dataset.scala:2562)
at org.apache.spark.sql.Dataset.head(Dataset.scala:1924)
at org.apache.spark.sql.Dataset.take(Dataset.scala:2139)
at org.apache.spark.sql.Dataset.showString(Dataset.scala:239)
at org.apache.spark.sql.Dataset.show(Dataset.scala:526)
at org.apache.spark.sql.Dataset.show(Dataset.scala:486)
at org.apache.spark.sql.Dataset.show(Dataset.scala:495)
... 52 elided
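Note the banner above: Scala 2.11.8 on Spark 2.0.0, while the artifact is a _2.10 build targeting the Spark 1.x Catalyst API, which is exactly the kind of mismatch that surfaces as AbstractMethodError. A quick way to confirm what a shell is running, using standard Spark and Scala calls:

// confirm the running versions from inside spark-shell
println(sc.version)                           // e.g. 2.0.0
println(scala.util.Properties.versionString)  // e.g. "version 2.11.8"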

@edanshalom

Using Spark 1.4 solved the issue for me.
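For anyone pinning this in a build file rather than the shell, a minimal build.sbt sketch along the same lines (the exact Spark 1.4.1 version and the spark-packages resolver URL are assumptions; match whatever 1.x release your cluster runs):

scalaVersion := "2.10.6"

// spark-packages resolver (assumed URL) hosting the harsha2010 magellan artifact
resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
  // a Spark 1.x line compatible with Magellan 1.0.3-s_2.10
  "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided",
  "harsha2010" % "magellan" % "1.0.3-s_2.10"
)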
