Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/11/30 09:42:00 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/11/30 09:42:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/11/30 09:42:06 INFO FileInputFormat: Total input paths to process : 1
17/11/30 09:42:07 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
17/11/30 09:42:07 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
17/11/30 09:42:07 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
17/11/30 09:42:07 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
17/11/30 09:42:07 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
17/11/30 09:42:07 INFO FileInputFormat: Total input paths to process : 1
[Stage 3:================================================>     (17 + 3) / 20]
[Stage 4:================================================>     (17 + 3) / 20]
[Stage 6:================================================>     (17 + 3) / 20]
[Stage 8:================================================>     (18 + 3) / 21]
[Stage 9:>                                                      (0 + 4) / 4]
17/11/30 09:42:35 WARN MemoryStore: Not enough space to cache rdd_23_1 in memory! (computed 64.8 MB so far)
17/11/30 09:42:35 WARN BlockManager: Block rdd_23_1 could not be removed as it was not found on disk or in memory
17/11/30 09:42:35 WARN BlockManager: Putting block rdd_23_1 failed
17/11/30 09:42:37 WARN MemoryStore: Not enough space to cache rdd_23_2 in memory! (computed 99.2 MB so far)
17/11/30 09:42:37 WARN BlockManager: Block rdd_23_2 could not be removed as it was not found on disk or in memory
17/11/30 09:42:37 WARN BlockManager: Putting block rdd_23_2 failed
17/11/30 09:42:37 WARN MemoryStore: Not enough space to cache rdd_23_0 in memory! (computed 99.2 MB so far)
17/11/30 09:42:37 WARN BlockManager: Block rdd_23_0 could not be removed as it was not found on disk or in memory
17/11/30 09:42:37 WARN BlockManager: Putting block rdd_23_0 failed
[Stage 10:============================================>        (16 + 4) / 20]
[Stage 11:>                                                     (0 + 4) / 4]
[Stage 12:==================================================>  (18 + 2) / 20]
[Stage 14:>                                                     (0 + 4) / 4]
17/11/30 09:42:54 WARN MemoryStore: Not enough space to cache rdd_23_2 in memory! (computed 64.8 MB so far)
17/11/30 09:42:54 WARN BlockManager: Block rdd_23_2 could not be removed as it was not found on disk or in memory
17/11/30 09:42:54 WARN BlockManager: Putting block rdd_23_2 failed
17/11/30 09:42:56 WARN MemoryStore: Not enough space to cache rdd_23_3 in memory! (computed 99.2 MB so far)
17/11/30 09:42:56 WARN BlockManager: Block rdd_23_3 could not be removed as it was not found on disk or in memory
17/11/30 09:42:56 WARN BlockManager: Putting block rdd_23_3 failed
17/11/30 09:42:56 WARN MemoryStore: Not enough space to cache rdd_23_1 in memory! (computed 99.2 MB so far)
17/11/30 09:42:56 WARN BlockManager: Block rdd_23_1 could not be removed as it was not found on disk or in memory
17/11/30 09:42:56 WARN BlockManager: Putting block rdd_23_1 failed
[Stage 14:===========================================>          (3 + 1) / 4]
[Stage 15:===========================================>          (3 + 1) / 4]
[Stage 16:>                                                      (0 + 4) / 4]
17/11/30 09:43:05 WARN BlockManager: Putting block rdd_38_0 failed due to an exception
17/11/30 09:43:05 WARN BlockManager: Block rdd_38_0 could not be removed as it was not found on disk or in memory
17/11/30 09:43:05 WARN BlockManager: Putting block rdd_39_0 failed due to an exception
17/11/30 09:43:05 WARN BlockManager: Block rdd_39_0 could not be removed as it was not found on disk or in memory
17/11/30 09:43:05 ERROR Executor: Exception in task 0.0 in stage 16.0 (TID 181)
java.lang.OutOfMemoryError: Java heap space
    at scala.collection.mutable.ArrayBuilder$ofInt.mkArray(ArrayBuilder.scala:323)
    at scala.collection.mutable.ArrayBuilder$ofInt.result(ArrayBuilder.scala:368)
    at scala.collection.mutable.ArrayBuilder$ofInt.result(ArrayBuilder.scala:316)
    at org.apache.spark.ml.recommendation.ALS$UncompressedInBlockBuilder.build(ALS.scala:1007)
    at org.apache.spark.ml.recommendation.ALS$$anonfun$18.apply(ALS.scala:1217)
    at org.apache.spark.ml.recommendation.ALS$$anonfun$18.apply(ALS.scala:1211)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$43$$anonfun$apply$44.apply(PairRDDFunctions.scala:762)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$43$$anonfun$apply$44.apply(PairRDDFunctions.scala:762)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
    at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:216)
    at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
    at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
    at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
    at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
    at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
    at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
    at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
    at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
    at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
    at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
    at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
    at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
    at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
    at org.apache.spark.scheduler.Task.run(Task.scala:99)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
17/11/30 09:43:05 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-3,5,main]
java.lang.OutOfMemoryError: Java heap space
    ... (same stack trace as above)
17/11/30 09:43:05 WARN TaskSetManager: Lost task 0.0 in stage 16.0 (TID 181, localhost, executor driver): java.lang.OutOfMemoryError: Java heap space
    ... (same stack trace as above)
17/11/30 09:43:05 ERROR TaskSetManager: Task 0 in stage 16.0 failed 1 times; aborting job
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 16.0 failed 1 times, most recent failure: Lost task 0.0 in stage 16.0 (TID 181, localhost, executor driver): java.lang.OutOfMemoryError: Java heap space
    ... (same stack trace as above)
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958)
    at org.apache.spark.rdd.RDD.count(RDD.scala:1157)
    at org.apache.spark.ml.recommendation.ALS$.train(ALS.scala:694)
    at org.apache.spark.mllib.recommendation.ALS.run(ALS.scala:253)
    at org.apache.spark.mllib.recommendation.ALS$.train(ALS.scala:340)
    at org.apache.spark.mllib.recommendation.ALS$.train(ALS.scala:357)
    at MovieLensALS$$anonfun$main$1$$anonfun$apply$mcVI$sp$1$$anonfun$apply$mcVD$sp$1.apply$mcVI$sp(MovieLensALS.scala:101)
    at MovieLensALS$$anonfun$main$1$$anonfun$apply$mcVI$sp$1$$anonfun$apply$mcVD$sp$1.apply(MovieLensALS.scala:100)
    at MovieLensALS$$anonfun$main$1$$anonfun$apply$mcVI$sp$1$$anonfun$apply$mcVD$sp$1.apply(MovieLensALS.scala:100)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at MovieLensALS$$anonfun$main$1$$anonfun$apply$mcVI$sp$1.apply$mcVD$sp(MovieLensALS.scala:100)
    at MovieLensALS$$anonfun$main$1$$anonfun$apply$mcVI$sp$1.apply(MovieLensALS.scala:100)
    at MovieLensALS$$anonfun$main$1$$anonfun$apply$mcVI$sp$1.apply(MovieLensALS.scala:100)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at MovieLensALS$$anonfun$main$1.apply$mcVI$sp(MovieLensALS.scala:100)
    at MovieLensALS$$anonfun$main$1.apply(MovieLensALS.scala:100)
    at MovieLensALS$$anonfun$main$1.apply(MovieLensALS.scala:100)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at MovieLensALS$.main(MovieLensALS.scala:100)
    at MovieLensALS.main(MovieLensALS.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.OutOfMemoryError: Java heap space
    ... (same stack trace as above)
17/11/30 09:43:08 WARN BlockManager: Putting block rdd_39_2 failed due to an exception
17/11/30 09:43:08 WARN BlockManager: Block rdd_39_2 could not be removed as it was not found on disk or in memory
17/11/30 09:43:08 WARN BlockManager: Putting block rdd_39_3 failed due to an exception
17/11/30 09:43:08 WARN BlockManager: Block rdd_39_3 could not be removed as it was not found on disk or in memory
17/11/30 09:43:08 WARN BlockManager: Putting block rdd_39_1 failed due to an exception
17/11/30 09:43:08 WARN BlockManager: Block rdd_39_1 could not be removed as it was not found on disk or in memory
17/11/30 09:43:08 WARN BlockManager: Asked to remove block rdd_39_1, which does not exist
17/11/30 09:43:08 WARN BlockManager: Asked to remove block rdd_39_3, which does not exist
17/11/30 09:43:08 WARN BlockManager: Asked to remove block rdd_39_2, which does not exist
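Two things stand out in this log. First, the failed task reports "localhost, executor driver": the job is running in local mode, so every ALS task shares the driver JVM and its heap (1 GB by default, unless --driver-memory was raised). Second, the allocation that finally dies is in ALS$UncompressedInBlockBuilder.build, meaning ALS is materializing its intermediate in-blocks (relatives of the rdd_23_* and rdd_39_* blocks the MemoryStore was already struggling with), and the driver stacktrace ties the failure back to the ALS.train call at MovieLensALS.scala:100-101. Below is a minimal sketch of the usual remedies, assuming the same org.apache.spark.mllib.recommendation.ALS entry point that appears in the trace; ratings, rank, numIterations, lambda and the jar name are illustrative placeholders, not values recovered from this run.

    // The first fix is simply a bigger heap. In local mode the single
    // "executor driver" lives inside the driver JVM, so --driver-memory is
    // the knob that matters (4g is an assumption; size it to the dataset):
    //
    //   spark-submit --class MovieLensALS --driver-memory 4g \
    //     movielens-als.jar <args>
    //
    // If the heap cannot grow, ALS itself can be made less memory-hungry:
    import org.apache.spark.mllib.recommendation.{ALS, Rating}
    import org.apache.spark.rdd.RDD
    import org.apache.spark.storage.StorageLevel

    def trainModel(ratings: RDD[Rating], rank: Int,
                   numIterations: Int, lambda: Double) =
      new ALS()
        .setRank(rank)
        .setIterations(numIterations)
        .setLambda(lambda)
        // Split users/items into more, smaller blocks so that each
        // UncompressedInBlockBuilder.build call allocates smaller arrays
        // (the default -1 auto-configures; 20 here is an assumption).
        .setBlocks(20)
        // Keep the intermediate in/out-blocks spillable to disk. This is
        // already the default level; it is spelled out only to show where
        // the knob lives.
        .setIntermediateRDDStorageLevel(StorageLevel.MEMORY_AND_DISK)
        .run(ratings)

The trade-off: more blocks shrink each task's peak allocation but add shuffle traffic, so on a single machine the larger heap is almost always the better first move, with setBlocks as the fallback when the hardware is fixed.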