log4j.spark.log
549 lines (549 loc) · 54.6 KB
18/08/11 16:42:24 INFO SparkContext: Running Spark version 1.6.2
18/08/11 16:42:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/08/11 16:42:25 INFO SecurityManager: Changing view acls to: dan
18/08/11 16:42:25 INFO SecurityManager: Changing modify acls to: dan
18/08/11 16:42:25 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(dan); users with modify permissions: Set(dan)
18/08/11 16:42:25 INFO Utils: Successfully started service 'sparkDriver' on port 46719.
18/08/11 16:42:26 INFO Slf4jLogger: Slf4jLogger started
18/08/11 16:42:26 INFO Remoting: Starting remoting
18/08/11 16:42:26 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@127.0.0.1:37480]
18/08/11 16:42:26 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 37480.
18/08/11 16:42:26 INFO SparkEnv: Registering MapOutputTracker
18/08/11 16:42:26 INFO SparkEnv: Registering BlockManagerMaster
18/08/11 16:42:26 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-0afa93e1-d428-49d8-aa09-93da1d0fcf3b
18/08/11 16:42:26 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
18/08/11 16:42:26 INFO SparkEnv: Registering OutputCommitCoordinator
18/08/11 16:42:26 INFO Utils: Successfully started service 'SparkUI' on port 4040.
18/08/11 16:42:26 INFO SparkUI: Started SparkUI at http://127.0.0.1:4040
18/08/11 16:42:26 INFO HttpFileServer: HTTP File server directory is /tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d/httpd-28f8332a-80a5-4524-a9b4-fba214766c0a
18/08/11 16:42:26 INFO HttpServer: Starting HTTP Server
18/08/11 16:42:26 INFO Utils: Successfully started service 'HTTP file server' on port 33434.
18/08/11 16:42:26 INFO SparkContext: Added JAR file:/home/dan/R/x86_64-pc-linux-gnu-library/3.4/sparklyr/java/spark-csv_2.11-1.5.0.jar at http://127.0.0.1:33434/jars/spark-csv_2.11-1.5.0.jar with timestamp 1533998546667
18/08/11 16:42:26 INFO SparkContext: Added JAR file:/home/dan/R/x86_64-pc-linux-gnu-library/3.4/sparklyr/java/commons-csv-1.5.jar at http://127.0.0.1:33434/jars/commons-csv-1.5.jar with timestamp 1533998546668
18/08/11 16:42:26 INFO SparkContext: Added JAR file:/home/dan/R/x86_64-pc-linux-gnu-library/3.4/sparklyr/java/univocity-parsers-1.5.1.jar at http://127.0.0.1:33434/jars/univocity-parsers-1.5.1.jar with timestamp 1533998546678
18/08/11 16:42:26 INFO SparkContext: Added JAR file:/home/dan/R/x86_64-pc-linux-gnu-library/3.4/sparklyr/java/sparklyr-1.6-2.10.jar at http://127.0.0.1:33434/jars/sparklyr-1.6-2.10.jar with timestamp 1533998546690
18/08/11 16:42:26 INFO Executor: Starting executor ID driver on host localhost
18/08/11 16:42:26 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40022.
18/08/11 16:42:26 INFO NettyBlockTransferService: Server created on 40022
18/08/11 16:42:26 INFO BlockManagerMaster: Trying to register BlockManager
18/08/11 16:42:26 INFO BlockManagerMasterEndpoint: Registering block manager localhost:40022 with 511.1 MB RAM, BlockManagerId(driver, localhost, 40022)
18/08/11 16:42:26 INFO BlockManagerMaster: Registered BlockManager
18/08/11 16:42:28 INFO HiveContext: Initializing execution hive, version 1.2.1
18/08/11 16:42:28 INFO ClientWrapper: Inspected Hadoop version: 2.6.0
18/08/11 16:42:28 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
18/08/11 16:42:28 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
18/08/11 16:42:29 INFO ObjectStore: ObjectStore, initialize called
18/08/11 16:42:29 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
18/08/11 16:42:29 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
18/08/11 16:42:29 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
18/08/11 16:42:29 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
18/08/11 16:42:39 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
18/08/11 16:42:41 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
18/08/11 16:42:41 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
18/08/11 16:42:48 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
18/08/11 16:42:48 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
18/08/11 16:42:49 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
18/08/11 16:42:49 INFO ObjectStore: Initialized ObjectStore
18/08/11 16:42:50 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
18/08/11 16:42:50 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
18/08/11 16:42:51 INFO HiveMetaStore: Added admin role in metastore
18/08/11 16:42:51 INFO HiveMetaStore: Added public role in metastore
18/08/11 16:42:51 INFO HiveMetaStore: No user is added in admin role, since config is empty
18/08/11 16:42:51 INFO HiveMetaStore: 0: get_all_databases
18/08/11 16:42:51 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_all_databases
18/08/11 16:42:51 INFO HiveMetaStore: 0: get_functions: db=default pat=*
18/08/11 16:42:51 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_functions: db=default pat=*
18/08/11 16:42:51 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
18/08/11 16:42:53 INFO SessionState: Created local directory: /tmp/956d0d45-ec5f-411e-80b7-f9ddb5fae627_resources
18/08/11 16:42:53 INFO SessionState: Created HDFS directory: /tmp/hive/dan/956d0d45-ec5f-411e-80b7-f9ddb5fae627
18/08/11 16:42:53 INFO SessionState: Created local directory: /tmp/dan/956d0d45-ec5f-411e-80b7-f9ddb5fae627
18/08/11 16:42:53 INFO SessionState: Created HDFS directory: /tmp/hive/dan/956d0d45-ec5f-411e-80b7-f9ddb5fae627/_tmp_space.db
18/08/11 16:42:53 INFO HiveContext: default warehouse location is /user/hive/warehouse
18/08/11 16:42:53 INFO HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
18/08/11 16:42:53 INFO ClientWrapper: Inspected Hadoop version: 2.6.0
18/08/11 16:42:53 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
18/08/11 16:42:54 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
18/08/11 16:42:54 INFO ObjectStore: ObjectStore, initialize called
18/08/11 16:42:54 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
18/08/11 16:42:54 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
18/08/11 16:42:54 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
18/08/11 16:42:54 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
18/08/11 16:42:56 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
18/08/11 16:42:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
18/08/11 16:42:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
18/08/11 16:42:57 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
18/08/11 16:42:57 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
18/08/11 16:42:58 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
18/08/11 16:42:58 INFO ObjectStore: Initialized ObjectStore
18/08/11 16:42:58 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
18/08/11 16:42:58 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
18/08/11 16:42:58 INFO HiveMetaStore: Added admin role in metastore
18/08/11 16:42:58 INFO HiveMetaStore: Added public role in metastore
18/08/11 16:42:58 INFO HiveMetaStore: No user is added in admin role, since config is empty
18/08/11 16:42:58 INFO HiveMetaStore: 0: get_all_databases
18/08/11 16:42:58 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_all_databases
18/08/11 16:42:58 INFO HiveMetaStore: 0: get_functions: db=default pat=*
18/08/11 16:42:58 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_functions: db=default pat=*
18/08/11 16:42:58 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
18/08/11 16:42:58 INFO SessionState: Created local directory: /tmp/1032f698-908f-444b-91d7-5f0361a5fefd_resources
18/08/11 16:42:58 INFO SessionState: Created HDFS directory: /tmp/hive/dan/1032f698-908f-444b-91d7-5f0361a5fefd
18/08/11 16:42:59 INFO SessionState: Created local directory: /tmp/dan/1032f698-908f-444b-91d7-5f0361a5fefd
18/08/11 16:42:59 INFO SessionState: Created HDFS directory: /tmp/hive/dan/1032f698-908f-444b-91d7-5f0361a5fefd/_tmp_space.db
18/08/11 16:42:59 INFO HiveMetaStore: 0: get_tables: db=default pat=.*
18/08/11 16:42:59 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
18/08/11 16:43:00 INFO SparkContext: Starting job: collect at utils.scala:196
18/08/11 16:43:00 INFO DAGScheduler: Got job 0 (collect at utils.scala:196) with 1 output partitions
18/08/11 16:43:00 INFO DAGScheduler: Final stage: ResultStage 0 (collect at utils.scala:196)
18/08/11 16:43:00 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:43:00 INFO DAGScheduler: Missing parents: List()
18/08/11 16:43:00 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at collect at utils.scala:196), which has no missing parents
18/08/11 16:43:00 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1968.0 B, free 1968.0 B)
18/08/11 16:43:00 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1230.0 B, free 3.1 KB)
18/08/11 16:43:00 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:40022 (size: 1230.0 B, free: 511.1 MB)
18/08/11 16:43:00 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1006
18/08/11 16:43:00 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at collect at utils.scala:196)
18/08/11 16:43:00 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
18/08/11 16:43:00 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0,PROCESS_LOCAL, 2348 bytes)
18/08/11 16:43:00 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
18/08/11 16:43:00 INFO Executor: Fetching http://127.0.0.1:33434/jars/sparklyr-1.6-2.10.jar with timestamp 1533998546690
18/08/11 16:43:00 INFO Utils: Fetching http://127.0.0.1:33434/jars/sparklyr-1.6-2.10.jar to /tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d/userFiles-2400d326-ba61-4bd6-bb82-ea58b984db31/fetchFileTemp9120604312438464693.tmp
18/08/11 16:43:00 INFO Executor: Adding file:/tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d/userFiles-2400d326-ba61-4bd6-bb82-ea58b984db31/sparklyr-1.6-2.10.jar to class loader
18/08/11 16:43:00 INFO Executor: Fetching http://127.0.0.1:33434/jars/spark-csv_2.11-1.5.0.jar with timestamp 1533998546667
18/08/11 16:43:00 INFO Utils: Fetching http://127.0.0.1:33434/jars/spark-csv_2.11-1.5.0.jar to /tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d/userFiles-2400d326-ba61-4bd6-bb82-ea58b984db31/fetchFileTemp4133366730172039938.tmp
18/08/11 16:43:00 INFO Executor: Adding file:/tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d/userFiles-2400d326-ba61-4bd6-bb82-ea58b984db31/spark-csv_2.11-1.5.0.jar to class loader
18/08/11 16:43:00 INFO Executor: Fetching http://127.0.0.1:33434/jars/univocity-parsers-1.5.1.jar with timestamp 1533998546678
18/08/11 16:43:00 INFO Utils: Fetching http://127.0.0.1:33434/jars/univocity-parsers-1.5.1.jar to /tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d/userFiles-2400d326-ba61-4bd6-bb82-ea58b984db31/fetchFileTemp8051147936309364000.tmp
18/08/11 16:43:00 INFO Executor: Adding file:/tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d/userFiles-2400d326-ba61-4bd6-bb82-ea58b984db31/univocity-parsers-1.5.1.jar to class loader
18/08/11 16:43:00 INFO Executor: Fetching http://127.0.0.1:33434/jars/commons-csv-1.5.jar with timestamp 1533998546668
18/08/11 16:43:00 INFO Utils: Fetching http://127.0.0.1:33434/jars/commons-csv-1.5.jar to /tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d/userFiles-2400d326-ba61-4bd6-bb82-ea58b984db31/fetchFileTemp6354466603310038965.tmp
18/08/11 16:43:00 INFO Executor: Adding file:/tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d/userFiles-2400d326-ba61-4bd6-bb82-ea58b984db31/commons-csv-1.5.jar to class loader
18/08/11 16:43:00 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 940 bytes result sent to driver
18/08/11 16:43:00 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 240 ms on localhost (1/1)
18/08/11 16:43:00 INFO DAGScheduler: ResultStage 0 (collect at utils.scala:196) finished in 0.256 s
18/08/11 16:43:00 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
18/08/11 16:43:00 INFO DAGScheduler: Job 0 finished: collect at utils.scala:196, took 0.470238 s
18/08/11 16:43:00 INFO HiveMetaStore: 0: get_tables: db=default pat=.*
18/08/11 16:43:00 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
18/08/11 16:43:01 INFO SparkContext: Starting job: collect at utils.scala:196
18/08/11 16:43:01 INFO DAGScheduler: Got job 1 (collect at utils.scala:196) with 1 output partitions
18/08/11 16:43:01 INFO DAGScheduler: Final stage: ResultStage 1 (collect at utils.scala:196)
18/08/11 16:43:01 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:43:01 INFO DAGScheduler: Missing parents: List()
18/08/11 16:43:01 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[3] at collect at utils.scala:196), which has no missing parents
18/08/11 16:43:01 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 1968.0 B, free 5.0 KB)
18/08/11 16:43:01 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1224.0 B, free 6.2 KB)
18/08/11 16:43:01 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:40022 (size: 1224.0 B, free: 511.1 MB)
18/08/11 16:43:01 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
18/08/11 16:43:01 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[3] at collect at utils.scala:196)
18/08/11 16:43:01 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
18/08/11 16:43:01 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, partition 0,PROCESS_LOCAL, 2348 bytes)
18/08/11 16:43:01 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
18/08/11 16:43:01 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 940 bytes result sent to driver
18/08/11 16:43:01 INFO DAGScheduler: ResultStage 1 (collect at utils.scala:196) finished in 0.004 s
18/08/11 16:43:01 INFO DAGScheduler: Job 1 finished: collect at utils.scala:196, took 0.020377 s
18/08/11 16:43:01 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 10 ms on localhost (1/1)
18/08/11 16:43:01 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
18/08/11 16:43:17 INFO HiveMetaStore: 0: get_tables: db=default pat=.*
18/08/11 16:43:17 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
18/08/11 16:43:17 INFO SparkContext: Starting job: collect at utils.scala:43
18/08/11 16:43:17 INFO DAGScheduler: Got job 2 (collect at utils.scala:43) with 1 output partitions
18/08/11 16:43:17 INFO DAGScheduler: Final stage: ResultStage 2 (collect at utils.scala:43)
18/08/11 16:43:17 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:43:17 INFO DAGScheduler: Missing parents: List()
18/08/11 16:43:17 INFO DAGScheduler: Submitting ResultStage 2 (MapPartitionsRDD[7] at map at utils.scala:40), which has no missing parents
18/08/11 16:43:17 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 5.4 KB, free 11.7 KB)
18/08/11 16:43:17 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 3.0 KB, free 14.7 KB)
18/08/11 16:43:17 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:40022 (size: 3.0 KB, free: 511.1 MB)
18/08/11 16:43:17 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1006
18/08/11 16:43:17 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (MapPartitionsRDD[7] at map at utils.scala:40)
18/08/11 16:43:17 INFO TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
18/08/11 16:43:17 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 2, localhost, partition 0,PROCESS_LOCAL, 2348 bytes)
18/08/11 16:43:17 INFO Executor: Running task 0.0 in stage 2.0 (TID 2)
18/08/11 16:43:17 INFO GenerateUnsafeProjection: Code generated in 111.174192 ms
18/08/11 16:43:17 INFO Executor: Finished task 0.0 in stage 2.0 (TID 2). 1060 bytes result sent to driver
18/08/11 16:43:17 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID 2) in 156 ms on localhost (1/1)
18/08/11 16:43:17 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool
18/08/11 16:43:17 INFO DAGScheduler: ResultStage 2 (collect at utils.scala:43) finished in 0.157 s
18/08/11 16:43:17 INFO DAGScheduler: Job 2 finished: collect at utils.scala:43, took 0.165754 s
18/08/11 16:43:17 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 61.8 KB, free 76.4 KB)
18/08/11 16:43:17 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 19.3 KB, free 95.8 KB)
18/08/11 16:43:17 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on localhost:40022 (size: 19.3 KB, free: 511.1 MB)
18/08/11 16:43:17 INFO SparkContext: Created broadcast 3 from textFile at TextFile.scala:30
18/08/11 16:43:55 INFO HiveMetaStore: 0: get_tables: db=default pat=.*
18/08/11 16:43:55 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
18/08/11 16:43:55 INFO SparkContext: Starting job: collect at utils.scala:43
18/08/11 16:43:55 INFO DAGScheduler: Got job 3 (collect at utils.scala:43) with 1 output partitions
18/08/11 16:43:55 INFO DAGScheduler: Final stage: ResultStage 3 (collect at utils.scala:43)
18/08/11 16:43:55 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:43:55 INFO DAGScheduler: Missing parents: List()
18/08/11 16:43:55 INFO DAGScheduler: Submitting ResultStage 3 (MapPartitionsRDD[14] at map at utils.scala:40), which has no missing parents
18/08/11 16:43:55 INFO MemoryStore: Block broadcast_4 stored as values in memory (estimated size 5.4 KB, free 101.2 KB)
18/08/11 16:43:55 INFO MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 3.0 KB, free 104.2 KB)
18/08/11 16:43:55 INFO BlockManagerInfo: Added broadcast_4_piece0 in memory on localhost:40022 (size: 3.0 KB, free: 511.1 MB)
18/08/11 16:43:55 INFO SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:1006
18/08/11 16:43:55 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 3 (MapPartitionsRDD[14] at map at utils.scala:40)
18/08/11 16:43:55 INFO TaskSchedulerImpl: Adding task set 3.0 with 1 tasks
18/08/11 16:43:55 INFO TaskSetManager: Starting task 0.0 in stage 3.0 (TID 3, localhost, partition 0,PROCESS_LOCAL, 2348 bytes)
18/08/11 16:43:55 INFO Executor: Running task 0.0 in stage 3.0 (TID 3)
18/08/11 16:43:55 INFO Executor: Finished task 0.0 in stage 3.0 (TID 3). 1060 bytes result sent to driver
18/08/11 16:43:55 INFO TaskSetManager: Finished task 0.0 in stage 3.0 (TID 3) in 12 ms on localhost (1/1)
18/08/11 16:43:55 INFO TaskSchedulerImpl: Removed TaskSet 3.0, whose tasks have all completed, from pool
18/08/11 16:43:55 INFO DAGScheduler: ResultStage 3 (collect at utils.scala:43) finished in 0.013 s
18/08/11 16:43:55 INFO DAGScheduler: Job 3 finished: collect at utils.scala:43, took 0.029593 s
18/08/11 16:43:55 INFO MemoryStore: Block broadcast_5 stored as values in memory (estimated size 208.5 KB, free 312.7 KB)
18/08/11 16:43:55 INFO MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 19.3 KB, free 332.0 KB)
18/08/11 16:43:55 INFO BlockManagerInfo: Added broadcast_5_piece0 in memory on localhost:40022 (size: 19.3 KB, free: 511.1 MB)
18/08/11 16:43:55 INFO SparkContext: Created broadcast 5 from textFile at TextFile.scala:30
18/08/11 16:44:32 INFO HiveMetaStore: 0: get_tables: db=default pat=.*
18/08/11 16:44:32 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
18/08/11 16:44:32 INFO SparkContext: Starting job: collect at utils.scala:43
18/08/11 16:44:32 INFO DAGScheduler: Got job 4 (collect at utils.scala:43) with 1 output partitions
18/08/11 16:44:32 INFO DAGScheduler: Final stage: ResultStage 4 (collect at utils.scala:43)
18/08/11 16:44:32 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:44:32 INFO DAGScheduler: Missing parents: List()
18/08/11 16:44:32 INFO DAGScheduler: Submitting ResultStage 4 (MapPartitionsRDD[21] at map at utils.scala:40), which has no missing parents
18/08/11 16:44:32 INFO MemoryStore: Block broadcast_6 stored as values in memory (estimated size 5.4 KB, free 337.4 KB)
18/08/11 16:44:32 INFO MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 3.0 KB, free 340.4 KB)
18/08/11 16:44:32 INFO BlockManagerInfo: Added broadcast_6_piece0 in memory on localhost:40022 (size: 3.0 KB, free: 511.1 MB)
18/08/11 16:44:32 INFO SparkContext: Created broadcast 6 from broadcast at DAGScheduler.scala:1006
18/08/11 16:44:32 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 4 (MapPartitionsRDD[21] at map at utils.scala:40)
18/08/11 16:44:32 INFO TaskSchedulerImpl: Adding task set 4.0 with 1 tasks
18/08/11 16:44:32 INFO TaskSetManager: Starting task 0.0 in stage 4.0 (TID 4, localhost, partition 0,PROCESS_LOCAL, 2348 bytes)
18/08/11 16:44:32 INFO Executor: Running task 0.0 in stage 4.0 (TID 4)
18/08/11 16:44:32 INFO Executor: Finished task 0.0 in stage 4.0 (TID 4). 1060 bytes result sent to driver
18/08/11 16:44:32 INFO TaskSetManager: Finished task 0.0 in stage 4.0 (TID 4) in 9 ms on localhost (1/1)
18/08/11 16:44:32 INFO TaskSchedulerImpl: Removed TaskSet 4.0, whose tasks have all completed, from pool
18/08/11 16:44:32 INFO DAGScheduler: ResultStage 4 (collect at utils.scala:43) finished in 0.010 s
18/08/11 16:44:32 INFO DAGScheduler: Job 4 finished: collect at utils.scala:43, took 0.022599 s
18/08/11 16:44:32 INFO MemoryStore: Block broadcast_7 stored as values in memory (estimated size 208.5 KB, free 548.9 KB)
18/08/11 16:44:32 INFO MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 19.3 KB, free 568.3 KB)
18/08/11 16:44:32 INFO BlockManagerInfo: Added broadcast_7_piece0 in memory on localhost:40022 (size: 19.3 KB, free: 511.1 MB)
18/08/11 16:44:32 INFO SparkContext: Created broadcast 7 from textFile at TextFile.scala:30
18/08/11 16:44:53 INFO HiveMetaStore: 0: get_tables: db=default pat=.*
18/08/11 16:44:53 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
18/08/11 16:44:53 INFO SparkContext: Starting job: collect at utils.scala:43
18/08/11 16:44:53 INFO DAGScheduler: Got job 5 (collect at utils.scala:43) with 1 output partitions
18/08/11 16:44:53 INFO DAGScheduler: Final stage: ResultStage 5 (collect at utils.scala:43)
18/08/11 16:44:53 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:44:53 INFO DAGScheduler: Missing parents: List()
18/08/11 16:44:53 INFO DAGScheduler: Submitting ResultStage 5 (MapPartitionsRDD[28] at map at utils.scala:40), which has no missing parents
18/08/11 16:44:53 INFO MemoryStore: Block broadcast_8 stored as values in memory (estimated size 5.4 KB, free 573.7 KB)
18/08/11 16:44:53 INFO MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 3.0 KB, free 576.7 KB)
18/08/11 16:44:53 INFO BlockManagerInfo: Added broadcast_8_piece0 in memory on localhost:40022 (size: 3.0 KB, free: 511.1 MB)
18/08/11 16:44:53 INFO SparkContext: Created broadcast 8 from broadcast at DAGScheduler.scala:1006
18/08/11 16:44:53 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 5 (MapPartitionsRDD[28] at map at utils.scala:40)
18/08/11 16:44:53 INFO TaskSchedulerImpl: Adding task set 5.0 with 1 tasks
18/08/11 16:44:53 INFO TaskSetManager: Starting task 0.0 in stage 5.0 (TID 5, localhost, partition 0,PROCESS_LOCAL, 2348 bytes)
18/08/11 16:44:53 INFO Executor: Running task 0.0 in stage 5.0 (TID 5)
18/08/11 16:44:53 INFO Executor: Finished task 0.0 in stage 5.0 (TID 5). 1060 bytes result sent to driver
18/08/11 16:44:53 INFO TaskSetManager: Finished task 0.0 in stage 5.0 (TID 5) in 8 ms on localhost (1/1)
18/08/11 16:44:53 INFO DAGScheduler: ResultStage 5 (collect at utils.scala:43) finished in 0.009 s
18/08/11 16:44:53 INFO TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool
18/08/11 16:44:53 INFO DAGScheduler: Job 5 finished: collect at utils.scala:43, took 0.018453 s
18/08/11 16:44:53 INFO MemoryStore: Block broadcast_9 stored as values in memory (estimated size 208.5 KB, free 785.2 KB)
18/08/11 16:44:53 INFO MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 19.3 KB, free 804.5 KB)
18/08/11 16:44:53 INFO BlockManagerInfo: Added broadcast_9_piece0 in memory on localhost:40022 (size: 19.3 KB, free: 511.0 MB)
18/08/11 16:44:53 INFO SparkContext: Created broadcast 9 from textFile at TextFile.scala:30
18/08/11 16:45:13 INFO HiveMetaStore: 0: get_tables: db=default pat=.*
18/08/11 16:45:13 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
18/08/11 16:45:13 INFO SparkContext: Starting job: collect at utils.scala:43
18/08/11 16:45:13 INFO DAGScheduler: Got job 6 (collect at utils.scala:43) with 1 output partitions
18/08/11 16:45:13 INFO DAGScheduler: Final stage: ResultStage 6 (collect at utils.scala:43)
18/08/11 16:45:13 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:45:13 INFO DAGScheduler: Missing parents: List()
18/08/11 16:45:13 INFO DAGScheduler: Submitting ResultStage 6 (MapPartitionsRDD[35] at map at utils.scala:40), which has no missing parents
18/08/11 16:45:13 INFO MemoryStore: Block broadcast_10 stored as values in memory (estimated size 5.4 KB, free 809.9 KB)
18/08/11 16:45:13 INFO MemoryStore: Block broadcast_10_piece0 stored as bytes in memory (estimated size 3.0 KB, free 812.9 KB)
18/08/11 16:45:13 INFO BlockManagerInfo: Added broadcast_10_piece0 in memory on localhost:40022 (size: 3.0 KB, free: 511.0 MB)
18/08/11 16:45:13 INFO SparkContext: Created broadcast 10 from broadcast at DAGScheduler.scala:1006
18/08/11 16:45:13 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 6 (MapPartitionsRDD[35] at map at utils.scala:40)
18/08/11 16:45:13 INFO TaskSchedulerImpl: Adding task set 6.0 with 1 tasks
18/08/11 16:45:13 INFO TaskSetManager: Starting task 0.0 in stage 6.0 (TID 6, localhost, partition 0,PROCESS_LOCAL, 2348 bytes)
18/08/11 16:45:13 INFO Executor: Running task 0.0 in stage 6.0 (TID 6)
18/08/11 16:45:13 INFO Executor: Finished task 0.0 in stage 6.0 (TID 6). 1060 bytes result sent to driver
18/08/11 16:45:13 INFO TaskSetManager: Finished task 0.0 in stage 6.0 (TID 6) in 5 ms on localhost (1/1)
18/08/11 16:45:13 INFO TaskSchedulerImpl: Removed TaskSet 6.0, whose tasks have all completed, from pool
18/08/11 16:45:13 INFO DAGScheduler: ResultStage 6 (collect at utils.scala:43) finished in 0.005 s
18/08/11 16:45:13 INFO DAGScheduler: Job 6 finished: collect at utils.scala:43, took 0.016220 s
18/08/11 16:45:13 INFO MemoryStore: Block broadcast_11 stored as values in memory (estimated size 208.5 KB, free 1021.4 KB)
18/08/11 16:45:13 INFO MemoryStore: Block broadcast_11_piece0 stored as bytes in memory (estimated size 19.3 KB, free 1040.7 KB)
18/08/11 16:45:13 INFO BlockManagerInfo: Added broadcast_11_piece0 in memory on localhost:40022 (size: 19.3 KB, free: 511.0 MB)
18/08/11 16:45:13 INFO SparkContext: Created broadcast 11 from textFile at TextFile.scala:30
18/08/11 16:45:13 INFO FileInputFormat: Total input paths to process : 1
18/08/11 16:45:13 INFO SparkContext: Starting job: first at CsvRelation.scala:269
18/08/11 16:45:13 INFO DAGScheduler: Got job 7 (first at CsvRelation.scala:269) with 1 output partitions
18/08/11 16:45:13 INFO DAGScheduler: Final stage: ResultStage 7 (first at CsvRelation.scala:269)
18/08/11 16:45:13 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:45:13 INFO DAGScheduler: Missing parents: List()
18/08/11 16:45:13 INFO DAGScheduler: Submitting ResultStage 7 (MapPartitionsRDD[38] at filter at CsvRelation.scala:267), which has no missing parents
18/08/11 16:45:13 INFO MemoryStore: Block broadcast_12 stored as values in memory (estimated size 5.0 KB, free 1045.8 KB)
18/08/11 16:45:13 INFO MemoryStore: Block broadcast_12_piece0 stored as bytes in memory (estimated size 3.0 KB, free 1048.7 KB)
18/08/11 16:45:13 INFO BlockManagerInfo: Added broadcast_12_piece0 in memory on localhost:40022 (size: 3.0 KB, free: 511.0 MB)
18/08/11 16:45:13 INFO SparkContext: Created broadcast 12 from broadcast at DAGScheduler.scala:1006
18/08/11 16:45:13 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 7 (MapPartitionsRDD[38] at filter at CsvRelation.scala:267)
18/08/11 16:45:13 INFO TaskSchedulerImpl: Adding task set 7.0 with 1 tasks
18/08/11 16:45:13 INFO TaskSetManager: Starting task 0.0 in stage 7.0 (TID 7, localhost, partition 0,PROCESS_LOCAL, 2422 bytes)
18/08/11 16:45:13 INFO Executor: Running task 0.0 in stage 7.0 (TID 7)
18/08/11 16:45:13 INFO HadoopRDD: Input split: file:/home/dan/Documents/GitArchive/roseTinted/in-grey-seg/array.all.trim.csv:0+32627
18/08/11 16:45:13 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
18/08/11 16:45:13 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
18/08/11 16:45:13 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
18/08/11 16:45:13 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
18/08/11 16:45:13 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
18/08/11 16:45:14 INFO Executor: Finished task 0.0 in stage 7.0 (TID 7). 2494 bytes result sent to driver
18/08/11 16:45:14 INFO TaskSetManager: Finished task 0.0 in stage 7.0 (TID 7) in 65 ms on localhost (1/1)
18/08/11 16:45:14 INFO TaskSchedulerImpl: Removed TaskSet 7.0, whose tasks have all completed, from pool
18/08/11 16:45:14 INFO DAGScheduler: ResultStage 7 (first at CsvRelation.scala:269) finished in 0.065 s
18/08/11 16:45:14 INFO DAGScheduler: Job 7 finished: first at CsvRelation.scala:269, took 0.078872 s
18/08/11 16:45:14 INFO MemoryStore: Block broadcast_13 stored as values in memory (estimated size 208.5 KB, free 1257.2 KB)
18/08/11 16:45:14 INFO MemoryStore: Block broadcast_13_piece0 stored as bytes in memory (estimated size 19.3 KB, free 1276.5 KB)
18/08/11 16:45:14 INFO BlockManagerInfo: Added broadcast_13_piece0 in memory on localhost:40022 (size: 19.3 KB, free: 511.0 MB)
18/08/11 16:45:14 INFO SparkContext: Created broadcast 13 from textFile at TextFile.scala:30
18/08/11 16:45:14 INFO FileInputFormat: Total input paths to process : 1
18/08/11 16:45:14 INFO SparkContext: Starting job: aggregate at InferSchema.scala:41
18/08/11 16:45:14 INFO DAGScheduler: Got job 8 (aggregate at InferSchema.scala:41) with 2 output partitions
18/08/11 16:45:14 INFO DAGScheduler: Final stage: ResultStage 8 (aggregate at InferSchema.scala:41)
18/08/11 16:45:14 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:45:14 INFO DAGScheduler: Missing parents: List()
18/08/11 16:45:14 INFO DAGScheduler: Submitting ResultStage 8 (MapPartitionsRDD[41] at mapPartitions at CsvRelation.scala:92), which has no missing parents
18/08/11 16:45:14 INFO MemoryStore: Block broadcast_14 stored as values in memory (estimated size 7.0 KB, free 1283.6 KB)
18/08/11 16:45:14 INFO MemoryStore: Block broadcast_14_piece0 stored as bytes in memory (estimated size 3.9 KB, free 1287.5 KB)
18/08/11 16:45:14 INFO BlockManagerInfo: Added broadcast_14_piece0 in memory on localhost:40022 (size: 3.9 KB, free: 511.0 MB)
18/08/11 16:45:14 INFO SparkContext: Created broadcast 14 from broadcast at DAGScheduler.scala:1006
18/08/11 16:45:14 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 8 (MapPartitionsRDD[41] at mapPartitions at CsvRelation.scala:92)
18/08/11 16:45:14 INFO TaskSchedulerImpl: Adding task set 8.0 with 2 tasks
18/08/11 16:45:14 INFO TaskSetManager: Starting task 0.0 in stage 8.0 (TID 8, localhost, partition 0,PROCESS_LOCAL, 2422 bytes)
18/08/11 16:45:14 INFO TaskSetManager: Starting task 1.0 in stage 8.0 (TID 9, localhost, partition 1,PROCESS_LOCAL, 2422 bytes)
18/08/11 16:45:14 INFO Executor: Running task 0.0 in stage 8.0 (TID 8)
18/08/11 16:45:14 INFO HadoopRDD: Input split: file:/home/dan/Documents/GitArchive/roseTinted/in-grey-seg/array.all.trim.csv:0+32627
18/08/11 16:45:14 INFO Executor: Running task 1.0 in stage 8.0 (TID 9)
18/08/11 16:45:14 INFO HadoopRDD: Input split: file:/home/dan/Documents/GitArchive/roseTinted/in-grey-seg/array.all.trim.csv:32627+32628
18/08/11 16:45:14 INFO Executor: Finished task 1.0 in stage 8.0 (TID 9). 2372 bytes result sent to driver
18/08/11 16:45:14 INFO TaskSetManager: Finished task 1.0 in stage 8.0 (TID 9) in 88 ms on localhost (1/2)
18/08/11 16:45:14 INFO Executor: Finished task 0.0 in stage 8.0 (TID 8). 2372 bytes result sent to driver
18/08/11 16:45:14 INFO TaskSetManager: Finished task 0.0 in stage 8.0 (TID 8) in 105 ms on localhost (2/2)
18/08/11 16:45:14 INFO DAGScheduler: ResultStage 8 (aggregate at InferSchema.scala:41) finished in 0.103 s
18/08/11 16:45:14 INFO TaskSchedulerImpl: Removed TaskSet 8.0, whose tasks have all completed, from pool
18/08/11 16:45:14 INFO DAGScheduler: Job 8 finished: aggregate at InferSchema.scala:41, took 0.113975 s
18/08/11 16:45:14 INFO HiveMetaStore: 0: get_tables: db=default pat=.*
18/08/11 16:45:14 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
18/08/11 16:45:14 INFO SparkContext: Starting job: collect at utils.scala:196
18/08/11 16:45:14 INFO DAGScheduler: Got job 9 (collect at utils.scala:196) with 1 output partitions
18/08/11 16:45:14 INFO DAGScheduler: Final stage: ResultStage 9 (collect at utils.scala:196)
18/08/11 16:45:14 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:45:14 INFO DAGScheduler: Missing parents: List()
18/08/11 16:45:14 INFO DAGScheduler: Submitting ResultStage 9 (MapPartitionsRDD[43] at collect at utils.scala:196), which has no missing parents
18/08/11 16:45:14 INFO MemoryStore: Block broadcast_15 stored as values in memory (estimated size 1968.0 B, free 1289.4 KB)
18/08/11 16:45:14 INFO MemoryStore: Block broadcast_15_piece0 stored as bytes in memory (estimated size 1224.0 B, free 1290.6 KB)
18/08/11 16:45:14 INFO BlockManagerInfo: Added broadcast_15_piece0 in memory on localhost:40022 (size: 1224.0 B, free: 511.0 MB)
18/08/11 16:45:14 INFO SparkContext: Created broadcast 15 from broadcast at DAGScheduler.scala:1006
18/08/11 16:45:14 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 9 (MapPartitionsRDD[43] at collect at utils.scala:196)
18/08/11 16:45:14 INFO TaskSchedulerImpl: Adding task set 9.0 with 1 tasks
18/08/11 16:45:14 INFO TaskSetManager: Starting task 0.0 in stage 9.0 (TID 10, localhost, partition 0,PROCESS_LOCAL, 2348 bytes)
18/08/11 16:45:14 INFO Executor: Running task 0.0 in stage 9.0 (TID 10)
18/08/11 16:45:14 INFO Executor: Finished task 0.0 in stage 9.0 (TID 10). 940 bytes result sent to driver
18/08/11 16:45:14 INFO TaskSetManager: Finished task 0.0 in stage 9.0 (TID 10) in 6 ms on localhost (1/1)
18/08/11 16:45:14 INFO TaskSchedulerImpl: Removed TaskSet 9.0, whose tasks have all completed, from pool
18/08/11 16:45:14 INFO DAGScheduler: ResultStage 9 (collect at utils.scala:196) finished in 0.003 s
18/08/11 16:45:14 INFO DAGScheduler: Job 9 finished: collect at utils.scala:196, took 0.013711 s
18/08/11 16:45:14 INFO MemoryStore: Block broadcast_16 stored as values in memory (estimated size 208.5 KB, free 1499.1 KB)
18/08/11 16:45:14 INFO MemoryStore: Block broadcast_16_piece0 stored as bytes in memory (estimated size 19.3 KB, free 1518.4 KB)
18/08/11 16:45:14 INFO BlockManagerInfo: Added broadcast_16_piece0 in memory on localhost:40022 (size: 19.3 KB, free: 511.0 MB)
18/08/11 16:45:14 INFO SparkContext: Created broadcast 16 from textFile at TextFile.scala:30
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_7_piece0 on localhost:40022 in memory (size: 19.3 KB, free: 511.0 MB)
18/08/11 16:45:14 INFO FileInputFormat: Total input paths to process : 1
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_6_piece0 on localhost:40022 in memory (size: 3.0 KB, free: 511.0 MB)
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 9
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 8
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_5_piece0 on localhost:40022 in memory (size: 19.3 KB, free: 511.0 MB)
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_4_piece0 on localhost:40022 in memory (size: 3.0 KB, free: 511.0 MB)
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 7
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 6
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_3_piece0 on localhost:40022 in memory (size: 19.3 KB, free: 511.0 MB)
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_2_piece0 on localhost:40022 in memory (size: 3.0 KB, free: 511.0 MB)
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 5
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 4
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_1_piece0 on localhost:40022 in memory (size: 1224.0 B, free: 511.0 MB)
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 2
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_0_piece0 on localhost:40022 in memory (size: 1230.0 B, free: 511.0 MB)
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 1
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_15_piece0 on localhost:40022 in memory (size: 1224.0 B, free: 511.0 MB)
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 16
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_14_piece0 on localhost:40022 in memory (size: 3.9 KB, free: 511.0 MB)
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 15
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_13_piece0 on localhost:40022 in memory (size: 19.3 KB, free: 511.1 MB)
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_12_piece0 on localhost:40022 in memory (size: 3.0 KB, free: 511.1 MB)
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 14
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_11_piece0 on localhost:40022 in memory (size: 19.3 KB, free: 511.1 MB)
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_10_piece0 on localhost:40022 in memory (size: 3.0 KB, free: 511.1 MB)
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 13
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 12
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_9_piece0 on localhost:40022 in memory (size: 19.3 KB, free: 511.1 MB)
18/08/11 16:45:14 INFO BlockManagerInfo: Removed broadcast_8_piece0 on localhost:40022 in memory (size: 3.0 KB, free: 511.1 MB)
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 11
18/08/11 16:45:14 INFO ContextCleaner: Cleaned accumulator 10
18/08/11 16:45:14 INFO SparkContext: Starting job: sql at NativeMethodAccessorImpl.java:-2
18/08/11 16:45:14 INFO DAGScheduler: Registering RDD 53 (sql at NativeMethodAccessorImpl.java:-2)
18/08/11 16:45:14 INFO DAGScheduler: Got job 10 (sql at NativeMethodAccessorImpl.java:-2) with 1 output partitions
18/08/11 16:45:14 INFO DAGScheduler: Final stage: ResultStage 11 (sql at NativeMethodAccessorImpl.java:-2)
18/08/11 16:45:14 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 10)
18/08/11 16:45:14 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 10)
18/08/11 16:45:14 INFO DAGScheduler: Submitting ShuffleMapStage 10 (MapPartitionsRDD[53] at sql at NativeMethodAccessorImpl.java:-2), which has no missing parents
18/08/11 16:45:14 INFO MemoryStore: Block broadcast_17 stored as values in memory (estimated size 38.5 KB, free 266.3 KB)
18/08/11 16:45:14 INFO MemoryStore: Block broadcast_17_piece0 stored as bytes in memory (estimated size 14.6 KB, free 280.9 KB)
18/08/11 16:45:14 INFO BlockManagerInfo: Added broadcast_17_piece0 in memory on localhost:40022 (size: 14.6 KB, free: 511.1 MB)
18/08/11 16:45:14 INFO SparkContext: Created broadcast 17 from broadcast at DAGScheduler.scala:1006
18/08/11 16:45:14 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 10 (MapPartitionsRDD[53] at sql at NativeMethodAccessorImpl.java:-2)
18/08/11 16:45:14 INFO TaskSchedulerImpl: Adding task set 10.0 with 2 tasks
18/08/11 16:45:14 INFO TaskSetManager: Starting task 0.0 in stage 10.0 (TID 11, localhost, partition 0,PROCESS_LOCAL, 2411 bytes)
18/08/11 16:45:14 INFO TaskSetManager: Starting task 1.0 in stage 10.0 (TID 12, localhost, partition 1,PROCESS_LOCAL, 2411 bytes)
18/08/11 16:45:14 INFO Executor: Running task 0.0 in stage 10.0 (TID 11)
18/08/11 16:45:14 INFO Executor: Running task 1.0 in stage 10.0 (TID 12)
18/08/11 16:45:14 INFO CacheManager: Partition rdd_50_0 not found, computing it
18/08/11 16:45:14 INFO HadoopRDD: Input split: file:/home/dan/Documents/GitArchive/roseTinted/in-grey-seg/array.all.trim.csv:0+32627
18/08/11 16:45:14 INFO CacheManager: Partition rdd_50_1 not found, computing it
18/08/11 16:45:14 INFO HadoopRDD: Input split: file:/home/dan/Documents/GitArchive/roseTinted/in-grey-seg/array.all.trim.csv:32627+32628
18/08/11 16:45:14 INFO GenerateUnsafeProjection: Code generated in 65.370387 ms
18/08/11 16:45:15 INFO MemoryStore: Block rdd_50_1 stored as values in memory (estimated size 18.1 KB, free 299.0 KB)
18/08/11 16:45:15 INFO BlockManagerInfo: Added rdd_50_1 in memory on localhost:40022 (size: 18.1 KB, free: 511.1 MB)
18/08/11 16:45:15 INFO MemoryStore: Block rdd_50_0 stored as values in memory (estimated size 18.3 KB, free 317.3 KB)
18/08/11 16:45:15 INFO BlockManagerInfo: Added rdd_50_0 in memory on localhost:40022 (size: 18.3 KB, free: 511.1 MB)
18/08/11 16:45:15 INFO GeneratePredicate: Code generated in 5.955101 ms
18/08/11 16:45:15 INFO GenerateColumnAccessor: Code generated in 27.563098 ms
18/08/11 16:45:15 INFO GenerateMutableProjection: Code generated in 10.968039 ms
18/08/11 16:45:15 INFO GenerateUnsafeProjection: Code generated in 10.09768 ms
18/08/11 16:45:15 INFO GenerateMutableProjection: Code generated in 15.577504 ms
18/08/11 16:45:15 INFO GenerateUnsafeRowJoiner: Code generated in 9.550186 ms
18/08/11 16:45:15 INFO GenerateUnsafeProjection: Code generated in 9.084701 ms
18/08/11 16:45:15 INFO Executor: Finished task 0.0 in stage 10.0 (TID 11). 5155 bytes result sent to driver
18/08/11 16:45:15 INFO Executor: Finished task 1.0 in stage 10.0 (TID 12). 5155 bytes result sent to driver
18/08/11 16:45:15 INFO TaskSetManager: Finished task 0.0 in stage 10.0 (TID 11) in 513 ms on localhost (1/2)
18/08/11 16:45:15 INFO TaskSetManager: Finished task 1.0 in stage 10.0 (TID 12) in 510 ms on localhost (2/2)
18/08/11 16:45:15 INFO TaskSchedulerImpl: Removed TaskSet 10.0, whose tasks have all completed, from pool
18/08/11 16:45:15 INFO DAGScheduler: ShuffleMapStage 10 (sql at NativeMethodAccessorImpl.java:-2) finished in 0.514 s
18/08/11 16:45:15 INFO DAGScheduler: looking for newly runnable stages
18/08/11 16:45:15 INFO DAGScheduler: running: Set()
18/08/11 16:45:15 INFO DAGScheduler: waiting: Set(ResultStage 11)
18/08/11 16:45:15 INFO DAGScheduler: failed: Set()
18/08/11 16:45:15 INFO DAGScheduler: Submitting ResultStage 11 (MapPartitionsRDD[56] at sql at NativeMethodAccessorImpl.java:-2), which has no missing parents
18/08/11 16:45:15 INFO MemoryStore: Block broadcast_18 stored as values in memory (estimated size 9.3 KB, free 326.6 KB)
18/08/11 16:45:15 INFO MemoryStore: Block broadcast_18_piece0 stored as bytes in memory (estimated size 4.6 KB, free 331.2 KB)
18/08/11 16:45:15 INFO BlockManagerInfo: Added broadcast_18_piece0 in memory on localhost:40022 (size: 4.6 KB, free: 511.1 MB)
18/08/11 16:45:15 INFO SparkContext: Created broadcast 18 from broadcast at DAGScheduler.scala:1006
18/08/11 16:45:15 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 11 (MapPartitionsRDD[56] at sql at NativeMethodAccessorImpl.java:-2)
18/08/11 16:45:15 INFO TaskSchedulerImpl: Adding task set 11.0 with 1 tasks
18/08/11 16:45:15 INFO TaskSetManager: Starting task 0.0 in stage 11.0 (TID 13, localhost, partition 0,NODE_LOCAL, 2242 bytes)
18/08/11 16:45:15 INFO Executor: Running task 0.0 in stage 11.0 (TID 13)
18/08/11 16:45:15 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
18/08/11 16:45:15 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 4 ms
18/08/11 16:45:15 INFO GenerateMutableProjection: Code generated in 10.206236 ms
18/08/11 16:45:15 INFO GenerateMutableProjection: Code generated in 9.211049 ms
18/08/11 16:45:15 INFO Executor: Finished task 0.0 in stage 11.0 (TID 13). 1830 bytes result sent to driver
18/08/11 16:45:15 INFO TaskSetManager: Finished task 0.0 in stage 11.0 (TID 13) in 139 ms on localhost (1/1)
18/08/11 16:45:15 INFO TaskSchedulerImpl: Removed TaskSet 11.0, whose tasks have all completed, from pool
18/08/11 16:45:15 INFO DAGScheduler: ResultStage 11 (sql at NativeMethodAccessorImpl.java:-2) finished in 0.140 s
18/08/11 16:45:15 INFO DAGScheduler: Job 10 finished: sql at NativeMethodAccessorImpl.java:-2, took 0.722466 s
18/08/11 16:45:15 INFO ParseDriver: Parsing command: SELECT count(*) FROM `featLib`
18/08/11 16:45:16 INFO ParseDriver: Parse Completed
18/08/11 16:45:16 INFO SparkContext: Starting job: collect at utils.scala:196
18/08/11 16:45:16 INFO DAGScheduler: Registering RDD 60 (collect at utils.scala:196)
18/08/11 16:45:16 INFO DAGScheduler: Got job 11 (collect at utils.scala:196) with 1 output partitions
18/08/11 16:45:16 INFO DAGScheduler: Final stage: ResultStage 13 (collect at utils.scala:196)
18/08/11 16:45:16 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 12)
18/08/11 16:45:16 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 12)
18/08/11 16:45:16 INFO DAGScheduler: Submitting ShuffleMapStage 12 (MapPartitionsRDD[60] at collect at utils.scala:196), which has no missing parents
18/08/11 16:45:16 INFO MemoryStore: Block broadcast_19 stored as values in memory (estimated size 38.6 KB, free 369.8 KB)
18/08/11 16:45:16 INFO MemoryStore: Block broadcast_19_piece0 stored as bytes in memory (estimated size 14.7 KB, free 384.4 KB)
18/08/11 16:45:16 INFO BlockManagerInfo: Added broadcast_19_piece0 in memory on localhost:40022 (size: 14.7 KB, free: 511.0 MB)
18/08/11 16:45:16 INFO SparkContext: Created broadcast 19 from broadcast at DAGScheduler.scala:1006
18/08/11 16:45:16 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 12 (MapPartitionsRDD[60] at collect at utils.scala:196)
18/08/11 16:45:16 INFO TaskSchedulerImpl: Adding task set 12.0 with 2 tasks
18/08/11 16:45:16 INFO TaskSetManager: Starting task 0.0 in stage 12.0 (TID 14, localhost, partition 0,PROCESS_LOCAL, 2411 bytes)
18/08/11 16:45:16 INFO TaskSetManager: Starting task 1.0 in stage 12.0 (TID 15, localhost, partition 1,PROCESS_LOCAL, 2411 bytes)
18/08/11 16:45:16 INFO Executor: Running task 0.0 in stage 12.0 (TID 14)
18/08/11 16:45:16 INFO Executor: Running task 1.0 in stage 12.0 (TID 15)
18/08/11 16:45:16 INFO BlockManager: Found block rdd_50_1 locally
18/08/11 16:45:16 INFO BlockManager: Found block rdd_50_0 locally
18/08/11 16:45:16 INFO Executor: Finished task 1.0 in stage 12.0 (TID 15). 2722 bytes result sent to driver
18/08/11 16:45:16 INFO TaskSetManager: Finished task 1.0 in stage 12.0 (TID 15) in 21 ms on localhost (1/2)
18/08/11 16:45:16 INFO Executor: Finished task 0.0 in stage 12.0 (TID 14). 2722 bytes result sent to driver
18/08/11 16:45:16 INFO TaskSetManager: Finished task 0.0 in stage 12.0 (TID 14) in 25 ms on localhost (2/2)
18/08/11 16:45:16 INFO TaskSchedulerImpl: Removed TaskSet 12.0, whose tasks have all completed, from pool
18/08/11 16:45:16 INFO DAGScheduler: ShuffleMapStage 12 (collect at utils.scala:196) finished in 0.024 s
18/08/11 16:45:16 INFO DAGScheduler: looking for newly runnable stages
18/08/11 16:45:16 INFO DAGScheduler: running: Set()
18/08/11 16:45:16 INFO DAGScheduler: waiting: Set(ResultStage 13)
18/08/11 16:45:16 INFO DAGScheduler: failed: Set()
18/08/11 16:45:16 INFO DAGScheduler: Submitting ResultStage 13 (MapPartitionsRDD[63] at collect at utils.scala:196), which has no missing parents
18/08/11 16:45:16 INFO MemoryStore: Block broadcast_20 stored as values in memory (estimated size 9.4 KB, free 393.8 KB)
18/08/11 16:45:16 INFO MemoryStore: Block broadcast_20_piece0 stored as bytes in memory (estimated size 4.6 KB, free 398.5 KB)
18/08/11 16:45:16 INFO BlockManagerInfo: Added broadcast_20_piece0 in memory on localhost:40022 (size: 4.6 KB, free: 511.0 MB)
18/08/11 16:45:16 INFO SparkContext: Created broadcast 20 from broadcast at DAGScheduler.scala:1006
18/08/11 16:45:16 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 13 (MapPartitionsRDD[63] at collect at utils.scala:196)
18/08/11 16:45:16 INFO TaskSchedulerImpl: Adding task set 13.0 with 1 tasks
18/08/11 16:45:16 INFO TaskSetManager: Starting task 0.0 in stage 13.0 (TID 16, localhost, partition 0,NODE_LOCAL, 2242 bytes)
18/08/11 16:45:16 INFO Executor: Running task 0.0 in stage 13.0 (TID 16)
18/08/11 16:45:16 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
18/08/11 16:45:16 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
18/08/11 16:45:16 INFO Executor: Finished task 0.0 in stage 13.0 (TID 16). 1830 bytes result sent to driver
18/08/11 16:45:16 INFO TaskSetManager: Finished task 0.0 in stage 13.0 (TID 16) in 18 ms on localhost (1/1)
18/08/11 16:45:16 INFO TaskSchedulerImpl: Removed TaskSet 13.0, whose tasks have all completed, from pool
18/08/11 16:45:16 INFO DAGScheduler: ResultStage 13 (collect at utils.scala:196) finished in 0.019 s
18/08/11 16:45:16 INFO DAGScheduler: Job 11 finished: collect at utils.scala:196, took 0.071947 s
18/08/11 16:45:16 INFO ParseDriver: Parsing command: SELECT *
FROM `featLib` AS `zzz1`
WHERE (0 = 1)
18/08/11 16:45:16 INFO ParseDriver: Parse Completed
18/08/11 16:45:17 INFO HiveMetaStore: 0: get_tables: db=default pat=.*
18/08/11 16:45:17 INFO audit: ugi=dan ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
18/08/11 16:45:17 INFO SparkContext: Starting job: collect at utils.scala:196
18/08/11 16:45:17 INFO DAGScheduler: Got job 12 (collect at utils.scala:196) with 1 output partitions
18/08/11 16:45:17 INFO DAGScheduler: Final stage: ResultStage 14 (collect at utils.scala:196)
18/08/11 16:45:17 INFO DAGScheduler: Parents of final stage: List()
18/08/11 16:45:17 INFO DAGScheduler: Missing parents: List()
18/08/11 16:45:17 INFO DAGScheduler: Submitting ResultStage 14 (MapPartitionsRDD[65] at collect at utils.scala:196), which has no missing parents
18/08/11 16:45:17 INFO MemoryStore: Block broadcast_21 stored as values in memory (estimated size 1968.0 B, free 400.4 KB)
18/08/11 16:45:17 INFO MemoryStore: Block broadcast_21_piece0 stored as bytes in memory (estimated size 1224.0 B, free 401.6 KB)
18/08/11 16:45:17 INFO BlockManagerInfo: Added broadcast_21_piece0 in memory on localhost:40022 (size: 1224.0 B, free: 511.0 MB)
18/08/11 16:45:17 INFO SparkContext: Created broadcast 21 from broadcast at DAGScheduler.scala:1006
18/08/11 16:45:17 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 14 (MapPartitionsRDD[65] at collect at utils.scala:196)
18/08/11 16:45:17 INFO TaskSchedulerImpl: Adding task set 14.0 with 1 tasks
18/08/11 16:45:17 INFO TaskSetManager: Starting task 0.0 in stage 14.0 (TID 17, localhost, partition 0,PROCESS_LOCAL, 2649 bytes)
18/08/11 16:45:17 INFO Executor: Running task 0.0 in stage 14.0 (TID 17)
18/08/11 16:45:17 INFO Executor: Finished task 0.0 in stage 14.0 (TID 17). 1261 bytes result sent to driver
18/08/11 16:45:17 INFO TaskSetManager: Finished task 0.0 in stage 14.0 (TID 17) in 5 ms on localhost (1/1)
18/08/11 16:45:17 INFO TaskSchedulerImpl: Removed TaskSet 14.0, whose tasks have all completed, from pool
18/08/11 16:45:17 INFO DAGScheduler: ResultStage 14 (collect at utils.scala:196) finished in 0.006 s
18/08/11 16:45:17 INFO DAGScheduler: Job 12 finished: collect at utils.scala:196, took 0.013195 s
18/08/11 16:45:28 INFO ParseDriver: Parsing command: SELECT *
FROM `featLib`
18/08/11 16:45:28 INFO ParseDriver: Parse Completed
18/08/11 16:45:28 INFO ParseDriver: Parsing command: SELECT *
FROM `sparklyr_tmp_1d1cd6dfbfc` AS `zzz2`
WHERE (0 = 1)
18/08/11 16:45:28 INFO ParseDriver: Parse Completed
18/08/11 16:45:28 INFO ParseDriver: Parsing command: SELECT *
FROM `sparklyr_tmp_1d1c75e97deb` AS `zzz3`
WHERE (0 = 1)
18/08/11 16:45:28 INFO ParseDriver: Parse Completed
18/08/11 16:45:54 INFO ParseDriver: Parsing command: SELECT *
FROM `sparklyr_tmp_1d1cd6dfbfc`
18/08/11 16:45:54 INFO ParseDriver: Parse Completed
18/08/11 16:46:48 INFO ParseDriver: Parsing command: SELECT *
FROM `sparklyr_tmp_1d1cd6dfbfc`
18/08/11 16:46:48 INFO ParseDriver: Parse Completed
18/08/11 16:46:49 INFO ParseDriver: Parsing command: SELECT *
FROM `sparklyr_tmp_1d1c75e97deb`
18/08/11 16:46:49 INFO ParseDriver: Parse Completed
18/08/11 16:48:16 INFO SparkContext: Invoking stop() from shutdown hook
18/08/11 16:48:16 INFO SparkUI: Stopped Spark web UI at http://127.0.0.1:4040
18/08/11 16:48:16 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/08/11 16:48:16 INFO MemoryStore: MemoryStore cleared
18/08/11 16:48:16 INFO BlockManager: BlockManager stopped
18/08/11 16:48:16 INFO BlockManagerMaster: BlockManagerMaster stopped
18/08/11 16:48:16 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/08/11 16:48:16 INFO SparkContext: Successfully stopped SparkContext
18/08/11 16:48:16 INFO ShutdownHookManager: Shutdown hook called
18/08/11 16:48:16 INFO ShutdownHookManager: Deleting directory /tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d
18/08/11 16:48:16 INFO ShutdownHookManager: Deleting directory /tmp/spark-0b782747-c1c8-4002-bb88-14977927e18d/httpd-28f8332a-80a5-4524-a9b4-fba214766c0a
18/08/11 16:48:16 INFO ShutdownHookManager: Deleting directory /tmp/spark-299f804a-79c6-4943-9b0d-f78828100858