*** Reading local file: /usr/local/airflow/logs/spark-test/spark_job/2020-11-02T09:30:27.579063+00:00/2.log
[2020-11-02 09:32:35,667] {taskinstance.py:655} INFO - Dependencies all met for
[2020-11-02 09:32:35,676] {taskinstance.py:655} INFO - Dependencies all met for
[2020-11-02 09:32:35,676] {taskinstance.py:866} INFO - --------------------------------------------------------------------------------
[2020-11-02 09:32:35,676] {taskinstance.py:867} INFO - Starting attempt 2 of 2
[2020-11-02 09:32:35,676] {taskinstance.py:868} INFO - --------------------------------------------------------------------------------
[2020-11-02 09:32:35,682] {taskinstance.py:887} INFO - Executing on 2020-11-02T09:30:27.579063+00:00
[2020-11-02 09:32:35,683] {standard_task_runner.py:52} INFO - Started process 3698 to run task
[2020-11-02 09:32:35,701] {dagbag.py:403} INFO - Filling up the DagBag from /usr/local/airflow/dags/spark-test.py
[2020-11-02 09:32:35,713] {logging_mixin.py:112} INFO - Running %s on host %s a74249095542
[2020-11-02 09:32:35,750] {base_hook.py:84} INFO - Using connection to: id: spark_default. Host: spark://spark, Port: 7077, Schema: None, Login: None, Password: None, extra: XXXXXXXX
[2020-11-02 09:32:35,751] {spark_submit_hook.py:305} INFO - Spark-Submit cmd: ['spark-submit', '--master', 'spark://spark:7077', '--conf', 'spark.master=spark://spark:7077', '--name', 'Spark Hello World', '--verbose', '--queue', 'root.default', '/usr/local/spark/app/hello-world.py', '/usr/local/spark/resources/data/airflow.cfg']
[2020-11-02 09:32:36,083] {spark_submit_hook.py:436} INFO - Using properties file: null
[2020-11-02 09:32:36,117] {spark_submit_hook.py:436} INFO - Parsed arguments:
[2020-11-02 09:32:36,117] {spark_submit_hook.py:436} INFO - master spark://spark:7077
[2020-11-02 09:32:36,117] {spark_submit_hook.py:436} INFO - deployMode null
[2020-11-02 09:32:36,117] {spark_submit_hook.py:436} INFO - executorMemory null
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - executorCores null
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - totalExecutorCores null
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - propertiesFile null
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - driverMemory null
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - driverCores null
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - driverExtraClassPath null
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - driverExtraLibraryPath null
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - driverExtraJavaOptions null
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - supervise false
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - queue root.default
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - numExecutors null
[2020-11-02 09:32:36,118] {spark_submit_hook.py:436} INFO - files null
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - pyFiles null
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - archives null
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - mainClass null
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - primaryResource file:/usr/local/spark/app/hello-world.py
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - name Spark Hello World
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - childArgs [/usr/local/spark/resources/data/airflow.cfg]
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - jars null
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - packages null
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - packagesExclusions null
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - repositories null
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - verbose true
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO -
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - Spark properties used, including those specified through
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO -  --conf and those from the properties file null:
[2020-11-02 09:32:36,119] {spark_submit_hook.py:436} INFO - (spark.master,spark://spark:7077)
[2020-11-02 09:32:36,120] {spark_submit_hook.py:436} INFO -
[2020-11-02 09:32:36,120] {spark_submit_hook.py:436} INFO -
[2020-11-02 09:32:36,338] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[2020-11-02 09:32:36,506] {spark_submit_hook.py:436} INFO - Main class:
[2020-11-02 09:32:36,506] {spark_submit_hook.py:436} INFO - org.apache.spark.deploy.PythonRunner
[2020-11-02 09:32:36,506] {spark_submit_hook.py:436} INFO - Arguments:
[2020-11-02 09:32:36,506] {spark_submit_hook.py:436} INFO - file:/usr/local/spark/app/hello-world.py
[2020-11-02 09:32:36,506] {spark_submit_hook.py:436} INFO - null
[2020-11-02 09:32:36,507] {spark_submit_hook.py:436} INFO - /usr/local/spark/resources/data/airflow.cfg
[2020-11-02 09:32:36,507] {spark_submit_hook.py:436} INFO - Spark config:
[2020-11-02 09:32:36,507] {spark_submit_hook.py:436} INFO - (spark.master,spark://spark:7077)
[2020-11-02 09:32:36,507] {spark_submit_hook.py:436} INFO - (spark.app.name,Spark Hello World)
[2020-11-02 09:32:36,507] {spark_submit_hook.py:436} INFO - (spark.submit.deployMode,client)
[2020-11-02 09:32:36,507] {spark_submit_hook.py:436} INFO - Classpath elements:
[2020-11-02 09:32:36,507] {spark_submit_hook.py:436} INFO -
[2020-11-02 09:32:36,507] {spark_submit_hook.py:436} INFO -
[2020-11-02 09:32:36,728] {spark_submit_hook.py:436} INFO - Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
[2020-11-02 09:32:36,729] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO SparkContext: Running Spark version 2.4.5
[2020-11-02 09:32:36,741] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO SparkContext: Submitted application: Spark Hello World
[2020-11-02 09:32:36,764] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO SecurityManager: Changing view acls to: airflow
[2020-11-02 09:32:36,764] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO SecurityManager: Changing modify acls to: airflow
[2020-11-02 09:32:36,764] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO SecurityManager: Changing view acls groups to:
[2020-11-02 09:32:36,764] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO SecurityManager: Changing modify acls groups to:
[2020-11-02 09:32:36,764] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(airflow); groups with view permissions: Set(); users with modify permissions: Set(airflow); groups with modify permissions: Set()
[2020-11-02 09:32:36,923] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO Utils: Successfully started service 'sparkDriver' on port 43947.
[2020-11-02 09:32:36,934] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO SparkEnv: Registering MapOutputTracker
[2020-11-02 09:32:36,942] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO SparkEnv: Registering BlockManagerMaster
[2020-11-02 09:32:36,943] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
[2020-11-02 09:32:36,943] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
[2020-11-02 09:32:36,948] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-d42f5ac6-059a-4934-8fb8-d913a8798426
[2020-11-02 09:32:36,955] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
[2020-11-02 09:32:36,962] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:36 INFO SparkEnv: Registering OutputCommitCoordinator
[2020-11-02 09:32:37,043] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:37 INFO Utils: Successfully started service 'SparkUI' on port 4040.
[2020-11-02 09:32:37,066] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:37 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://a74249095542:4040
[2020-11-02 09:32:37,113] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:37 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://spark:7077...
[2020-11-02 09:32:37,142] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:37 INFO TransportClientFactory: Successfully created connection to spark/172.19.0.3:7077 after 17 ms (0 ms spent in bootstraps)
[2020-11-02 09:32:57,115] {spark_submit_hook.py:436} INFO - 20/11/02 09:32:57 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://spark:7077...
[2020-11-02 09:33:17,114] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:17 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://spark:7077...
[2020-11-02 09:33:37,116] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
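Up to this point the log shows a normal driver start-up followed by three unanswered registration attempts. For context, the Spark-Submit cmd above is assembled by a SparkSubmitOperator task from its arguments plus the spark_default connection (host spark://spark, port 7077; '--queue root.default' is taken from the connection's extra field). Below is a minimal sketch, not the actual DAG file, of the kind of task definition that produces this exact command: the dag_id, task_id, connection id, and paths are taken from the log, while the start date and schedule are assumptions.

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator

with DAG(
    dag_id="spark-test",  # matches /usr/local/airflow/dags/spark-test.py above
    start_date=datetime(2020, 11, 1),  # assumption
    schedule_interval=None,  # assumption
) as dag:
    spark_job = SparkSubmitOperator(
        task_id="spark_job",
        application="/usr/local/spark/app/hello-world.py",
        application_args=["/usr/local/spark/resources/data/airflow.cfg"],
        conn_id="spark_default",  # Host: spark://spark, Port: 7077 per the log
        conf={"spark.master": "spark://spark:7077"},
        name="Spark Hello World",
        verbose=True,
        # '--queue root.default' is injected from the connection's extra,
        # not set here
    )

The log then continues with the driver tearing itself down: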
[2020-11-02 09:33:37,116] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
[2020-11-02 09:33:37,119] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42453.
[2020-11-02 09:33:37,119] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO NettyBlockTransferService: Server created on a74249095542:42453
[2020-11-02 09:33:37,120] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
[2020-11-02 09:33:37,121] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO SparkUI: Stopped Spark web UI at http://a74249095542:4040
[2020-11-02 09:33:37,122] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO StandaloneSchedulerBackend: Shutting down all executors
[2020-11-02 09:33:37,124] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
[2020-11-02 09:33:37,126] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
[2020-11-02 09:33:37,128] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
[2020-11-02 09:33:37,136] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, a74249095542, 42453, None)
[2020-11-02 09:33:37,138] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO BlockManagerMasterEndpoint: Registering block manager a74249095542:42453 with 366.3 MB RAM, BlockManagerId(driver, a74249095542, 42453, None)
[2020-11-02 09:33:37,138] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO MemoryStore: MemoryStore cleared
[2020-11-02 09:33:37,139] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO BlockManager: BlockManager stopped
[2020-11-02 09:33:37,139] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, a74249095542, 42453, None)
[2020-11-02 09:33:37,139] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, a74249095542, 42453, None)
[2020-11-02 09:33:37,141] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO BlockManagerMaster: BlockManagerMaster stopped
[2020-11-02 09:33:37,143] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
[2020-11-02 09:33:37,146] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO SparkContext: Successfully stopped SparkContext
[2020-11-02 09:33:37,206] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 ERROR SparkContext: Error initializing SparkContext.
[2020-11-02 09:33:37,206] {spark_submit_hook.py:436} INFO - java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
[2020-11-02 09:33:37,206] {spark_submit_hook.py:436} INFO - at scala.Predef$.require(Predef.scala:224)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at py4j.Gateway.invoke(Gateway.java:238)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at py4j.GatewayConnection.run(GatewayConnection.java:238)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - at java.lang.Thread.run(Thread.java:748)
[2020-11-02 09:33:37,207] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO SparkContext: SparkContext already stopped.
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO - Traceback (most recent call last):
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -   File "/usr/local/spark/app/hello-world.py", line 14, in <module>
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -     sc = SparkContext()
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -   File "/usr/local/lib/python3.6/site-packages/pyspark/context.py", line 136, in __init__
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -     conf, jsc, profiler_cls)
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -   File "/usr/local/lib/python3.6/site-packages/pyspark/context.py", line 198, in _do_init
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -     self._jsc = jsc or self._initialize_context(self._conf._jconf)
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -   File "/usr/local/lib/python3.6/site-packages/pyspark/context.py", line 315, in _initialize_context
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -     return self._jvm.JavaSparkContext(jconf)
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -   File "/usr/local/lib/python3.6/site-packages/py4j/java_gateway.py", line 1569, in __call__
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -     answer, self._gateway_client, None, self._fqn)
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -   File "/usr/local/lib/python3.6/site-packages/py4j/protocol.py", line 328, in get_return_value
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO -     format(target_id, ".", name), value)
[2020-11-02 09:33:37,208] {spark_submit_hook.py:436} INFO - py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
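The Python traceback above points at line 14 of /usr/local/spark/app/hello-world.py, where a bare SparkContext() is constructed at module level; the constructor fails because spark-submit never managed to register with the master. Only the SparkContext() call and the single file argument are visible in the log, so the following reconstruction of the script is a hypothetical sketch, not the actual file.

import sys

from pyspark import SparkContext

if __name__ == "__main__":
    # Per the traceback this is line 14 in the real file; it raises
    # Py4JJavaError once registration with the standalone master fails.
    sc = SparkContext()
    # Assumed body: count the lines of the file passed via childArgs
    # (/usr/local/spark/resources/data/airflow.cfg in this run).
    lines = sc.textFile(sys.argv[1])
    print("Lines in %s: %d" % (sys.argv[1], lines.count()))
    sc.stop()

The log then repeats the wrapped Java exception: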
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - : java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at scala.Predef$.require(Predef.scala:224)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at py4j.Gateway.invoke(Gateway.java:238)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
[2020-11-02 09:33:37,209] {spark_submit_hook.py:436} INFO - at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
[2020-11-02 09:33:37,210] {spark_submit_hook.py:436} INFO - at py4j.GatewayConnection.run(GatewayConnection.java:238)
[2020-11-02 09:33:37,210] {spark_submit_hook.py:436} INFO - at java.lang.Thread.run(Thread.java:748)
[2020-11-02 09:33:37,210] {spark_submit_hook.py:436} INFO -
[2020-11-02 09:33:37,227] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO ShutdownHookManager: Shutdown hook called
[2020-11-02 09:33:37,227] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-9d2539c1-4bb4-4166-8f50-b52d78e56ed3
[2020-11-02 09:33:37,229] {spark_submit_hook.py:436} INFO - 20/11/02 09:33:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-8a95a18e-82d9-4dc2-828a-2e938b66eb05
[2020-11-02 09:33:37,559] {taskinstance.py:1088} ERROR - Cannot execute: ['spark-submit', '--master', 'spark://spark:7077', '--conf', 'spark.master=spark://spark:7077', '--name', 'Spark Hello World', '--verbose', '--queue', 'root.default', '/usr/local/spark/app/hello-world.py', '/usr/local/spark/resources/data/airflow.cfg']. Error code is: 1.
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 955, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python3.6/site-packages/airflow/contrib/operators/spark_submit_operator.py", line 181, in execute
    self._hook.submit(self._application)
  File "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/spark_submit_hook.py", line 362, in submit
    spark_submit_cmd, returncode
airflow.exceptions.AirflowException: Cannot execute: ['spark-submit', '--master', 'spark://spark:7077', '--conf', 'spark.master=spark://spark:7077', '--name', 'Spark Hello World', '--verbose', '--queue', 'root.default', '/usr/local/spark/app/hello-world.py', '/usr/local/spark/resources/data/airflow.cfg']. Error code is: 1.
[2020-11-02 09:33:37,560] {taskinstance.py:1117} INFO - All retries failed; marking task as FAILED
[2020-11-02 09:33:40,834] {local_task_job.py:103} INFO - Task exited with return code 1
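Reading the failure as a whole: TCP connectivity to the master is fine ("Successfully created connection to spark/172.19.0.3:7077"), but three registration attempts at 20-second intervals go unanswered, so the driver gives up with "All masters are unresponsive!". The later "Can only call getServletHandlers on a running MetricsSystem" exception is a secondary symptom of the aborted SparkContext start-up, not an independent bug. In a Docker setup this pattern most often means the PySpark version in the Airflow container (Spark 2.4.5 here) does not match the Spark version of the master image, or the master cannot reach the driver back on its ephemeral ports. A hedged first check, run from inside the Airflow container (hostname, port, and version are the ones from this log, not universal):

import socket

import pyspark

# 1) Reachability: the log above already shows this succeeding, so a failure
#    here would instead point at Docker networking (networks, service names).
with socket.create_connection(("spark", 7077), timeout=5):
    print("TCP to spark:7077 is open")

# 2) Version skew is the usual culprit when the socket opens but the master
#    never answers: the driver here runs Spark 2.4.5, so the master container
#    should run 2.4.5 as well (check its web UI on :8080 or run
#    `spark-submit --version` inside that container).
print("Driver-side PySpark version:", pyspark.__version__)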