2020-07-15 10:41:44,762 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-07-15 10:41:49,484 INFO yarn.Client: Requesting a new application from cluster with 2 NodeManagers
2020-07-15 10:41:49,769 INFO conf.Configuration: resource-types.xml not found
2020-07-15 10:41:49,770 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2020-07-15 10:41:49,858 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2048 MB per container)
2020-07-15 10:41:49,858 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
2020-07-15 10:41:49,859 INFO yarn.Client: Setting up container launch context for our AM
2020-07-15 10:41:49,859 INFO yarn.Client: Setting up the launch environment for our AM container
2020-07-15 10:41:49,904 INFO yarn.Client: Preparing resources for our AM container
2020-07-15 10:41:54,461 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
2020-07-15 10:41:59,881 INFO yarn.Client: Uploading resource file:/tmp/spark-03670c1e-1517-499a-8b71-1a13396a2a3f/__spark_libs__4190781879524216202.zip -> hdfs://ns/user/root/.sparkStaging/application_1594629977068_0016/__spark_libs__4190781879524216202.zip
2020-07-15 10:42:00,448 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-15 10:42:03,778 INFO yarn.Client: Uploading resource file:/root/wormhole-0.6.3/app/wormhole-ums_1.3-sparkx_2.2-0.6.3-jar-with-dependencies.jar -> hdfs://ns/user/root/.sparkStaging/application_1594629977068_0016/wormhole-ums_1.3-sparkx_2.2-0.6.3-jar-with-dependencies.jar
2020-07-15 10:42:04,111 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-15 10:42:07,460 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-15 10:42:09,820 INFO yarn.Client: Uploading resource file:/root/wormhole-0.6.3/conf/sparkx.log4j.properties -> hdfs://ns/user/root/.sparkStaging/application_1594629977068_0016/sparkx.log4j.properties
2020-07-15 10:42:09,880 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-15 10:42:10,447 INFO yarn.Client: Uploading resource file:/tmp/spark-03670c1e-1517-499a-8b71-1a13396a2a3f/__spark_conf__6998132505409782354.zip -> hdfs://ns/user/root/.sparkStaging/application_1594629977068_0016/__spark_conf__.zip
2020-07-15 10:42:10,500 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-15 10:42:11,129 INFO spark.SecurityManager: Changing view acls to: root
2020-07-15 10:42:11,131 INFO spark.SecurityManager: Changing modify acls to: root
2020-07-15 10:42:11,139 INFO spark.SecurityManager: Changing view acls groups to:
2020-07-15 10:42:11,141 INFO spark.SecurityManager: Changing modify acls groups to:
2020-07-15 10:42:11,143 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2020-07-15 10:42:11,194 INFO yarn.Client: Submitting application application_1594629977068_0016 to ResourceManager
2020-07-15 10:42:11,421 INFO impl.YarnClientImpl: Submitted application application_1594629977068_0016
2020-07-15 10:42:11,424 INFO yarn.Client: Application report for application_1594629977068_0016 (state: ACCEPTED)
2020-07-15 10:42:11,438 INFO yarn.Client:
client token: N/A
diagnostics: [Wednesday July 15 10:42:11 +0800 2020] Application is Activated, waiting for resources to be assigned for AM. Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:4096, vCores:2> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; Queue's capacity (absolute resource) = <memory:4096, vCores:2> ; Queue's used capacity (absolute resource) = <memory:0, vCores:0> ; Queue's max capacity (absolute resource) = <memory:4096, vCores:2> ;
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1594780931303
final status: UNDEFINED
tracking URL: http://hadoop01:8088/proxy/application_1594629977068_0016/
user: root
2020-07-15 10:42:11,517 INFO util.ShutdownHookManager: Shutdown hook called
2020-07-15 10:42:11,521 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-03670c1e-1517-499a-8b71-1a13396a2a3f
Does anyone know how to solve this problem? Thanks.
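For reference, the WARN at 10:41:54 ("Neither spark.yarn.jars nor spark.yarn.archive is set") means every submission re-uploads the Spark libraries from SPARK_HOME to the staging directory. A minimal sketch of how to avoid that, assuming the HDFS nameservice ns from the log; the archive name and the target directory /spark/jars are examples, not taken from this setup:

# Package the Spark runtime jars once (run on the submitting host)
cd $SPARK_HOME/jars && zip -q -r /tmp/spark-libs.zip .

# Publish the archive to HDFS (target path is an assumption)
hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put /tmp/spark-libs.zip /spark/jars/

# Point Spark at it in $SPARK_HOME/conf/spark-defaults.conf
spark.yarn.archive hdfs://ns/spark/jars/spark-libs.zip

This only removes the repeated library upload; it does not by itself explain the early shutdown.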
Judging from the log, Spark has already been shut down, but the status does not reflect that.
Attached: Stream log
Does anyone know how to solve this problem? Thanks.
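To see what YARN actually recorded for this application (the client log above stops at state ACCEPTED before the shutdown hook runs), a minimal check against the ResourceManager, assuming the yarn CLI is available and log aggregation is enabled on the cluster:

# Final state and diagnostics as seen by the ResourceManager
yarn application -status application_1594629977068_0016

# Aggregated container logs (available after the application finishes)
yarn logs -applicationId application_1594629977068_0016

If the status here shows RUNNING or FINISHED while the platform still reports otherwise, that would point at how the status is read back rather than at the submission itself.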