HDDS-3365. Ensure OzoneConfiguration is initialized in OzoneClientFactory#getOzoneClient #798
Conversation
Looks like the it-client run times out even though all the tests passed. Some of the most time-consuming tests are highlighted below. Have we considered disabling them until they are fixed? cc: @elek
+1 LGTM.
Thank you @xiaoyuyao for the contribution.
What's changed?
Change to use OzoneConfiguration when checking the OM HA related configurations, since some of these APIs may be invoked with a plain Hadoop Configuration.
What is the link to the Apache JIRA?
https://issues.apache.org/jira/browse/HDDS-3365
How was this patch tested?
Tested with a Spark wordcount job. Before the patch, the RM failed because the OM configuration was not available from the Hadoop Configuration.
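The shape of the fix can be sketched as below. This is a hedged illustration rather than the exact patch, and it assumes Ozone's `OzoneConfiguration.of(Configuration)` helper, which copies the properties of an arbitrary Hadoop `Configuration` into an `OzoneConfiguration`; the helper method name `ensureOzoneConfiguration` is hypothetical. It is not runnable standalone since it depends on the Ozone and Hadoop jars.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdds.conf.OzoneConfiguration;

public final class OzoneClientFactorySketch {

  // Callers such as the YARN RM renewing an OzoneToken may hand us a
  // plain Hadoop Configuration. Normalizing it to OzoneConfiguration
  // ensures ozone-site.xml/ozone-default.xml are loaded and the OM HA
  // keys (service ids, node lists) resolve when getOzoneClient runs.
  static OzoneConfiguration ensureOzoneConfiguration(Configuration conf) {
    if (conf instanceof OzoneConfiguration) {
      return (OzoneConfiguration) conf;   // already the right type
    }
    // Copy all properties from the given Configuration into a fresh
    // OzoneConfiguration (hypothetical use of OzoneConfiguration.of).
    return OzoneConfiguration.of(conf);
  }
}
```

The design point is that the conversion happens once, at the entry point, so every downstream OM HA configuration check sees an OzoneConfiguration regardless of what the caller constructed.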
Before
20/04/09 00:40:02 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.hadoop.yarn.exceptions.YarnException: Failed to submit application_1586392793624_0001 to YARN : Failed to renew token: Kind: OzoneToken, Service: 172.27.129.0:9862,172.27.11.66:9862,172.27.20.1:9862, Ident: (OzoneToken owner=hrt_qa@ROOT.HWX.SITE, renewer=yarn, realUser=, issueDate=1586392799156, maxDate=1586997599156, sequenceNumber=65, masterKeyId=1, strToSign=null, signature=null, awsAccessKeyId=null, omServiceId=ozone1)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.submitApplication(YarnClientImpl.java:322)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:185)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:60)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:185)
at org.apache.spark.SparkContext.(SparkContext.scala:505)
at org.apache.spark.api.java.JavaSparkContext.(JavaSparkContext.scala:58)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:238)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:834)
After
...
20/04/09 04:09:25 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 6.0 (TID 16) in 94 ms on quasar-kekkuz-8.quasar-kekkuz.root.hwx.site (executor 1) (2/4)
20/04/09 04:09:25 INFO scheduler.TaskSetManager: Finished task 2.0 in stage 6.0 (TID 18) in 64 ms on quasar-kekkuz-6.quasar-kekkuz.root.hwx.site (executor 2) (3/4)
20/04/09 04:09:25 INFO scheduler.TaskSetManager: Finished task 3.0 in stage 6.0 (TID 19) in 66 ms on quasar-kekkuz-8.quasar-kekkuz.root.hwx.site (executor 1) (4/4)
20/04/09 04:09:25 INFO cluster.YarnScheduler: Removed TaskSet 6.0, whose tasks have all completed, from pool
20/04/09 04:09:25 INFO scheduler.DAGScheduler: ResultStage 6 (collect at /tmp/spark_script.py:40) finished in 0.172 s
20/04/09 04:09:25 INFO scheduler.DAGScheduler: Job 2 finished: collect at /tmp/spark_script.py:40, took 0.781710 s
**** ,357
**** felis,197
**** dictum,142
**** semper,160
**** justo,217
**** purus,215
**** ante,247
...