[SPARK-32996][Web-UI][3.0] Handle empty ExecutorMetrics in ExecutorMetricsJsonSerializer #29914
Conversation
I just noticed that you put ExecutorSummarySuite in core/src/test/java? Can you move it to core/src/test/scala?
Test build #129283 has finished for PR 29914 at commit
The change looks good. Pending moving the test to the correct path.
Force-pushed from 9999bcb to f7653f8
@shrutig Thanks for doing that. Can you also open a follow-up PR to move the test to core/src/test/scala in the master branch too?
Test build #129301 has finished for PR 29914 at commit
Force-pushed from f7653f8 to 6f4f852
pending test.
Test build #129317 has finished for PR 29914 at commit
retest this please
Kubernetes integration test starting
Kubernetes integration test status failure
Test build #129322 has finished for PR 29914 at commit
Thanks! Merging to branch-3.0.
[SPARK-32996][Web-UI][3.0] Handle empty ExecutorMetrics in ExecutorMetricsJsonSerializer

### What changes were proposed in this pull request?

This is a backport PR for branch-3.0. This change was raised to the `master` branch in https://github.com/apache/spark/pull/29872.

When `peakMemoryMetrics` in `ExecutorSummary` is `Option.empty`, the `ExecutorMetricsJsonSerializer#serialize` method does not execute the `jsonGenerator.writeObject` method. The JSON is then generated with the `peakMemoryMetrics` key added to the serialized string but no corresponding value, so an error is thrown when it is the next key's (`attributes`) turn to be added to the JSON:

`com.fasterxml.jackson.core.JsonGenerationException: Can not write a field name, expecting a value`

### Why are the changes needed?

At the start of a Spark job, if `peakMemoryMetrics` is `Option.empty`, a `com.fasterxml.jackson.core.JsonGenerationException` is thrown when we navigate to the Executors tab in the Spark UI.

Complete stacktrace:

> com.fasterxml.jackson.core.JsonGenerationException: Can not write a field name, expecting a value
> at com.fasterxml.jackson.core.JsonGenerator._reportError(JsonGenerator.java:2080)
> at com.fasterxml.jackson.core.json.WriterBasedJsonGenerator.writeFieldName(WriterBasedJsonGenerator.java:161)
> at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:725)
> at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:721)
> at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:166)
> at com.fasterxml.jackson.databind.ser.std.CollectionSerializer.serializeContents(CollectionSerializer.java:145)
> at com.fasterxml.jackson.module.scala.ser.IterableSerializer.serializeContents(IterableSerializerModule.scala:26)
> at com.fasterxml.jackson.module.scala.ser.IterableSerializer.serializeContents$(IterableSerializerModule.scala:25)
> at com.fasterxml.jackson.module.scala.ser.UnresolvedIterableSerializer.serializeContents(IterableSerializerModule.scala:54)
> at com.fasterxml.jackson.module.scala.ser.UnresolvedIterableSerializer.serializeContents(IterableSerializerModule.scala:54)
> at com.fasterxml.jackson.databind.ser.std.AsArraySerializerBase.serialize(AsArraySerializerBase.java:250)
> at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._serialize(DefaultSerializerProvider.java:480)
> at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:319)
> at com.fasterxml.jackson.databind.ObjectMapper._configAndWriteValue(ObjectMapper.java:4094)
> at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:3404)
> at org.apache.spark.ui.exec.ExecutorsPage.allExecutorsDataScript$1(ExecutorsTab.scala:64)
> at org.apache.spark.ui.exec.ExecutorsPage.render(ExecutorsTab.scala:76)
> at org.apache.spark.ui.WebUI.$anonfun$attachPage$1(WebUI.scala:89)
> at org.apache.spark.ui.JettyUtils$$anon$1.doGet(JettyUtils.scala:80)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
> at org.sparkproject.jetty.servlet.ServletHolder.handle(ServletHolder.java:873)
> at org.sparkproject.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1623)
> at org.apache.spark.ui.HttpSecurityFilter.doFilter(HttpSecurityFilter.scala:95)
> at org.sparkproject.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1610)
> at org.sparkproject.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
> at org.sparkproject.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
> at org.sparkproject.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1345)
> at org.sparkproject.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
> at org.sparkproject.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
> at org.sparkproject.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
> at org.sparkproject.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
> at org.sparkproject.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
> at org.sparkproject.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:753)
> at org.sparkproject.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
> at org.sparkproject.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
> at org.sparkproject.jetty.server.Server.handle(Server.java:505)
> at org.sparkproject.jetty.server.HttpChannel.handle(HttpChannel.java:370)
> at org.sparkproject.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
> at org.sparkproject.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
> at org.sparkproject.jetty.io.FillInterest.fillable(FillInterest.java:103)
> at org.sparkproject.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
> at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
> at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
> at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
> at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
> at org.sparkproject.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
> at org.sparkproject.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:698)
> at org.sparkproject.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:804)
> at java.base/java.lang.Thread.run(Thread.java:834)

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Unit test

Closes #29914 from shrutig/SPARK-32996-3.0.

Authored-by: Shruti Gumma <shruti_gumma@apple.com>
Signed-off-by: Liang-Chi Hsieh <viirya@gmail.com>
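For readers unfamiliar with Jackson's serializer contract, the failure mode and the shape of the fix can be sketched as follows. This is a minimal illustration, not the exact Spark code: the metrics value is modeled as `Option[Map[String, Long]]` rather than `ExecutorMetrics`, and `PeakMemoryMetricsSerializer` is a hypothetical name.

```scala
import com.fasterxml.jackson.core.JsonGenerator
import com.fasterxml.jackson.databind.{JsonSerializer, SerializerProvider}

// Hypothetical, simplified analogue of ExecutorMetricsJsonSerializer.
class PeakMemoryMetricsSerializer extends JsonSerializer[Option[Map[String, Long]]] {

  override def serialize(
      metrics: Option[Map[String, Long]],
      jsonGenerator: JsonGenerator,
      serializerProvider: SerializerProvider): Unit = {
    // The buggy shape was effectively:
    //   metrics.foreach(m => jsonGenerator.writeObject(m))
    // For Option.empty that writes nothing, but Jackson has already emitted
    // the field name, so the next writeFieldName() call fails with
    // "Can not write a field name, expecting a value".
    if (metrics.isEmpty) {
      jsonGenerator.writeNull() // always pair the field name with a value
    } else {
      metrics.foreach(m => jsonGenerator.writeObject(m))
    }
  }

  // Reporting emptiness lets inclusion rules such as
  // JsonInclude.Include.NON_EMPTY omit the field name altogether.
  override def isEmpty(
      provider: SerializerProvider,
      value: Option[Map[String, Long]]): Boolean = value.isEmpty
}
```

The key invariant is that once Jackson writes a field name for the property, `serialize` must emit some value; skipping the write leaves the generator in an inconsistent state.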
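A minimal analogue of the unit test could serialize an object whose optional metrics field is empty and check that valid JSON is produced. `Summary` and its fields are made up for illustration (the real test exercises `ExecutorSummary`), and the sketch reuses the hypothetical `PeakMemoryMetricsSerializer` above.

```scala
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.databind.annotation.JsonSerialize
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Hypothetical stand-in for ExecutorSummary; the field order mirrors the
// real class, where attributes follows peakMemoryMetrics.
case class Summary(
    id: String,
    @JsonSerialize(using = classOf[PeakMemoryMetricsSerializer])
    peakMemoryMetrics: Option[Map[String, Long]],
    attributes: Map[String, String])

object EmptyMetricsSerializationCheck {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper().registerModule(DefaultScalaModule)

    // Before the fix this threw JsonGenerationException, because
    // "peakMemoryMetrics" was left as a dangling field name when the
    // serializer wrote nothing and "attributes" came next.
    val json = mapper.writeValueAsString(Summary("driver", None, Map("host" -> "node1")))

    assert(json.contains("\"peakMemoryMetrics\":null"))
    println(json)
  }
}
```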