[KYUUBI #1816] Implement KyuubiHistoryServerPlugin #1820
Conversation
Codecov Report
@@ Coverage Diff @@
## master #1820 +/- ##
============================================
+ Coverage 59.46% 59.56% +0.09%
+ Complexity 272 268 -4
============================================
Files 277 284 +7
Lines 13853 14027 +174
Branches 1776 1784 +8
============================================
+ Hits 8238 8355 +117
- Misses 4911 4959 +48
- Partials 704 713 +9
Continue to review full report at Codecov.
Hi @yaooqinn, this implementation seems feasible. Can you help me review it?

thanks, @wForget for the great job, will take a look tomorrow

also cc @turboFei @ulysses-you
  def endTime(): Long = engine.map(_ => System.currentTimeMillis()).getOrElse {
    sparkUI
      .map(ui => ui.store.applicationInfo().attempts.head.endTime.getTime)
Why do the start/end time always use the first attempt?
I see that the Spark source code also uses attempts.head to get the attempt.
like: https://github.com/apache/spark/blob/2a9416abd5c8d83901c09a08433bf91e649dc43f/core/src/main/scala/org/apache/spark/ui/jobs/AllJobsPage.scala#L277
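For context, a minimal hedged sketch of the pattern being discussed; the wrapper class, the engine/sparkUI parameters, the package placement, and the -1L default are illustrative assumptions, while the attempts.head fallback itself comes from the diff above:

```scala
package org.apache.spark.ui // assumed placement so the sketch can see Spark's private[spark] UI types

// Hedged sketch: with a live engine, "now" is used as the end time; in the history
// server there is no live engine, so fall back to the end time of the first attempt
// recorded in the UI's status store.
class EngineTimeSketch(engine: Option[AnyRef], sparkUI: Option[SparkUI]) {
  def endTime(): Long = engine.map(_ => System.currentTimeMillis()).getOrElse {
    sparkUI
      .map(ui => ui.store.applicationInfo().attempts.head.endTime.getTime)
      .getOrElse(-1L) // default value is an assumption, not the merged code
  }
}
```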
   * The number of SQL client sessions kept in the Kyuubi Query Engine web UI.
   */
  private val retainedSessions: Int = {
    sparkConf.getOption(SQLConf.THRIFTSERVER_UI_SESSION_LIMIT.key) match {
shall we simply use kyuubiConf?
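A minimal hedged sketch of what that suggestion could look like; the ENGINE_UI_SESSION_LIMIT entry name is an assumption about KyuubiConf, used here only for illustration:

```scala
import org.apache.kyuubi.config.KyuubiConf

// Hedged sketch: read the retained-session limit directly from KyuubiConf instead of
// resolving a Thrift-server key through sparkConf. The config entry name is assumed.
object RetainedSessionsSketch {
  def retainedSessions(kyuubiConf: KyuubiConf): Int =
    kyuubiConf.get(KyuubiConf.ENGINE_UI_SESSION_LIMIT)
}
```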
class KyuubiHistoryServerPlugin extends AppHistoryServerPlugin {

  def kyuubiConf: KyuubiConf = KyuubiConf()
private?
shall we also load kyuubi configurations from sparkConf like SparkSQLEngine?
Sounds good, I will load the kyuubi configurations from sparkConf and only use kyuubiConf.
sorry for the late reply, plans were disrupted by omicron in HZ.
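A hedged sketch of that plan, loosely modelled on how SparkSQLEngine derives its KyuubiConf from the Spark conf; the spark.kyuubi. prefix handling and the helper name are assumptions for illustration:

```scala
import org.apache.spark.SparkConf

import org.apache.kyuubi.config.KyuubiConf

// Hedged sketch: copy spark.kyuubi.*-prefixed entries from the Spark conf into a
// fresh KyuubiConf so the plugin only consults kyuubiConf afterwards.
object KyuubiConfFromSparkSketch {
  def buildKyuubiConf(sparkConf: SparkConf): KyuubiConf = {
    val kyuubiConf = KyuubiConf()
    // getAllWithPrefix strips "spark.kyuubi.", leaving e.g. "engine.ui.retainedSessions"
    sparkConf.getAllWithPrefix("spark.kyuubi.").foreach { case (k, v) =>
      kyuubiConf.set(s"kyuubi.$k", v)
    }
    kyuubiConf
  }
}
```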
import org.apache.kyuubi.config.KyuubiConf
import org.apache.kyuubi.engine.spark.events.EngineEventsStore

class KyuubiHistoryServerPlugin extends AppHistoryServerPlugin {
Can we add some comments about how to use this class? It should work together with the history server.
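To make the request concrete, here is a hedged sketch of what such a usage comment could say, written on a stand-in class; the package placement, the wording, and the ServiceLoader/classpath deployment details are assumptions rather than the merged documentation:

```scala
package org.apache.spark.deploy.history // assumed placement so the sketch can see Spark's private[spark] SPI

import org.apache.spark.SparkConf
import org.apache.spark.scheduler.SparkListener
import org.apache.spark.status.ElementTrackingStore
import org.apache.spark.ui.SparkUI

/**
 * Sketch of a class-level usage note (wording is illustrative, not the merged text):
 * the Spark History Server discovers AppHistoryServerPlugin implementations through
 * Java's ServiceLoader, so the Kyuubi engine tab appears for replayed applications
 * once the jar carrying the plugin (and its META-INF/services registration) is on
 * the history server's classpath and the history server is restarted.
 */
class UsageCommentSketchPlugin extends AppHistoryServerPlugin {
  override def createListeners(conf: SparkConf, store: ElementTrackingStore): Seq[SparkListener] =
    Seq.empty // listeners that replay Kyuubi engine events would be returned here
  override def setupUI(ui: SparkUI): Unit = () // the Kyuubi engine tab would be attached here
}
```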
  }

  override def setupUI(ui: SparkUI): Unit = {
    addKyuubiConfFromSpark(ui.conf)
it looks like we should build a kyuubi conf for every Spark app UI instead of using a global kyuubi conf.
> it looks like we should build a kyuubi conf for every Spark app UI instead of using a global kyuubi conf.

These sparkConf seem to be the same object; should we create multiple kyuubiConf?
If we need to get the configuration of different Spark apps, we can get it using org.apache.spark.status.AppStatusStore#environmentInfo. Do we need to do that?
> These sparkConf seem to be the same object; should we create multiple kyuubiConf?

I just checked the Spark built-in plugins and do not see a global value. The code is confusing if the input conf is always the same. It's better to create multiple kyuubiConf.

> If we need to get the configuration of different Spark apps, we can get it using org.apache.spark.status.AppStatusStore#environmentInfo. Do we need to do that?

It seems trivial; we don't need to update it.
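For completeness, a hedged sketch of that per-app alternative (ultimately judged unnecessary above), rebuilding a KyuubiConf from the Spark properties the replayed application itself recorded; the package placement, the spark.kyuubi. prefix convention, and the helper name are illustrative assumptions:

```scala
package org.apache.spark.kyuubi // assumed placement so the sketch can see Spark's private[spark] status store

import org.apache.spark.status.AppStatusStore

import org.apache.kyuubi.config.KyuubiConf

// Hedged sketch: derive a per-application KyuubiConf from the properties logged by
// the app itself, via AppStatusStore#environmentInfo.
object PerAppKyuubiConfSketch {
  def kyuubiConfFromAppEnv(store: AppStatusStore): KyuubiConf = {
    val kyuubiConf = KyuubiConf()
    store.environmentInfo().sparkProperties
      .filter { case (k, _) => k.startsWith("spark.kyuubi.") }
      .foreach { case (k, v) => kyuubiConf.set(k.stripPrefix("spark."), v) }
    kyuubiConf
  }
}
```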
      </ul>
  }

  private def generateBasicStats(): Seq[Node] =
    if (parent.engine != null) {
Is the type of parent.engine Option?
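A tiny hedged sketch of the Option-friendly check being suggested; the object wrapper, the engine parameter, and the rendered body are illustrative:

```scala
import scala.xml.Node

// Hedged sketch: when engine is an Option, match on it (or use isDefined)
// instead of comparing against null.
object BasicStatsSketch {
  def generateBasicStats(engine: Option[AnyRef]): Seq[Node] =
    engine match {
      case Some(_) => <ul><li>Engine is started</li></ul> // real stats would be rendered here
      case None    => Seq.empty
    }
}
```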
thanks, merging to master

Hi, do we have documentation for how to use this extension yet?

There is usage in the comment at Lines 29 to 34 in c8e6457.

Just copy
Why are the changes needed?
Implement KyuubiHistoryServerPlugin. #1816
How was this patch tested?
Add some test cases that check the changes thoroughly including negative and positive cases if possible
Add screenshots for manual tests if appropriate


SparkUI:
SparkHistoryServer:

