[SPARK-15857] Add caller context in Spark: invoke YARN/HDFS API to set up caller context #14312
Closed
weiqingy wants to merge 1 commit into apache:master from weiqingy:master
Conversation
Can one of the admins verify this patch?
      val callerContext = Utils.classForName("org.apache.hadoop.ipc.CallerContext")
      callerContext.getMethod("setCurrent", callerContext).invoke(null, ret)
    }
    catch {
Contributor
nit: catch should follow the above }.
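For illustration, the placement being asked for looks roughly like this (only a sketch of the brace style; the body is a placeholder, not this patch's code):

```scala
import scala.util.control.NonFatal

object BracePlacementSketch {
  def setContextIfAvailable(): Unit = {
    try {
      // Placeholder body: reflectively look up the Hadoop class.
      Class.forName("org.apache.hadoop.ipc.CallerContext")
    } catch { // `catch` sits on the same line as the closing brace of the try block
      case NonFatal(_) => // ignore when the class is not on the classpath
    }
  }
}
```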
Contributor
This may not be so useful. I think we could get the app name from YARN in many different ways; simply printing one line of log to the RM is not that useful.
Contributor
Author
Thanks for the feedback, Jerry. I am going to update the patch.
What changes were proposed in this pull request?
Pass 'jobId' to Task.
Add a new function 'setCallerContext' in Utils. The 'setCallerContext' function will call the APIs of 'org.apache.hadoop.ipc.CallerContext' to set up Spark caller contexts, which will be written into the HDFS hdfs-audit.log or the YARN resource manager log (a sketch of what such a helper could look like is shown after this description).
The 'setCallerContext' function will be called in the YARN Client, ApplicationMaster, and Task classes.
The Spark caller context written into the HDFS log will be "JobID_stageID_stageAttemptId_taskID_attemptNumber on Spark", and the Spark caller context written into the YARN log will be "{spark.app.name} running on Spark".
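For readers unfamiliar with the Hadoop side, below is a minimal sketch of what such a reflection-based helper could look like. It is only an illustration (the object name is invented for the example), it assumes the Hadoop version on the classpath ships org.apache.hadoop.ipc.CallerContext and its nested Builder class (Hadoop 2.8+), and it is not the exact code in this patch:

```scala
import scala.util.control.NonFatal

object CallerContextSketch {
  // Sets the Hadoop caller context for the current thread via reflection, so the
  // code still compiles and runs against Hadoop versions that lack the API.
  def setCallerContext(context: String): Unit = {
    try {
      // CallerContext instances are created through the nested Builder class.
      val builderClass = Class.forName("org.apache.hadoop.ipc.CallerContext$Builder")
      val builder = builderClass.getConstructor(classOf[String]).newInstance(context)
      val hadoopContext = builderClass.getMethod("build").invoke(builder)

      // CallerContext.setCurrent(...) is static, hence the null receiver. Once set,
      // HDFS and YARN include the context in their audit log records.
      val contextClass = Class.forName("org.apache.hadoop.ipc.CallerContext")
      contextClass.getMethod("setCurrent", contextClass).invoke(null, hadoopContext)
    } catch {
      case NonFatal(_) =>
        // Class or method missing on older Hadoop: skip silently, no context is recorded.
    }
  }
}
```

With a helper like this, a call such as CallerContextSketch.setCallerContext("SparkKMeans running on Spark") made before the application is submitted would show up as the CALLERCONTEXT field of the resource manager audit record quoted in the test section below.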
How was this patch tested?
Manual tests against some Spark applications in YARN client mode and cluster mode, checking whether the Spark caller contexts were written into the HDFS hdfs-audit.log and the YARN resource manager log successfully.
For example, run SparkKMeans on Spark:
In the YARN resource manager log, there will be a record with the Spark caller context.
...
2016-07-21 13:36:26,318 INFO org.apache.hadoop.yarn.server.resourcemanager.RMAuditLogger: USER=wyang IP=127.0.0.1 OPERATION=Submit Application Request TARGET=ClientRMService RESULT=SUCCESS APPID=application_1469125587135_0004 CALLERCONTEXT=SparkKMeans running on Spark
...
In the HDFS hdfs-audit.log, there will be records with Spark caller contexts (a small sketch of how the per-task string could be assembled follows the excerpt).
...
2016-07-21 13:38:30,799 INFO FSNamesystem.audit: allowed=true ugi=wyang (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/lr_big.txt/_spark_metadata dst=null perm=null proto=rpc callerContext=SparkKMeans running on Spark
...
2016-07-21 13:39:35,584 INFO FSNamesystem.audit: allowed=true ugi=wyang (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/lr_big.txt dst=null perm=null proto=rpc callerContext=JobId_0_StageID_0_stageAttemptId_0_taskID_1_attemptNumber_0 on Spark
...
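For reference, a small sketch of how a per-task string shaped like the callerContext field in the open record above might be assembled (the object, method, and parameter names are only illustrative, not this patch's code):

```scala
object TaskContextStringSketch {
  // Builds a string of the form seen in the hdfs-audit.log record above.
  def taskCallerContext(
      jobId: Int,
      stageId: Int,
      stageAttemptId: Int,
      taskId: Long,
      attemptNumber: Int): String = {
    s"JobId_${jobId}_StageID_${stageId}_stageAttemptId_${stageAttemptId}" +
      s"_taskID_${taskId}_attemptNumber_${attemptNumber} on Spark"
  }
}
```

For example, taskCallerContext(0, 0, 0, 1, 0) yields "JobId_0_StageID_0_stageAttemptId_0_taskID_1_attemptNumber_0 on Spark", matching the open record above.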
If the Hadoop version on which Spark runs does not have the CallerContext API, there will be no Spark caller context information in those logs.