[SPARK-33363] Add prompt information related to the current task when pyspark/sparkR starts #30266
Conversation
ok to test
Test build #130684 has finished for PR 30266 at commit
Kubernetes integration test starting
Kubernetes integration test status failure
Kubernetes integration test starting
Test build #130706 has finished for PR 30266 at commit
Kubernetes integration test status failure
Test build #130716 has finished for PR 30266 at commit
Test build #130715 has finished for PR 30266 at commit
Test build #130718 has finished for PR 30266 at commit
Test build #130719 has finished for PR 30266 at commit
Kubernetes integration test starting
Kubernetes integration test starting
Test build #130721 has finished for PR 30266 at commit
Kubernetes integration test status failure
Kubernetes integration test status success
Kubernetes integration test starting
Kubernetes integration test status success
Kubernetes integration test starting
Kubernetes integration test status failure
@HyukjinKwon hi, please help me review this when you have time, thanks~
R/pkg/inst/profile/shell.R (outdated)
@@ -43,5 +43,8 @@
  cat(" /_/", "\n")
  cat("\n")

  cat("\nSpark context Web UI available at", SparkR::sparkR.uiWebUrl())
  cat("\nSpark context available as 'sc' (master = ", unlist(SparkR::sparkR.conf("spark.master")),
Let's remove this line. Spark context is a private API in SparkR.
Namely something like:
cat("\nSparkSession available as 'spark' (master = ", unlist(SparkR::sparkR.conf("spark.master")), ", app id = ", unlist(SparkR::sparkR.conf("spark.app.id")), ").", sep = "")
Looks good otherwise.
Kubernetes integration test starting
Test build #130785 has finished for PR 30266 at commit
Kubernetes integration test status success
Kubernetes integration test starting
Test build #130793 has finished for PR 30266 at commit
Kubernetes integration test status success
@HyukjinKwon done, please help review it again, thanks a lot.
Merged to master. |
What changes were proposed in this pull request?
Add prompt information showing the current application ID, Web UI URL, and master info when pyspark / sparkR starts.
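As a rough sketch of what this change prints on the pyspark side, the startup banner can be thought of as assembling a few lines from the session's master, application ID, and Web UI URL. The helper below is illustrative only (the function and parameter names are hypothetical, not the actual shell.py code):

```python
# Hypothetical sketch of the extra startup prompt this PR adds for pyspark.
# In a real session these values would come from sc.master, sc.applicationId,
# and sc.uiWebUrl on the SparkContext; here they are plain parameters.

def startup_banner(master, app_id, ui_url=None):
    """Build the prompt lines showing basic info about the current app."""
    lines = []
    if ui_url is not None:
        lines.append("Spark context Web UI available at {}".format(ui_url))
    lines.append(
        "SparkSession available as 'spark' (master = {}, app id = {}).".format(
            master, app_id
        )
    )
    return "\n".join(lines)


print(startup_banner("local[*]", "local-1604300000000", "http://localhost:4040"))
```

In a local run this would print the Web UI line followed by the SparkSession line, mirroring the information the patch adds to the shell greeting.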
Why are the changes needed?
The information printed when pyspark/sparkR starts does not include basic information about the current application, which is inconvenient when using pyspark/sparkR from a console such as a DOS prompt.
Does this PR introduce any user-facing change?
no
How was this patch tested?
manual test result shows below: