[SPARK-40272][CORE]Support service port custom with range #37721
Conversation
core/src/main/scala/org/apache/spark/internal/config/package.scala
Can one of the admins verify this patch?
@HyukjinKwon Hi, would you mind taking another look?
@@ -2429,4 +2429,18 @@ package object config {
      .version("3.4.0")
      .timeConf(TimeUnit.MILLISECONDS)
      .createWithDefaultString("5s")

  private[spark] val CUSTOM_SERVICE_PORT_ORIGIN =
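For context, such an entry would presumably follow the `ConfigBuilder` pattern used throughout `package.scala`. The sketch below shows what the definition could look like; the doc string and the default value are assumptions, not the PR's actual code:

```scala
// Hypothetical sketch of the new config entry (doc text and default are assumed).
private[spark] val CUSTOM_SERVICE_PORT_ORIGIN =
  ConfigBuilder("spark.service.port.custom.origin")
    .doc("Starting port for Spark service ports. Together with " +
      "spark.port.maxRetries this bounds the usable range to " +
      "[origin, origin + spark.port.maxRetries].")
    .version("3.4.0")
    .intConf
    .createWithDefault(0) // assumed: 0 would keep the current ephemeral-port behaviour
```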
I suggest defining a port range, such as (min, max], and randomly choosing a port to try to bind within this range.
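The reviewer's alternative could be sketched as below; `bindInRange` and its parameters are illustrative names for this discussion, not Spark APIs:

```scala
import java.net.ServerSocket
import scala.util.Random

// Randomly pick ports in the half-open range (min, max] and try to bind,
// giving up after maxAttempts failures.
def bindInRange(min: Int, max: Int, maxAttempts: Int = 16): Option[ServerSocket] = {
  val rnd = new Random()
  (1 to maxAttempts).iterator.map { _ =>
    val port = min + 1 + rnd.nextInt(max - min) // uniform over (min, max]
    try Some(new ServerSocket(port))
    catch { case _: java.io.IOException => None } // port in use: try another
  }.collectFirst { case Some(s) => s }
}
```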
In some environments, the upper limit of available ports may not be 65535; for example, it may be limited to (40000, 50000].
Thanks for your review. This option was considered; the current design focuses on maximising reuse of the original logic and minimising the number of new configurations.
If the range needs to be limited to (40000, 50000], it can be controlled via spark.port.maxRetries.
> If the range needs to be limited to (40000, 50000], it can be controlled via spark.port.maxRetries.

I'm not sure whether using spark.port.maxRetries to control the port range is a good idea; it looks more like a trick.
We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.
What changes were proposed in this pull request?
Port range specification is supported by specifying a starting port and configuring the number of port retries. This introduces a new parameter: spark.service.port.custom.origin.
Services such as SparkUI.port, SparkDriver.port, NettyService.port, etc. start from spark.service.port.custom.origin, and the range is bounded by spark.port.maxRetries. The final port range is: spark.service.port.custom.origin <= port <= spark.service.port.custom.origin + spark.port.maxRetries.
E.g. with spark.service.port.custom.origin=49152 and spark.port.maxRetries=10, the actual port is greater than or equal to 49152 and less than 49163.
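Assuming the existing retry loop simply increments the port on each failed bind, the set of candidate ports can be enumerated as below (`candidatePorts` is an illustrative helper, not part of the PR):

```scala
// Enumerate the ports that would be tried: origin, origin + 1, ..., origin + maxRetries.
def candidatePorts(origin: Int, maxRetries: Int): Seq[Int] =
  (0 to maxRetries).map(origin + _)

// With origin = 49152 and maxRetries = 10 this yields 49152 through 49162,
// i.e. every candidate is >= 49152 and < 49163, matching the example above.
```

Under the same assumption, restricting services to (40000, 50000] as discussed in the review would amount to setting spark.service.port.custom.origin=40001 and spark.port.maxRetries=9999.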
Why are the changes needed?
In real production environments, there are special scenarios where we need to limit the range of service ports.
Does this PR introduce any user-facing change?
No
How was this patch tested?
New unit tests.