improve the doc for "spark.memory.offHeap.size" #11561
Conversation
I didn't create a JIRA as I think it is not necessary... If the reviewer still prefers to have a JIRA, I can add one.
Test build #52564 has finished for PR 11561 at commit
@srowen mind reviewing it?
@@ -769,7 +769,7 @@ Apart from these, the following properties are also available, and may be useful
   <td><code>spark.memory.offHeap.size</code></td>
   <td>0</td>
   <td>
-    The absolute amount of memory which can be used for off-heap allocation.
+    The absolute amount of memory (in terms by bytes) which can be used for off-heap allocation.
Just "... memory in bytes which ..." is sufficient; "in terms by bytes" doesn't work. Otherwise it's fine.
Test build #52577 has finished for PR 11561 at commit
LGTM. Merging to master and 1.6.
The description of "spark.memory.offHeap.size" in the current document does not clearly state that memory is counted in bytes. This PR contains a small fix for this tiny issue (document fix).
Author: CodingCat <zhunansjtu@gmail.com>
Closes #11561 from CodingCat/master.
(cherry picked from commit a3ec50a)
Signed-off-by: Shixiong Zhu <shixiong@databricks.com>
Thanks, @CodingCat
What changes were proposed in this pull request?
The description of "spark.memory.offHeap.size" in the current document does not clearly state that memory is counted in bytes.
This PR contains a small fix for this tiny issue.
How was this patch tested?
Document fix.
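For context, here is a minimal sketch of how a user would set the property the PR documents. The 1 GiB value is an illustrative assumption, not part of this PR; note that `spark.memory.offHeap.size` takes effect only when `spark.memory.offHeap.enabled` is also set, and (as the clarified doc states) the size is given in bytes:

```
# spark-defaults.conf — illustrative values, not from the PR
spark.memory.offHeap.enabled   true
spark.memory.offHeap.size      1073741824   # 1 GiB, expressed in bytes
```

The same settings can also be passed on the command line via `spark-submit --conf spark.memory.offHeap.enabled=true --conf spark.memory.offHeap.size=1073741824`, which is why stating the unit explicitly in the doc matters.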