[SPARK-34494][SQL][DOCS] Move JSON data source options from Python and Scala into a single page #32204
Conversation
Resolved review thread on sql/core/src/main/scala/org/apache/spark/sql/streaming/DataStreamReader.scala
Resolved review thread on sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala
Merged to master.
<tr>
  <td><code>encoding</code></td>
  <td>None</td>
  <td>For reading, allows forcibly setting one of the standard basic or extended encodings for the JSON files, for example UTF-16BE or UTF-32LE. If None is set, the encoding of the input JSON is detected automatically when the multiLine option is set to <code>true</code>. For writing, specifies the encoding (charset) of the saved JSON files. If None is set, the default UTF-8 charset is used.</td>
</tr>
Also fix the docs properly, changing None to something else; that only applies to the Python side.
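As context for the `encoding` discussion above, a minimal sketch of how the option is set from the Scala API. This assumes an active `SparkSession` named `spark`; the paths are hypothetical.

```scala
// Reading: force a specific charset. When this option is unset and
// multiLine is true, the input encoding is detected automatically.
val df = spark.read
  .option("multiLine", "true")
  .option("encoding", "UTF-16BE")
  .json("/path/to/people.json")

// Writing: sets the charset of the saved JSON files (UTF-8 by default).
df.write
  .option("encoding", "UTF-16LE")
  .json("/path/to/output")
```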
* `DataFrameReader`
* `DataFrameWriter`
* `DataStreamReader`
* `DataStreamWriter`
Can you add JSON functions here too?
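The JSON functions the reviewer refers to accept the same options as a `Map`. A minimal sketch, assuming a DataFrame `df` with a string column `value`:

```scala
import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types.{IntegerType, StringType, StructType}

// Parse a JSON string column, passing a data source option through
// the options map of from_json.
val schema = new StructType()
  .add("name", StringType)
  .add("age", IntegerType)

val parsed = df.select(
  from_json(df("value"), schema, Map("allowComments" -> "true")).as("data")
)
```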
* `DataFrameReader`
* `DataFrameWriter`
* `DataStreamReader`
* `DataStreamWriter`
Also mention:
* `OPTIONS` clause at [CREATE TABLE USING DATA_SOURCE](sql-ref-syntax-ddl-create-table-datasource.html)
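For the `OPTIONS` clause mentioned above, a sketch of passing JSON options through `CREATE TABLE ... USING`; the table name and path are hypothetical.

```sql
-- JSON data source options can also be supplied in DDL via OPTIONS.
CREATE TABLE people
USING json
OPTIONS (path '/data/people.json', multiLine 'true', encoding 'UTF-16BE');
```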
What changes were proposed in this pull request?
This PR proposes to move the JSON data source options from Python, Scala and Java into a single page.
Why are the changes needed?
So far, the documentation for JSON data source options has been spread across the API documentation of each language. This makes the many options inconvenient to maintain, so it is more efficient to document all options on a single page and link to that page from each language's API.
Does this PR introduce any user-facing change?
Yes, the updated documents are shown below after this change:
How was this patch tested?
Manually built the docs and confirmed the page.