[SPARK-33256][PYTHON][DOCS] Clarify PySpark follows NumPy documentation style

### What changes were proposed in this pull request?

This PR adds a few lines about docstring style to document that PySpark follows [NumPy documentation style](https://numpydoc.readthedocs.io/en/latest/format.html). The migration to NumPy documentation style was completed in SPARK-32085.

Ideally we should have a page like https://pandas.pydata.org/docs/development/contributing_docstring.html, but I would like to leave that as future work.

### Why are the changes needed?

To tell developers that PySpark now follows NumPy documentation style.

### Does this PR introduce _any_ user-facing change?

No, this change only affects unreleased branches so far.

### How was this patch tested?

Manually tested via `make clean html` under `python/docs`:

![Screen Shot 2020-12-06 at 1 34 50 PM](https://user-images.githubusercontent.com/6477701/101271623-d5ce0380-37c7-11eb-93ac-da73caa50c37.png)

Closes #30622 from HyukjinKwon/SPARK-33256.

Authored-by: HyukjinKwon <gurwls223@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
HyukjinKwon authored and dongjoon-hyun committed Dec 6, 2020
1 parent e857e06 commit 5250841
Showing 1 changed file with 3 additions and 2 deletions: python/docs/source/development/contributing.rst
@@ -123,11 +123,12 @@ Annotations can be validated using ``dev/lint-python`` script or by invoking myp
Code Style Guide
----------------
Code and Docstring Guide
----------------------------------

Please follow the style of the existing codebase as is, which is virtually PEP 8 with one exception: lines can be up
to 100 characters in length, not 79.
For the docstring style, PySpark follows `NumPy documentation style <https://numpydoc.readthedocs.io/en/latest/format.html>`_.

Note that the naming of methods and variables in PySpark is similar to the case of the ``threading`` library in Python itself, where
the APIs were inspired by Java. PySpark also follows `camelCase` for exposed APIs that match with Scala and Java.
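For illustration, here is a minimal sketch of a docstring in NumPy documentation style as described above; the function ``columnMean`` is a hypothetical example (named in `camelCase` per the guide), not a real PySpark API:

```python
def columnMean(values):
    """
    Compute the arithmetic mean of a sequence of numbers.

    Parameters
    ----------
    values : list of float
        Numbers to average. Must be non-empty.

    Returns
    -------
    float
        The arithmetic mean of ``values``.

    Examples
    --------
    >>> columnMean([1.0, 2.0, 3.0])
    2.0
    """
    return sum(values) / len(values)
```

The ``Parameters``, ``Returns``, and ``Examples`` sections follow the numpydoc section layout, which Sphinx can render via the ``numpydoc`` or ``napoleon`` extensions.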
