[SPARK-35645][PYTHON][DOCS] Merge contents and remove obsolete pages in Getting Started section

### What changes were proposed in this pull request?

This PR revises the installation page to describe `pip install pyspark[pandas_on_spark]`, and removes the separate pandas-on-Spark installation and videos/blog posts pages from the Getting Started section.
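For reference, a minimal smoke test of the documented extra might look like this (a sketch; it assumes a PySpark release that defines the `pandas_on_spark` extra, i.e. 3.2 or later):

```bash
# Install PySpark together with the dependencies needed for pandas API on Spark.
pip install "pyspark[pandas_on_spark]"

# Quick check that the pandas API on Spark entry point imports and runs.
python -c "import pyspark.pandas as ps; print(ps.DataFrame({'a': [1, 2, 3]}).sum())"
```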

### Why are the changes needed?

The pandas-on-Spark installation guide is merged into the PySpark installation page. As for the videos/blog posts page: the project is now named pandas API on Spark, so the old Koalas blog posts and videos are obsolete.

### Does this PR introduce _any_ user-facing change?

No. These docs have not been released yet, so end users are unaffected.

### How was this patch tested?

I manually built the docs and checked the output.

Closes #33018 from HyukjinKwon/SPARK-35645.

Authored-by: Hyukjin Kwon <gurwls223@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
HyukjinKwon authored and dongjoon-hyun committed Jun 22, 2021
1 parent ce53b71 commit 2704658
Showing 4 changed files with 5 additions and 283 deletions.
10 changes: 2 additions & 8 deletions python/docs/source/getting_started/index.rst
@@ -25,18 +25,12 @@ There are more guides shared with other languages such as
 `Quick Start <https://spark.apache.org/docs/latest/quick-start.html>`_ in Programming Guides
 at `the Spark documentation <https://spark.apache.org/docs/latest/index.html#where-to-go-from-here>`_.
 
+.. TODO(SPARK-35588): Merge PySpark quickstart and 10 minutes to pandas API on Spark.
 .. toctree::
    :maxdepth: 2
 
    install
    quickstart
+   ps_10mins
 
-For pandas API on Spark:
-
-.. toctree::
-   :maxdepth: 2
-
-   ps_install
-   ps_10mins
-   ps_videos_blogs

3 changes: 3 additions & 0 deletions python/docs/source/getting_started/install.rst
@@ -46,7 +46,10 @@ If you want to install extra dependencies for a specific component, you can inst
 
 .. code-block:: bash
 
     # Spark SQL
     pip install pyspark[sql]
 
+    # pandas API on Spark
+    pip install pyspark[pandas_on_spark]
+
 For PySpark with/without a specific Hadoop version, you can install it by using ``PYSPARK_HADOOP_VERSION`` environment variables as below:
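For reference, a hedged sketch of the `PYSPARK_HADOOP_VERSION` usage that the context line above refers to (the set of supported values, e.g. `2.7`, `3.2`, or `without`, varies by Spark release):

```bash
# Pick the bundled Hadoop version at pip-install time.
PYSPARK_HADOOP_VERSION=3.2 pip install pyspark
```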

145 changes: 0 additions & 145 deletions python/docs/source/getting_started/ps_install.rst

This file was deleted.

130 changes: 0 additions & 130 deletions python/docs/source/getting_started/ps_videos_blogs.rst

This file was deleted.
