[SPARK-31382][BUILD] Show a better error message for different python and pip installation mistake

### What changes were proposed in this pull request?

This PR proposes to show a better error message when a user mistakenly installs `pyspark` from pip but the default `python` does not point to the corresponding `pip`. See https://stackoverflow.com/questions/46286436/running-pyspark-after-pip-install-pyspark/49587560 as an example.

It can be reproduced as below:

I have two Python executables: `python` is Python 3.7, `pip` binds to Python 3.7, and `python2.7` is Python 2.7.

```bash
pip install pyspark
```

```bash
pyspark
```

```
...
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.4.5
      /_/

Using Python version 3.7.3 (default, Mar 27 2019 09:23:15)
SparkSession available as 'spark'.
...
```

```bash
PYSPARK_PYTHON=python2.7 pyspark
```

```
Could not find valid SPARK_HOME while searching ['/Users', '/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin']
/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin/pyspark: line 24: /bin/load-spark-env.sh: No such file or directory
/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin/pyspark: line 77: /bin/spark-submit: No such file or directory
/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin/pyspark: line 77: exec: /bin/spark-submit: cannot execute: No such file or directory
```
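
The failure is a per-interpreter visibility problem: `find_spark_home.py` runs under whichever Python launches it, and each interpreter resolves `import pyspark` against its own site-packages. As an illustrative sketch (not part of this patch; `python2.7` is just the second interpreter from the example setup above and may not exist on your machine), you can run the same import under different interpreters:

```python
import subprocess
import sys

# Try the same import under each interpreter; only the one whose
# site-packages received the pip install will succeed.
for exe in [sys.executable, "python2.7"]:  # adjust to your local interpreters
    try:
        proc = subprocess.run(
            [exe, "-c", "import pyspark; print(pyspark.__file__)"],
            capture_output=True, text=True,
        )
    except FileNotFoundError:
        print(exe, "-> interpreter not found")
        continue
    print(exe, "->", proc.stdout.strip() if proc.returncode == 0 else "ImportError")
```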

### Why are the changes needed?

There are multiple questions about this error from users who have no idea what is going on. See:

- https://stackoverflow.com/questions/46286436/running-pyspark-after-pip-install-pyspark/49587560
- https://stackoverflow.com/questions/45991888/path-issue-could-not-find-valid-spark-home-while-searching
- https://stackoverflow.com/questions/49707239/pyspark-could-not-find-valid-spark-home
- https://stackoverflow.com/questions/55569985/pyspark-could-not-find-valid-spark-home
- https://stackoverflow.com/questions/48296474/error-could-not-find-valid-spark-home-while-searching-pycharm-in-windows
- ContinuumIO/anaconda-issues#8076

The usual answer is to set `SPARK_HOME`; however, this isn't completely correct.

Setting `SPARK_HOME` works because the `pyspark` executable script imports the library directly via `SPARK_HOME` (see https://github.com/apache/spark/blob/master/bin/pyspark#L52-L53) instead of the default package location resolved by the `python` executable. So this way you end up using a package installed for a different Python, which isn't ideal.
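
As a quick diagnostic for this situation (not part of the patch; just a sketch of the check the new error message asks the user to perform), you can print which interpreter is running and where, if anywhere, it finds PySpark:

```python
import sys

# Which interpreter is this, and can it see PySpark?
print("interpreter:", sys.executable)
try:
    import pyspark
    print("pyspark found at:", pyspark.__file__)
except ImportError:
    print("pyspark is not importable here; try 'python -m pip install pyspark' "
          "with this exact executable")
```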

### Does this PR introduce any user-facing change?

Yes, it improves the error message.

**Before:**

```
Could not find valid SPARK_HOME while searching ['/Users', '/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin']
...
```

**After:**

```
Could not find valid SPARK_HOME while searching ['/Users', '/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default 'python' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, 'python -c "import pyspark"'.

If you cannot import, you can install by using the Python executable directly,
for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
explicitly set the Python executable that has PySpark installed to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
'PYSPARK_PYTHON=python3 pyspark'.
...
```

### How was this patch tested?

Manually tested as described above.

Closes #28152 from HyukjinKwon/SPARK-31382.

Authored-by: HyukjinKwon <gurwls223@apache.org>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>
(cherry picked from commit 0248b32)
Signed-off-by: HyukjinKwon <gurwls223@apache.org>
HyukjinKwon committed Apr 9, 2020
1 parent 2221d3e commit 86e8dbf
Showing 1 changed file: python/pyspark/find_spark_home.py (16 additions, 2 deletions)
```diff
@@ -40,6 +40,7 @@ def is_spark_home(path):
     paths = ["../", os.path.dirname(os.path.realpath(__file__))]
 
     # Add the path of the PySpark module if it exists
+    import_error_raised = False
     if sys.version < "3":
         import imp
         try:
@@ -49,7 +50,7 @@ def is_spark_home(path):
             paths.append(os.path.join(module_home, "../../"))
         except ImportError:
             # Not pip installed no worries
-            pass
+            import_error_raised = True
     else:
         from importlib.util import find_spec
         try:
@@ -59,7 +60,7 @@ def is_spark_home(path):
             paths.append(os.path.join(module_home, "../../"))
         except ImportError:
             # Not pip installed no worries
-            pass
+            import_error_raised = True
 
     # Normalize the paths
     paths = [os.path.abspath(p) for p in paths]
@@ -68,6 +69,19 @@ def is_spark_home(path):
         return next(path for path in paths if is_spark_home(path))
     except StopIteration:
         print("Could not find valid SPARK_HOME while searching {0}".format(paths), file=sys.stderr)
+        if import_error_raised:
+            print(
+                "\nDid you install PySpark via a package manager such as pip or Conda? If so,\n"
+                "PySpark was not found in your Python environment. It is possible your\n"
+                "Python environment does not properly bind with your package manager.\n"
+                "\nPlease check your default 'python' and if you set PYSPARK_PYTHON and/or\n"
+                "PYSPARK_DRIVER_PYTHON environment variables, and see if you can import\n"
+                "PySpark, for example, 'python -c \"import pyspark\"'.\n"
+                "\nIf you cannot import, you can install by using the Python executable directly,\n"
+                "for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also\n"
+                "explicitly set the Python executable that has PySpark installed to\n"
+                "PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,\n"
+                "'PYSPARK_PYTHON=python3 pyspark'.\n", file=sys.stderr)
         sys.exit(-1)
 
 if __name__ == "__main__":
```
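
For readers skimming the hunks above, here is a condensed, self-contained sketch of the detection logic after this patch (Python 3 branch only; `candidate_spark_homes` is a hypothetical name, and since `find_spec` returns `None` for a missing top-level module, the sketch checks for `None` explicitly rather than catching an exception):

```python
import os
import sys
from importlib.util import find_spec

def candidate_spark_homes():
    """Return (candidate SPARK_HOME paths, whether the PySpark module was missing)."""
    paths = ["../", os.path.dirname(os.path.realpath(__file__))]
    import_error_raised = False
    spec = find_spec("pyspark")  # None when pyspark is not installed for this Python
    if spec is not None:
        module_home = os.path.dirname(spec.origin)
        paths.append(module_home)
        # If installed in editable ("edge") mode, also consider the repo root.
        paths.append(os.path.join(module_home, "../../"))
    else:
        import_error_raised = True  # this flag is what triggers the new hint
    return [os.path.abspath(p) for p in paths], import_error_raised

if __name__ == "__main__":
    paths, missing = candidate_spark_homes()
    print("candidates:", paths)
    if missing:
        print("PySpark not importable; check how 'python' binds to 'pip'",
              file=sys.stderr)
```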
