This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 0248b32 [SPARK-31382][BUILD] Show a better error message for different python and pip installation mistake
0248b32 is described below
commit 0248b329726527a07d688122f56dd2ada0e51337
Author: HyukjinKwon <[email protected]>
AuthorDate: Thu Apr 9 11:04:35 2020 +0900
[SPARK-31382][BUILD] Show a better error message for different python and pip installation mistake
### What changes were proposed in this pull request?
This PR proposes to show a better error message when a user mistakenly installs `pyspark` from PIP but the default `python` does not point to the interpreter that the corresponding `pip` installed it into. See https://stackoverflow.com/questions/46286436/running-pyspark-after-pip-install-pyspark/49587560 as an example.
It can be reproduced as below. I have two Python executables: `python` is Python 3.7 (and `pip` binds to it), while `python2.7` is Python 2.7.
```bash
pip install pyspark
```
```bash
pyspark
```
```
...
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.4.5
      /_/

Using Python version 3.7.3 (default, Mar 27 2019 09:23:15)
SparkSession available as 'spark'.
...
```
```bash
PYSPARK_PYTHON=python2.7 pyspark
```
```
Could not find valid SPARK_HOME while searching ['/Users', '/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin']
/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin/pyspark: line 24: /bin/load-spark-env.sh: No such file or directory
/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin/pyspark: line 77: /bin/spark-submit: No such file or directory
/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin/pyspark: line 77: exec: /bin/spark-submit: cannot execute: No such file or directory
```
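You can confirm the binding mismatch directly. Below is a minimal diagnostic sketch (not part of this patch; run it with each of your Python executables):

```python
# Diagnostic sketch: show which interpreter is running and whether it can
# import pyspark. With the setup above, 'python' succeeds and 'python2.7' fails.
import sys

print("Interpreter:", sys.executable)
try:
    import pyspark
    print("pyspark found at:", pyspark.__file__)
except ImportError:
    print("pyspark is not importable from this interpreter")
```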
### Why are the changes needed?
There are multiple questions out there about this error from users who have no idea what's going on. See:
- https://stackoverflow.com/questions/46286436/running-pyspark-after-pip-install-pyspark/49587560
- https://stackoverflow.com/questions/45991888/path-issue-could-not-find-valid-spark-home-while-searching
- https://stackoverflow.com/questions/49707239/pyspark-could-not-find-valid-spark-home
- https://stackoverflow.com/questions/55569985/pyspark-could-not-find-valid-spark-home
- https://stackoverflow.com/questions/48296474/error-could-not-find-valid-spark-home-while-searching-pycharm-in-windows
- https://github.com/ContinuumIO/anaconda-issues/issues/8076
The answer is usually to set `SPARK_HOME`; however, this isn't completely correct.

Setting `SPARK_HOME` works because the `pyspark` executable script imports the library directly via `SPARK_HOME` (see https://github.com/apache/spark/blob/master/bin/pyspark#L52-L53) instead of the package location resolved by the `python` executable. So this way you end up using a package installed under a different Python, which isn't ideal.
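For reference, the lookup that `find_spark_home.py` performs on Python 3 boils down to resolving the installed package location via `importlib` (see the diff below). Here is a minimal sketch of that logic:

```python
# Sketch of the Python 3 branch of find_spark_home.py: resolve the installed
# pyspark package and derive SPARK_HOME candidate paths from its location.
import os
from importlib.util import find_spec

spec = find_spec("pyspark")
if spec is not None:
    module_home = os.path.dirname(spec.origin)
    # The module directory itself, plus its grandparent for "edge mode"
    # (editable/source checkout) installs.
    print([os.path.abspath(module_home),
           os.path.abspath(os.path.join(module_home, "../../"))])
else:
    print("pyspark is not installed for this interpreter")
```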
### Does this PR introduce any user-facing change?
Yes, it improves the error message.
**Before:**
```
Could not find valid SPARK_HOME while searching ['/Users', '/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin']
...
```
**After:**
```
Could not find valid SPARK_HOME while searching ['/Users', '/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/bin']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default 'python' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, 'python -c "import pyspark"'.

If you cannot import, you can install by using the Python executable directly,
for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
explicitly set the Python executable that has PySpark installed to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
'PYSPARK_PYTHON=python3 pyspark'.
...
```
### How was this patch tested?
Manually tested as described above.
Closes #28152 from HyukjinKwon/SPARK-31382.
Authored-by: HyukjinKwon <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
---
python/pyspark/find_spark_home.py | 18 ++++++++++++++++--
1 file changed, 16 insertions(+), 2 deletions(-)
diff --git a/python/pyspark/find_spark_home.py b/python/pyspark/find_spark_home.py
index 9c4ed46..52f6ea9 100755
--- a/python/pyspark/find_spark_home.py
+++ b/python/pyspark/find_spark_home.py
@@ -40,6 +40,7 @@ def _find_spark_home():
     paths = ["../", os.path.dirname(os.path.realpath(__file__))]
 
     # Add the path of the PySpark module if it exists
+    import_error_raised = False
     if sys.version < "3":
         import imp
         try:
@@ -49,7 +50,7 @@ def _find_spark_home():
             paths.append(os.path.join(module_home, "../../"))
         except ImportError:
             # Not pip installed no worries
-            pass
+            import_error_raised = True
     else:
         from importlib.util import find_spec
         try:
@@ -59,7 +60,7 @@ def _find_spark_home():
             paths.append(os.path.join(module_home, "../../"))
         except ImportError:
             # Not pip installed no worries
-            pass
+            import_error_raised = True
 
     # Normalize the paths
     paths = [os.path.abspath(p) for p in paths]
@@ -68,6 +69,19 @@ def _find_spark_home():
         return next(path for path in paths if is_spark_home(path))
     except StopIteration:
         print("Could not find valid SPARK_HOME while searching {0}".format(paths), file=sys.stderr)
+        if import_error_raised:
+            print(
+                "\nDid you install PySpark via a package manager such as pip or Conda? If so,\n"
+                "PySpark was not found in your Python environment. It is possible your\n"
+                "Python environment does not properly bind with your package manager.\n"
+                "\nPlease check your default 'python' and if you set PYSPARK_PYTHON and/or\n"
+                "PYSPARK_DRIVER_PYTHON environment variables, and see if you can import\n"
+                "PySpark, for example, 'python -c \"import pyspark\"'.\n"
+                "\nIf you cannot import, you can install by using the Python executable directly,\n"
+                "for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also\n"
+                "explicitly set the Python executable that has PySpark installed to\n"
+                "PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,\n"
+                "'PYSPARK_PYTHON=python3 pyspark'.\n", file=sys.stderr)
         sys.exit(-1)
 
 if __name__ == "__main__":
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]