This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 8e36a8f  [SPARK-32419][PYTHON][BUILD] Avoid using subshell for Conda env (de)activation in pip packaging test
8e36a8f is described below

commit 8e36a8f33fb9d20c481e9eeb0ea8155aa1569439
Author: HyukjinKwon <[email protected]>
AuthorDate: Sat Jul 25 13:09:23 2020 +0900

    [SPARK-32419][PYTHON][BUILD] Avoid using subshell for Conda env (de)activation in pip packaging test
    
    ### What changes were proposed in this pull request?
    
    This PR proposes to avoid using a subshell when activating the Conda environment. Currently the fallback runs inside a subshell, so the environment ends up activated only within that subshell, even when the `conda` command is used.
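    
    As a minimal illustration (not part of this patch), environment changes made inside a `( ... )` subshell do not survive it:
    
    ```
    FOO=parent
    (FOO=child)        # the assignment happens in a subshell
    echo "$FOO"        # prints "parent": the subshell's change is lost
    
    # Likewise, 'conda activate' run inside ( ... ) only modifies PATH and
    # CONDA_PREFIX within the subshell; the parent shell keeps its old
    # environment, which is why the fallback never took effect.
    ```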
    
    ### Why are the changes needed?
    
    If you take a close look at the GitHub Actions log:
    
    ```
     Installing dist into virtual env
    Processing ./python/dist/pyspark-3.1.0.dev0.tar.gz
    Collecting py4j==0.10.9
     Downloading py4j-0.10.9-py2.py3-none-any.whl (198 kB)
    Using legacy setup.py install for pyspark, since package 'wheel' is not installed.
    Installing collected packages: py4j, pyspark
     Running setup.py install for pyspark: started
     Running setup.py install for pyspark: finished with status 'done'
    Successfully installed py4j-0.10.9 pyspark-3.1.0.dev0
    
    ...
    
    Installing dist into virtual env
    Obtaining file:///home/runner/work/spark/spark/python
    Collecting py4j==0.10.9
     Downloading py4j-0.10.9-py2.py3-none-any.whl (198 kB)
    Installing collected packages: py4j, pyspark
     Attempting uninstall: py4j
     Found existing installation: py4j 0.10.9
     Uninstalling py4j-0.10.9:
     Successfully uninstalled py4j-0.10.9
     Attempting uninstall: pyspark
     Found existing installation: pyspark 3.1.0.dev0
     Uninstalling pyspark-3.1.0.dev0:
     Successfully uninstalled pyspark-3.1.0.dev0
     Running setup.py develop for pyspark
    Successfully installed py4j-0.10.9 pyspark
    ```
    
    It looks like Conda is not being used properly: the second install uninstalls the previously installed packages and reinstalls them, which means both runs target the same environment rather than a fresh Conda one.
    We should ideally test with a Conda environment as intended.
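    
    A quick way to confirm which environment is actually active (a sketch reusing the script's variables; `CONDA_PREFIX` is set by Conda on activation) is:
    
    ```
    # After a successful activation, CONDA_PREFIX should point at the
    # environment created by 'conda create -p "$VIRTUALENV_PATH"'.
    echo "$CONDA_PREFIX"
    which python    # should resolve inside $VIRTUALENV_PATH, not the base env
    ```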
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, dev-only.
    
    ### How was this patch tested?
    
    GitHub Actions will test it. I also tested it manually on my local machine.
    
    Closes #29212 from HyukjinKwon/SPARK-32419.
    
    Authored-by: HyukjinKwon <[email protected]>
    Signed-off-by: HyukjinKwon <[email protected]>
---
 dev/run-pip-tests | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/dev/run-pip-tests b/dev/run-pip-tests
index be96ed9..b322d3f 100755
--- a/dev/run-pip-tests
+++ b/dev/run-pip-tests
@@ -85,7 +85,7 @@ for python in "${PYTHON_EXECS[@]}"; do
         source "$CONDA_PREFIX/etc/profile.d/conda.sh"
       fi
      conda create -y -p "$VIRTUALENV_PATH" python=$python numpy pandas pip setuptools
-      source activate "$VIRTUALENV_PATH" || (echo "Falling back to 'conda activate'" && conda activate "$VIRTUALENV_PATH")
+      source activate "$VIRTUALENV_PATH" || conda activate "$VIRTUALENV_PATH"
     else
       mkdir -p "$VIRTUALENV_PATH"
       virtualenv --python=$python "$VIRTUALENV_PATH"
@@ -128,7 +128,7 @@ for python in "${PYTHON_EXECS[@]}"; do
 
     # conda / virtualenv environments need to be deactivated differently
     if [ -n "$USE_CONDA" ]; then
-      source deactivate || (echo "Falling back to 'conda deactivate'" && conda deactivate)
+      source deactivate || conda deactivate
     else
       deactivate
     fi
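
A note on the shape of the fix: the removed `( ... )` wrapper runs its commands in a subshell, so even when the fallback `conda activate` succeeded, the activation never reached the parent shell. If the diagnostic echo were still wanted, brace grouping would keep the fallback in the current shell (a sketch, not part of the patch):

```
source activate "$VIRTUALENV_PATH" || { echo "Falling back to 'conda activate'"; conda activate "$VIRTUALENV_PATH"; }
```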

