This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new bd29569c5ddd [SPARK-46495][INFRA] Merge pyspark-errors to pyspark-core
bd29569c5ddd is described below
commit bd29569c5ddd494555c5d55c8a1719ec6b40f7e1
Author: Ruifeng Zheng <[email protected]>
AuthorDate: Sun Dec 24 14:24:09 2023 -0800
[SPARK-46495][INFRA] Merge pyspark-errors to pyspark-core
### What changes were proposed in this pull request?
Merge `pyspark-errors` to `pyspark-core`
### Why are the changes needed?
`pyspark-errors` (as well as the Python packaging test) is pretty fast; it
was initially factored out into its own job to work around a disk space issue.
Now we can merge it back into the other tests to save one test job.
### Does this PR introduce _any_ user-facing change?
no, infra-only
### How was this patch tested?
CI
Manually checked the merged job:
https://github.com/zhengruifeng/spark/actions/runs/7312132932/job/19922676199
### Was this patch authored or co-authored using generative AI tooling?
no
Closes #44470 from zhengruifeng/infra_py_error.
Authored-by: Ruifeng Zheng <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
.github/workflows/build_and_test.yml | 8 +++-----
1 file changed, 3 insertions(+), 5 deletions(-)
diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 7b199a2456bc..2385da332002 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -345,12 +345,10 @@ jobs:
java:
- ${{ inputs.java }}
modules:
- - >-
- pyspark-errors
- >-
pyspark-sql, pyspark-resource, pyspark-testing
- >-
- pyspark-core, pyspark-streaming
+ pyspark-core, pyspark-errors, pyspark-streaming
- >-
pyspark-mllib, pyspark-ml, pyspark-ml-connect
- >-
@@ -436,7 +434,7 @@ jobs:
python3.9 -m pip list
pypy3 -m pip list
- name: Install Conda for pip packaging test
- if: ${{ matrix.modules == 'pyspark-errors' }}
+ if: contains(matrix.modules, 'pyspark-errors')
run: |
curl -s https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh > miniconda.sh
bash miniconda.sh -b -p $HOME/miniconda
@@ -446,7 +444,7 @@ jobs:
env: ${{ fromJSON(inputs.envs) }}
shell: 'script -q -e -c "bash {0}"'
run: |
- if [[ "$MODULES_TO_TEST" == "pyspark-errors" ]]; then
+ if [[ "$MODULES_TO_TEST" == *"pyspark-errors"* ]]; then
export PATH=$PATH:$HOME/miniconda/bin
export SKIP_PACKAGING=false
echo "Python Packaging Tests Enabled!"
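The second hunk above switches the bash condition from exact string equality to
a substring glob, mirroring the `contains(matrix.modules, 'pyspark-errors')`
change in the workflow expression. A minimal bash sketch of why that is needed,
using a hypothetical `MODULES_TO_TEST` value shaped like the merged matrix entry:

```shell
#!/usr/bin/env bash
# Hypothetical value produced by the merged matrix entry in this patch.
MODULES_TO_TEST="pyspark-core, pyspark-errors, pyspark-streaming"

# Old check: exact equality no longer matches the merged entry.
if [[ "$MODULES_TO_TEST" == "pyspark-errors" ]]; then
  echo "old condition: match"
else
  echo "old condition: no match"
fi

# New check: substring glob, the bash analog of the workflow's
# contains(matrix.modules, 'pyspark-errors') expression.
if [[ "$MODULES_TO_TEST" == *"pyspark-errors"* ]]; then
  echo "new condition: match"
fi
```

Without this change, the packaging-test steps would silently stop running once
`pyspark-errors` was folded into the combined module list.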
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]