This is an automated email from the ASF dual-hosted git repository.

ruifengz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new cebd4abe40bd [SPARK-41916][FOLLOW-UP] Restore python linter
cebd4abe40bd is described below

commit cebd4abe40bdf677e08fa7d6d2b7c2f9762d51c3
Author: Ruifeng Zheng <[email protected]>
AuthorDate: Wed Dec 17 19:55:21 2025 +0800

    [SPARK-41916][FOLLOW-UP] Restore python linter
    
    ### What changes were proposed in this pull request?
    Restore the Python linter by reformatting the file that fails the `black` check:
    
    ```
    starting black test...
    black checks failed:
    would reformat /__w/spark/spark/python/pyspark/ml/torch/tests/test_distributor.py
    ```
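
    The failure above means `black` found a file whose formatting does not match its style. A minimal sketch of the fix, assuming `black` is installed and run from the repository root (the line length of 100 is an assumption for illustration; the authoritative setting lives in Spark's dev tooling config):

    ```
    # Rewrite the offending file in place to match black's style
    black --line-length 100 python/pyspark/ml/torch/tests/test_distributor.py

    # Re-run in check-only mode: exits non-zero if any file would be reformatted
    black --check --line-length 100 python/pyspark/ml/torch/tests/test_distributor.py
    ```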
    
    ### Why are the changes needed?
    To restore CI, which is currently failing on the `black` format check.
    
    ### Does this PR introduce _any_ user-facing change?
    no
    
    ### How was this patch tested?
    Manually checked.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    no
    
    Closes #53505 from zhengruifeng/reformat_test_distributor.
    
    Authored-by: Ruifeng Zheng <[email protected]>
    Signed-off-by: Ruifeng Zheng <[email protected]>
---
 python/pyspark/ml/torch/tests/test_distributor.py | 17 ++++++++++-------
 1 file changed, 10 insertions(+), 7 deletions(-)

diff --git a/python/pyspark/ml/torch/tests/test_distributor.py b/python/pyspark/ml/torch/tests/test_distributor.py
index 252b02028c6b..80dd9e67f3a8 100644
--- a/python/pyspark/ml/torch/tests/test_distributor.py
+++ b/python/pyspark/ml/torch/tests/test_distributor.py
@@ -306,17 +306,20 @@ class TorchDistributorBaselineUnitTestsMixin:
         )
         self.delete_env_vars(input_env_vars)
 
-    @patch.dict(os.environ, {
-        "CUDA_VISIBLE_DEVICES": "0,1,2,3",
-        "MASTER_ADDR": "11.22.33.44",
-        "MASTER_PORT": "6677",
-        "RANK": "1",
-    })
+    @patch.dict(
+        os.environ,
+        {
+            "CUDA_VISIBLE_DEVICES": "0,1,2,3",
+            "MASTER_ADDR": "11.22.33.44",
+            "MASTER_PORT": "6677",
+            "RANK": "1",
+        },
+    )
     def test_multi_gpu_node_get_torchrun_args(self):
         torchrun_args, processes_per_node = TorchDistributor._get_torchrun_args(False, 8)
         self.assertEqual(
             torchrun_args,
-            ['--nnodes=2', '--node_rank=1', '--rdzv_endpoint=11.22.33.44:6677', '--rdzv_id=0']
+            ["--nnodes=2", "--node_rank=1", "--rdzv_endpoint=11.22.33.44:6677", "--rdzv_id=0"],
         )
         self.assertEqual(processes_per_node, 4)
 


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
