andygrove opened a new pull request, #4104:
URL: https://github.com/apache/datafusion-comet/pull/4104

   ## Which issue does this PR close?
   
   Closes #4102 (Option A).
   
   ## Rationale for this change
   
   The macOS PR workflow currently runs 3 profiles (Spark 3.4 / JDK 11 / Scala 
2.12, Spark 3.5 / JDK 17 / Scala 2.13, and Spark 4.0 / JDK 17 / Scala 2.13) 
across 7 test suites, yielding 21 jobs per PR. macOS runner minutes cost 
roughly 2x Linux minutes, making this the most expensive workflow per job.
   
   The macOS workflow exists primarily to catch platform-specific issues on 
Apple Silicon: native library loading, FFI, threading, and shuffle. Those 
concerns are largely independent of the Spark/Scala/JDK version. Full 
Spark/Java/Scala matrix coverage already happens on Linux in 
`pr_build_linux.yml`, so running 3 Spark profiles on macOS is duplicative for 
everything that is not platform-sensitive.
   
   ## What changes are included in this PR?
   
   - Remove the `Spark 3.4, JDK 11, Scala 2.12` and `Spark 3.5, JDK 17, Scala 
2.13` matrix entries from `.github/workflows/pr_build_macos.yml`, keeping only 
`Spark 4.0, JDK 17, Scala 2.13`.
   - Drop the now-dead conditional that skipped the `sql` suite under the Spark 
3.4 profile.
   - Update the comment on the `profile` block to explain the macOS-specific 
rationale.
   
   Reduces the macOS PR matrix from 21 jobs to 7.
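   As a rough sketch (not the verbatim workflow file; suite names and key 
names are illustrative), the trimmed `profile` block in 
`.github/workflows/pr_build_macos.yml` would look something like:
   
   ```yaml
   # Sketch of the reduced matrix in pr_build_macos.yml.
   # macOS coverage targets platform-specific behavior on Apple Silicon
   # (native library loading, FFI, threading, shuffle), which does not
   # vary with the Spark/Scala/JDK version. Full Spark/Java/Scala matrix
   # coverage runs on Linux in pr_build_linux.yml.
   strategy:
     matrix:
       profile:
         - spark-version: "4.0"
           java-version: "17"
           scala-version: "2.13"
       suite: [...]  # the existing 7 suites, unchanged
   ```
   
   With a single profile, the conditional that excluded the `sql` suite for 
Spark 3.4 has no remaining effect, which is why it is removed as well.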
   
   ## How are these changes tested?
   
   CI on this PR exercises the new matrix. If a macOS regression that depends 
on the Spark version slips through, a second profile can be added back.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
