james-willis opened a new pull request, #4060:
URL: https://github.com/apache/datafusion-comet/pull/4060

   ## Which issue does this PR close?
   
   Closes #4059
   
   ## Rationale for this change
   
   Spark 4 officially supports both Java 17 and 21 
([ref](https://spark.apache.org/docs/4.0.1/)). The current Comet CI and 
compatibility docs only cover Java 17 for Spark 4.0, which leaves Java 21 — an 
LTS release, widely adopted across the Spark / Arrow / Netty / Iceberg / Delta 
stack — without official validation.
   
   This PR adds Java 21 as a supported runtime for the experimental Spark 4.0 
tier so downstream consumers can adopt it with an upstream signal.
   
   ## What changes are included in this PR?
   
   - `.github/workflows/pr_build_linux.yml`: add a `Spark 4.0, JDK 21` profile 
next to the existing `Spark 4.0, JDK 17` profile in both matrix sections. 
Extend the conditional `JAVA_TOOL_OPTIONS` so the existing `--add-opens` / 
`--add-exports` flags also apply under JDK 21. The flags remain valid there; 
Spark 4's own launcher scripts set the same set for both JDK versions.
   - `.github/workflows/spark_sql_test.yml`: add `{spark-short: '4.0', 
spark-full: '4.0.1', java: 21, scan-impl: 'auto'}` to the `config:` matrix. 
Mirror the existing `sql_hive-1` exclude (tracked under #2946) for the new JDK 
21 row.
   - `docs/source/user-guide/latest/installation.md`: update the Spark 4.0.1 
row of the experimental compatibility table from `17` to `17/21`.
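
   For illustration, the `spark_sql_test.yml` change might look roughly like 
the sketch below. The `config:` row is quoted from this description; the 
surrounding job name, matrix keys, and the exact shape of the `exclude:` entry 
are assumptions, not copied from the workflow file:

   ```yaml
   # Sketch only -- structure assumed, not verbatim from spark_sql_test.yml
   strategy:
     matrix:
       config:
         - {spark-short: '4.0', spark-full: '4.0.1', java: 17, scan-impl: 'auto'}
         # new row: same Spark 4.0 profile, JDK 21 runtime
         - {spark-short: '4.0', spark-full: '4.0.1', java: 21, scan-impl: 'auto'}
       exclude:
         # mirror of the existing sql_hive-1 exclude (#2946) for the JDK 21 row
         - config: {spark-short: '4.0', java: 21}
           module: sql_hive-1
   ```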
   
   Skipped because they have no Spark 4 rows to mirror:
   - `spark_sql_test_native_iceberg_compat.yml` (Spark 3.4/3.5 × Java 11 only)
   - `iceberg_spark_test.yml` (Spark 3.4/3.5 × Java 11/17 only)
   
   ## How are these changes tested?
   
   By the CI itself: the purpose of this PR is to add the test profile. The 
new matrix rows exercise the existing full test suite under a JDK 21 runtime, 
so any genuine incompatibility surfaces as a row-specific failure. If the JDK 
21 rows go green, Comet can legitimately claim JDK 21 support for Spark 4 in 
the compatibility matrix.
   
   The `--add-opens` / `--add-exports` flag set is the same one Spark 4 ships 
in its launcher scripts for both JDK 17 and 21, so no flag divergence is 
expected.
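
   As a sketch of the conditional extension in `pr_build_linux.yml` (the 
expression shape and the two flags shown are illustrative and abbreviated; the 
full flag list comes from Spark 4's launcher scripts, and the actual workflow 
condition may differ):

   ```yaml
   # Sketch only -- same flags now applied when matrix.java is 17 or 21
   env:
     JAVA_TOOL_OPTIONS: >-
       ${{ (matrix.java == 17 || matrix.java == 21)
           && '--add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED'
           || '' }}
   ```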


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
