andygrove opened a new pull request, #4079:
URL: https://github.com/apache/datafusion-comet/pull/4079
## Which issue does this PR close?
Closes #.
## Rationale for this change
The installation guide currently calls Spark 4.0 support "experimental" and
tells users it should not be used in production. CI now runs the Comet test
suite and the Spark SQL test suite against Spark 4.0, and the known gaps are
limited to a small set of features that fall back to Spark cleanly. The wording
overstates the risk and understates the coverage. We also have no single place
that documents the per-Spark-version gaps, so users have to infer them from
issue trackers and per-feature notes.
## What changes are included in this PR?
- Add a new `compatibility/spark-versions.md` page documenting:
  - Spark 4.0 ANSI mode coverage with fallback for unsupported cases
  - `VariantType` not yet supported (falls back)
  - Partial coverage of Parquet type widening (falls back for unsupported widenings)
  - A note that 3.4 / 3.5 have no version-specific gaps beyond the per-feature notes
- Wire the new page into `compatibility/index.md`
- In `installation.md`:
  - Merge the Spark 4.0.1 row into the main supported-versions table
  - Replace the "experimental ... not for production" paragraph with a positive note that links to the new compatibility page
  - Drop the "(Experimental)" suffix from the Spark 4.0 JAR link
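For reviewers unfamiliar with the docs build, the `compatibility/index.md` wiring could look roughly like the sketch below. This is an illustration only, assuming the Comet docs are built with Sphinx and a MyST-style toctree; the actual directive options and entry ordering in this PR may differ:

````markdown
<!-- compatibility/index.md: register the new page so it appears
     in the compatibility section's navigation tree -->
```{toctree}
:maxdepth: 1

spark-versions
```
````

With this entry in place, Sphinx resolves `spark-versions` relative to the `compatibility/` directory and fails the build if the file is missing, which is what the local review of "toctree wiring" checks for.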
## How are these changes tested?
Documentation only; no code changes. Reviewed locally for relative-link
correctness and toctree wiring.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]