andygrove opened a new pull request, #4334:
URL: https://github.com/apache/datafusion-comet/pull/4334

   ## Which issue does this PR close?
   
   No specific issue. Experiment.
   
   ## Rationale for this change
   
   This is the second of two parallel smoke tests for the Comet JVM UDF framework. Both add the same DateFormat `CometUDF` implementation and the same ScalaTest suite; they differ only in *which* Arrow package the UDF imports.
   
   - #4333 sits on top of #4325 and uses unshaded `org.apache.arrow.*` imports. That branch already passes locally.
   - This PR targets `apache/main` directly (no refactor) and uses **shaded** imports: `org.apache.comet.shaded.arrow.*`.
   
   The question the experiment answers: can a UDF author on current `main` sidestep the Arrow shading boundary by importing the relocated class names that the shade plugin produces at package time? The probable failure mode is that the shaded namespace does not exist in the reactor's pre-shade classpath, so the `spark` module fails to compile. CI will confirm.
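   As a concrete illustration of what the probe compiles, here is a minimal sketch of the shaded-import path. The `org.apache.comet.shaded.arrow.vector` class name comes straight from this PR's imports; the `org.apache.comet.shaded.arrow.memory.RootAllocator` location and the `ShadedImportProbe` object are assumptions made only for illustration. If the shaded namespace is absent from the pre-shade classpath, code like this should fail to compile inside the reactor, which is exactly the signal the experiment is after.

   ```scala
   // Hypothetical probe, not part of this PR. Assumes the shade plugin relocates
   // both arrow-vector and arrow-memory under org.apache.comet.shaded.arrow.
   import org.apache.comet.shaded.arrow.memory.RootAllocator // assumed relocation
   import org.apache.comet.shaded.arrow.vector.DateDayVector // named in this PR's imports

   object ShadedImportProbe {
     def main(args: Array[String]): Unit = {
       val allocator = new RootAllocator()
       val dates = new DateDayVector("date", allocator)
       try {
         dates.allocateNew()
         dates.setSafe(0, 19000) // an arbitrary day count since the Unix epoch
         dates.setValueCount(1)
         println(s"allocated ${dates.getValueCount} value(s) via shaded Arrow classes")
       } finally {
         dates.close()
         allocator.close()
       }
     }
   }
   ```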
   
   ## What changes are included in this PR?
   
   - `spark/src/main/scala/org/apache/comet/udf/builtin/DateFormatUdf.scala`: `CometUDF` implementation using `import org.apache.comet.shaded.arrow.vector.{DateDayVector, ValueVector, VarCharVector}`.
   - `spark/src/test/scala/org/apache/comet/udf/DateFormatUdfSuite.scala`: ScalaTest suite that allocates shaded Arrow vectors and exercises the UDF directly (a hedged sketch of the shape such a suite might take follows this list).
   - `.github/workflows/pr_build_{linux,macos}.yml`: registers the new suite under the `expressions` job.
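
   For readers who want a picture of what the suite exercises, below is a hedged sketch, not the suite in this PR. The class name `DateFormatUdfShadedSuite`, the shaded `RootAllocator` location, and the chosen test values are illustrative assumptions; the call into `DateFormatUdf` itself is omitted because its signature is not shown in this description.

   ```scala
   import java.nio.charset.StandardCharsets

   import org.scalatest.funsuite.AnyFunSuite

   // Shaded Arrow imports; the memory package location is an assumption,
   // the vector classes match the imports listed above.
   import org.apache.comet.shaded.arrow.memory.RootAllocator
   import org.apache.comet.shaded.arrow.vector.{DateDayVector, VarCharVector}

   class DateFormatUdfShadedSuite extends AnyFunSuite {

     test("allocate shaded Arrow vectors for date_format inputs") {
       val allocator = new RootAllocator()
       val dates = new DateDayVector("dates", allocator)
       val patterns = new VarCharVector("patterns", allocator)
       try {
         dates.allocateNew()
         patterns.allocateNew()

         dates.setSafe(0, 19000) // a day count since the Unix epoch
         patterns.setSafe(0, "yyyy-MM-dd".getBytes(StandardCharsets.UTF_8))
         dates.setValueCount(1)
         patterns.setValueCount(1)

         assert(dates.getValueCount == 1)
         assert(patterns.getValueCount == 1)
         // The actual call into DateFormatUdf is intentionally omitted here;
         // its signature is not part of this description.
       } finally {
         dates.close()
         patterns.close()
         allocator.close()
       }
     }
   }
   ```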
   
   ## How are these changes tested?
   
   Pushed directly to CI as a probe.

