andygrove commented on code in PR #4039:
URL: https://github.com/apache/datafusion-comet/pull/4039#discussion_r3127477597
##########
spark/src/main/scala/org/apache/comet/serde/datetime.scala:
##########
@@ -295,11 +295,9 @@ object CometSecond extends CometExpressionSerde[Second] {
object CometUnixTimestamp extends CometExpressionSerde[UnixTimestamp] {
private def isSupportedInputType(expr: UnixTimestamp): Boolean = {
- // Note: TimestampNTZType is not supported because Comet incorrectly applies
- // timezone conversion to TimestampNTZ values. TimestampNTZ stores local time
- // without timezone, so no conversion should be applied.
expr.children.head.dataType match {
case TimestampType | DateType => true
+ case dt if dt.typeName == "timestamp_ntz" => true
Review Comment:
We can just match on `TimestampNTZType` rather than use `typeName`. I know we
had to do this in the past, but Spark 3.4 and later have `TimestampNTZType`.
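
A minimal self-contained sketch of the suggested match (stand-in case objects
are used here in place of Spark's `org.apache.spark.sql.types` classes, which
in the real code come from Spark 3.4+):

```scala
// Stand-ins for Spark's DataType hierarchy; in datetime.scala these are the
// real Spark types, and Spark 3.4+ ships TimestampNTZType as a case object.
sealed trait DataType { def typeName: String }
case object TimestampType extends DataType { val typeName = "timestamp" }
case object DateType extends DataType { val typeName = "date" }
case object TimestampNTZType extends DataType { val typeName = "timestamp_ntz" }
case object StringType extends DataType { val typeName = "string" }

// Matching on the case object directly, as suggested, rather than comparing
// the typeName string with a guard.
def isSupportedInputType(dt: DataType): Boolean = dt match {
  case TimestampType | DateType | TimestampNTZType => true
  case _ => false
}
```

Matching on the case object is checked by the compiler, whereas a string
comparison on `typeName` silently stops matching if the name ever changes.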
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]