kevinjqliu commented on code in PR #2173:
URL: https://github.com/apache/iceberg-python/pull/2173#discussion_r2223441722


##########
pyiceberg/utils/schema_conversion.py:
##########
@@ -69,8 +69,10 @@
 LOGICAL_FIELD_TYPE_MAPPING: Dict[Tuple[str, str], PrimitiveType] = {
     ("date", "int"): DateType(),
     ("time-micros", "long"): TimeType(),
+    ("timestamp-millis", "long"): TimestampType(),

Review Comment:
   > That is a bit annoying for me now, because there is no other proper schema 
conversion functionality out there. That is really surprising, since Avro, 
Arrow and Iceberg are all Apache top-level projects, yet interoperability 
seems to be very lacking.
   
   I agree! Let's address this part; I'll create an issue to track it. I think 
we can provide helper functions for data type conversion.
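   As a rough sketch of what such a helper could look like (these names are 
hypothetical, not an existing pyiceberg API), a standalone lookup keyed on the 
(Avro logical type, Avro physical type) pair:
   
   ```python
   # Hypothetical helper sketch: resolve an Avro (logical, physical) type pair
   # to an Iceberg type name. None of these names exist in pyiceberg today.
   AVRO_TO_ICEBERG = {
       ("date", "int"): "date",
       ("time-micros", "long"): "time",
       ("timestamp-micros", "long"): "timestamp",
       # ms -> µs is a lossless widening, so this mapping is safe to add
       ("timestamp-millis", "long"): "timestamp",
   }

   def avro_logical_to_iceberg(logical: str, physical: str) -> str:
       """Return the Iceberg type name for an Avro logical/physical pair."""
       try:
           return AVRO_TO_ICEBERG[(logical, physical)]
       except KeyError:
           raise ValueError(f"no Iceberg mapping for {logical!r}/{physical!r}") from None
   ```
   
   A real version would return pyiceberg type instances rather than strings, 
but the shape of the API would be the same.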
   
   > The conversion from ms -> µs is lossless and safe. Python's datetime is in 
µs anyway, and even writing a pyarrow.timestamp('ms') into an Iceberg table 
implicitly converts it from ms to µs. I think this is a safe conversion.
   
   I also believe it is a safe conversion. I think we should strike a balance 
between providing the conversion functionality and adhering to the spec. The 
naming of `AvroSchemaConversion().avro_to_iceberg` is confusing in this sense, 
since it is specifically for converting Avro data types to those supported by 
the Iceberg spec.
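   
   For the record, the ms -> µs claim is easy to check in plain Python: scaling 
epoch milliseconds to microseconds is exact integer arithmetic, and `datetime` 
natively carries µs precision (the example value below is made up):
   
   ```python
   from datetime import datetime, timedelta, timezone

   millis = 1_721_700_000_123        # an Avro timestamp-millis value (epoch ms)
   micros = millis * 1000            # ms -> µs is exact integer scaling

   # the conversion round-trips with no loss of information
   assert micros // 1000 == millis

   # Python's datetime holds µs precision, so the value survives intact
   dt = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(microseconds=micros)
   assert dt.microsecond == (millis % 1000) * 1000
   ```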



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

