matthias-Q commented on code in PR #2173:
URL: https://github.com/apache/iceberg-python/pull/2173#discussion_r2216665172


##########
pyiceberg/utils/schema_conversion.py:
##########
@@ -69,8 +69,10 @@
 LOGICAL_FIELD_TYPE_MAPPING: Dict[Tuple[str, str], PrimitiveType] = {
     ("date", "int"): DateType(),
     ("time-micros", "long"): TimeType(),
+    ("timestamp-millis", "long"): TimestampType(),

Review Comment:
   Thanks for the response. I was preparing a PR that would add 
`TimestampMilli`, similar to `TimestampNano`. I understand that it is not in 
the spec. 
   
   My initial use case is that I want to use the schema-conversion functions 
to create an Iceberg table based off an Avro schema. At the moment I use 
`AvroSchemaConversion.avro_to_iceberg().as_arrow()` to create the Arrow table 
that eventually goes into Iceberg. Maybe it would suffice to add some 
functionality here, since my data is actually a Python timestamp, which has 
microsecond precision anyway. For context: I want to consume a Kafka topic 
and upsert it into an Iceberg table.
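   For illustration, here is a minimal standalone sketch of the kind of 
logical-type lookup the hunk above extends. The string type names and the 
`resolve_logical_type` helper are hypothetical stand-ins for pyiceberg's real 
classes; per the Avro spec, `timestamp-millis` annotates a `long`:
   
   ```python
   # Keys are (logicalType, underlying Avro type) pairs, as in the PR's mapping.
   LOGICAL_FIELD_TYPE_MAPPING = {
       ("date", "int"): "DateType",
       ("time-micros", "long"): "TimeType",
       ("timestamp-micros", "long"): "TimestampType",
       # Proposed addition under discussion; the Avro spec says
       # timestamp-millis annotates long, not int.
       ("timestamp-millis", "long"): "TimestampType",
   }
   
   def resolve_logical_type(avro_field: dict):
       """Look up the Iceberg type for an Avro field's logicalType/type pair."""
       key = (avro_field.get("logicalType"), avro_field.get("type"))
       return LOGICAL_FIELD_TYPE_MAPPING.get(key)
   
   field = {"type": "long", "logicalType": "timestamp-millis"}
   print(resolve_logical_type(field))  # -> TimestampType
   ```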
   
   EDIT:
   Here is a gist 
https://gist.github.com/matthias-Q/87632a18301324e4bc3d02dd2c396210
   
   That also explains my initial confidence that we do not need a conversion.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: issues-h...@iceberg.apache.org