asfimport opened a new issue, #71:
URL: https://github.com/apache/arrow-go/issues/71

   I have created a small repro to illustrate this bug: 
https://gist.github.com/phillipleblanc/5e3e2d0e6914d276cf9fd79e019581de
   
   When writing a Decimal128 array to a Parquet file, the pqarrow package 
prefers to use DictFixedLenByteArrayEncoder. If the size of the array goes over 
some threshold, it switches to PlainFixedLenByteArrayEncoder.
   
   The DictFixedLenByteArrayEncoder tolerates null values in a Decimal128 array 
even when the arrow schema is set to Nullable: false; the 
PlainFixedLenByteArrayEncoder, however, does not tolerate null values and 
panics.
   
   Having null values in an array marked as non-nullable is an issue in the 
user code. However, it was surprising that my buggy code sometimes worked and 
sometimes did not. I would expect either the PlainFixedLen encoder to handle 
nulls the same way as the DictFixedLen encoder, or the DictFixedLen encoder to 
panic as well.
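   To illustrate the inconsistency, here is a minimal, self-contained sketch 
(not the actual pqarrow code; `dictEncode`, `plainEncode`, and 
`fallbackThreshold` are hypothetical names): one encoder skips nils, the other 
dereferences unconditionally, and a size-based fallback silently switches 
between them:

   ```go
   package main

   import "fmt"

   // fallbackThreshold is a stand-in for the point at which pqarrow
   // abandons dictionary encoding for plain encoding.
   const fallbackThreshold = 4

   // dictEncode skips nil entries, mirroring how
   // DictFixedLenByteArrayEncoder tolerates nulls even when the
   // schema says Nullable: false.
   func dictEncode(vals []*int64) []int64 {
   	var out []int64
   	for _, v := range vals {
   		if v != nil {
   			out = append(out, *v)
   		}
   	}
   	return out
   }

   // plainEncode dereferences every entry unconditionally, mirroring
   // the panic in PlainFixedLenByteArrayEncoder on an unexpected null.
   func plainEncode(vals []*int64) []int64 {
   	out := make([]int64, len(vals))
   	for i, v := range vals {
   		out[i] = *v // panics if v == nil
   	}
   	return out
   }

   // encode switches encoders on input size, like the
   // dictionary-to-plain fallback when writing a large array.
   func encode(vals []*int64) []int64 {
   	if len(vals) <= fallbackThreshold {
   		return dictEncode(vals)
   	}
   	return plainEncode(vals)
   }

   func main() {
   	one := int64(1)
   	// Under the threshold: nulls are silently dropped.
   	fmt.Println(encode([]*int64{&one, nil, &one}))

   	// Over the threshold: the same null now panics.
   	defer func() {
   		if r := recover(); r != nil {
   			fmt.Println("panic from plain encoder:", r)
   		}
   	}()
   	fmt.Println(encode([]*int64{&one, nil, &one, &one, &one, &one}))
   }
   ```

   The same buggy input succeeds or panics depending only on array size, which 
is the surprising behavior described above.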
   
   One observation: most other array types handle nulls when writing to 
Parquet even with the schema marked as non-nullable; this was the first 
instance I found in the pqarrow package where marking the Arrow schema as 
Nullable was required to write arrays containing null values. Again, it is 
debatable whether this is desirable.
   
   **Reporter**: [Phillip 
LeBlanc](https://issues.apache.org/jira/browse/ARROW-17133)
   
   <sub>**Note**: *This issue was originally created as 
[ARROW-17133](https://issues.apache.org/jira/browse/ARROW-17133). Please see 
the [migration documentation](https://github.com/apache/arrow/issues/14542) for 
further details.*</sub>

