1316147945 commented on issue #9488:
URL: https://github.com/apache/iceberg/issues/9488#issuecomment-2434044363
COALESCE(column, 0) may be useful; setting
`spark.sql.iceberg.check-nullability`=`false` together with COALESCE(column, 0) can solve
the problem.
github-actions[bot] commented on issue #9488:
URL: https://github.com/apache/iceberg/issues/9488#issuecomment-2408755005
This issue has been automatically marked as stale because it has been open
for 180 days with no activity. It will be closed in the next 14 days if no further
activity occurs.
nastra commented on issue #9488:
URL: https://github.com/apache/iceberg/issues/9488#issuecomment-1898787154
Sorry, it wasn't clear from the description what the goal was, and I
overlooked the usage of `spark.sql.iceberg.check-nullability`.
It's difficult to tell why it doesn't work with
abharath9 commented on issue #9488:
URL: https://github.com/apache/iceberg/issues/9488#issuecomment-1896331390
@nastra Yes, I am aware of that. How do I write data from optional fields into the
mandatory fields? It is mentioned in this issue that it is possible by
setting "spark.sql.iceberg.check-nullability"
abharath9 opened a new issue, #9488:
URL: https://github.com/apache/iceberg/issues/9488
Throwing the following error when trying to insert data into an Iceberg table
with not-null column constraints.
**_Cannot write nullable values to non-null column 'id' exception_**
Here is a sa
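The reporter's sample is cut off above; an illustrative reproduction of the error, assuming hypothetical catalog/table names and an active SparkSession `spark`, might look like this:

```scala
// Create a table with a required (NOT NULL) column.
spark.sql("""
  CREATE TABLE demo.db.target (
    id   BIGINT NOT NULL,
    name STRING
  ) USING iceberg
""")

// DataFrames read from files typically have all fields nullable, so the
// append below fails with:
//   Cannot write nullable values to non-null column 'id'
val sourceDf = spark.read.parquet("/path/to/source")
sourceDf.writeTo("demo.db.target").append()
```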