github-actions[bot] commented on issue #8419:
URL: https://github.com/apache/iceberg/issues/8419#issuecomment-2350730715
This issue has been automatically marked as stale because it has been open
for 180 days with no activity. It will be closed in the next 14 days if no
further activity occurs.
ExplorData24 commented on issue #8419:
URL: https://github.com/apache/iceberg/issues/8419#issuecomment-1847856969
@palanik1
@di2mot
@maulanaady
@RussellSpitzer
@dacort
Hello.
I am using the Hive Catalog to create Iceberg tables, with Spark as the
execution engine:
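The configuration that followed is not included above, so here is only a hedged sketch of what such a Hive-catalog setup typically looks like in PySpark; the catalog name `hive_cat`, the metastore URI, and the runtime coordinates are illustrative assumptions, not values from the original comment.
```
# Minimal sketch: Spark + Iceberg with a Hive catalog. All names, URIs and
# versions below are assumptions chosen for illustration.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-hive-catalog-example")
    .config("spark.jars.packages",
            "org.apache.iceberg:iceberg-spark-runtime-3.3_2.12:1.3.1")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    # 'hive_cat' is a hypothetical catalog name; adjust the metastore URI.
    .config("spark.sql.catalog.hive_cat", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.hive_cat.type", "hive")
    .config("spark.sql.catalog.hive_cat.uri", "thrift://localhost:9083")
    .getOrCreate()
)

spark.sql("CREATE NAMESPACE IF NOT EXISTS hive_cat.db")
spark.sql(
    "CREATE TABLE IF NOT EXISTS hive_cat.db.events (id bigint, data string) USING iceberg"
)
```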
di2mot commented on issue #8419:
URL: https://github.com/apache/iceberg/issues/8419#issuecomment-1755796639
This works for me in general:
```
("spark.jars.packages", "org.apache.iceberg:iceberg-spark3:0.11.0"),
("spark.sql.extensions",
"org.apache.iceberg.spark.extensions.I
RussellSpitzer commented on issue #8419:
URL: https://github.com/apache/iceberg/issues/8419#issuecomment-1755633933
PySpark, I think, has some issues with setting "packages" in the Spark conf,
since the py4j execution means that the SparkContext has to be started a bit
weirdly. I would try us...
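The comment above is cut off, so the actual suggestion is not shown; one common workaround for the packages-on-an-existing-context problem (my assumption, not necessarily what was being recommended here) is to supply the packages before the JVM starts, e.g. via `spark-submit --packages` or `PYSPARK_SUBMIT_ARGS`:
```
# Hedged sketch: make sure spark.jars.packages is known before the JVM launches.
# Setting it on an already-running SparkContext has no effect.
import os

os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--packages org.apache.iceberg:iceberg-spark-runtime-3.3_2.12:1.3.1 pyspark-shell"
)

from pyspark.sql import SparkSession  # import after the env var is set

spark = SparkSession.builder.getOrCreate()
```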
di2mot commented on issue #8419:
URL: https://github.com/apache/iceberg/issues/8419#issuecomment-1752621192
Are you sure it's this way?
```
sparkConf = (SparkConf()
    .set("spark.jars.packages",
         "org.apache.iceberg:iceberg-spark-runtime-3.3_2.12:1.3.1,software.amazon.awssdk:bundle:2.20.18,sof
```