Knorreman opened a new issue, #12371:
URL: https://github.com/apache/iceberg/issues/12371

   ### Apache Iceberg version
   
   1.8.0 (latest release)
   
   ### Query engine
   
   Spark
   
   ### Please describe the bug 🐞
   
   I am trying to upgrade to Spark 3.5.4 and Iceberg 1.8.0, but I run into this problem:
   ```
   Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/catalyst/expressions/AnsiCast
           at org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions.$anonfun$apply$6(IcebergSparkSessionExtensions.scala:54)
           at org.apache.spark.sql.SparkSessionExtensions.$anonfun$buildResolutionRules$1(SparkSessionExtensions.scala:215)
           at scala.collection.StrictOptimizedIterableOps.map(StrictOptimizedIterableOps.scala:100)
           at scala.collection.StrictOptimizedIterableOps.map$(StrictOptimizedIterableOps.scala:87)
           at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:43)
           at org.apache.spark.sql.SparkSessionExtensions.buildResolutionRules(SparkSessionExtensions.scala:215)
           at org.apache.spark.sql.internal.BaseSessionStateBuilder.customResolutionRules(BaseSessionStateBuilder.scala:222)
           at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:96)
           at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:85)
           at org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$build$2(BaseSessionStateBuilder.scala:375)
           at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:92)
           at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:92)
           at org.apache.spark.sql.internal.SessionState.catalogManager(SessionState.scala:102)
           at org.apache.spark.sql.internal.CatalogImpl.currentDatabase(CatalogImpl.scala:68)
           at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.currentDB$1(SparkSQLCLIDriver.scala:288)
           at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.promptWithCurrentDB$1(SparkSQLCLIDriver.scala:295)
           at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:299)
           at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
           at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
           at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.base/java.lang.reflect.Method.invoke(Method.java:569)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1034)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.catalyst.expressions.AnsiCast
           at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
           at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
           at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:525)
           ... 30 more
   
   ```
   
   This looks similar to the error described in https://github.com/apache/iceberg/issues/8926, but that issue reached no conclusion. Likewise, this Stack Overflow question https://stackoverflow.com/questions/76748369/unable-to-install-iceberg-extensions-for-pyspark-and-use-merge-into did not resolve my problem, since I provide both the runtime and extensions jars in the example below.
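
   For what it's worth, the `ClassNotFoundException` can be reproduced outside Spark by probing the classpath for the missing class directly. This is a hypothetical diagnostic snippet (the `ClassProbe` name is mine, not part of the setup above); run against only the Spark 3.5.4 jars, it reports whether `AnsiCast` is actually present:
   ```
   // Probe whether a class can be found on the current classpath without initializing it.
   public class ClassProbe {
       static boolean isPresent(String name) {
           try {
               // initialize=false: we only care about class resolution, not static init
               Class.forName(name, false, ClassProbe.class.getClassLoader());
               return true;
           } catch (ClassNotFoundException e) {
               return false;
           }
       }

       public static void main(String[] args) {
           // Sanity check: java.lang.String is always resolvable on any JVM
           System.out.println("java.lang.String present: " + isPresent("java.lang.String"));
           // With only Spark 3.5.4 jars on the classpath, this is the class the trace says is missing
           System.out.println("AnsiCast present: " + isPresent("org.apache.spark.sql.catalyst.expressions.AnsiCast"));
       }
   }
   ```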
   
   Minimal reproducible example:
   ```
   spark-sql --packages org.apache.iceberg:iceberg-spark-runtime-3.5_2.13:1.8.0,org.apache.iceberg:iceberg-spark-extensions-3.5_2.13:1.8.0 --master local --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
   ```
   And I use this Spark version:
   ```
   spark-submit --version
   Welcome to
         ____              __
        / __/__  ___ _____/ /__
       _\ \/ _ \/ _ `/ __/  '_/
      /___/ .__/\_,_/_/ /_/\_\   version 3.5.4
         /_/
                           
   Using Scala version 2.13.8, OpenJDK 64-Bit Server VM, 17.0.14
   ```
   
   ### Willingness to contribute
   
   - [ ] I can contribute a fix for this bug independently
   - [ ] I would be willing to contribute a fix for this bug with guidance from the Iceberg community
   - [ ] I cannot contribute a fix for this bug at this time


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

