2MD opened a new issue, #12131:
URL: https://github.com/apache/iceberg/issues/12131

   ### Apache Iceberg version
   
   1.7.1 (latest release)
   
   ### Query engine
   
   Spark
   
   ### Please describe the bug 🐞
   
   ```
   val icebergVersion = "1.7.1"
   val sparkVersion = "3.3.2"

   lazy val icebergDependencies = Seq(
     "org.apache.iceberg" %% "iceberg-spark-runtime-3.3" % icebergVersion,
     "org.scala-lang.modules" % "scala-collection-compat_2.12" % "2.12.0",
     "org.apache.iceberg" % "iceberg-aws-bundle" % icebergVersion,
     "org.apache.iceberg" % "iceberg-hive-metastore" % icebergVersion,
     "org.apache.hive" % "hive-metastore" % "3.1.3" % Provided,
     "org.apache.spark" %% "spark-hive" % sparkVersion % Provided,
     ("org.apache.hive" % "hive-exec" % "3.1.3").classifier("core") % Provided,
     ("org.apache.iceberg" % "iceberg-hive-metastore" % icebergVersion % IntegrationTest).classifier("tests")
   )
   ```
   I put hive-schema-3.1.0.derby.sql in my resources folder:
   ```
   project/
   └── src/
       └── it/
           └── resources/
               └── hive-schema-3.1.0.derby.sql
   ```
   My test setup looks like this:
   ```
   var metastore: TestHiveMetastore = _
   var uris: String = _
   implicit var spark: SparkSession = _

   override def beforeAll(): Unit = {
     super.beforeAll()
     metastore = new TestHiveMetastore()
     metastore.start()
     uris = metastore.hiveConf().get("hive.metastore.uris")
     spark = SparkSession.builder
       .master("local[*]")
       .config("spark.hadoop.hive.metastore.uris", uris)
       .config("spark.sql.legacy.parquet.nanosAsLong", "false")
       .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
       .config(s"spark.sql.catalog.$icebergCatalog", "org.apache.iceberg.spark.SparkCatalog")
       .config(s"spark.sql.catalog.$icebergCatalog.type", "hive")
       .config("spark.hadoop.fs.s3a.endpoint", minioContainer.getHostAddress)
       .config("spark.hadoop.fs.s3a.access.key", minioContainer.getMinioAccessValue)
       .config("spark.hadoop.fs.s3a.secret.key", minioContainer.getMinioAccessValue)
       .config("spark.hadoop.fs.s3a.path.style.access", "true")
       .config("spark.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
       .config("spark.sql.iceberg.check-ordering", "false")
       .config("write.metadata.delete-after-commit.enabled", "false")
       .config("spark.sql.iceberg.handle-timestamp-without-timezone", "true")
       .enableHiveSupport()
       .getOrCreate()
     s3.createBucket(bucket)
   }
   ```
   
   When I run my tests with `sbt "IntegrationTest/testOnly"`, they fail with:
   
   ```
   Caused by: java.lang.NullPointerException
   [error] at java.base/java.io.Reader.<init>(Reader.java:167)
   [error] at java.base/java.io.InputStreamReader.<init>(InputStreamReader.java:72)
   [error] at org.apache.iceberg.hive.TestHiveMetastore.setupMetastoreDB(TestHiveMetastore.java:281)
   [error] at org.apache.iceberg.hive.TestHiveMetastore.<init>(TestHiveMetastore.java:106)
   [error] at ru.samokat.datalake.loader.BaseLoadSuite.beforeAll(BaseLoadSuite.scala:49)
   [error] at ru.samokat.datalake.loader.BaseLoadSuite.beforeAll$(BaseLoadSuite.scala:46)
   ```
   `TestHiveMetastore` cannot find hive-schema-3.1.0.derby.sql because `setupMetastoreDB` loads the file through the system class loader, which does not see resources that sbt serves through its in-process test class loader.
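
   To illustrate the failure mode (a self-contained sketch, not Iceberg code): a resource that only a child class loader can see is invisible to `ClassLoader.getSystemResourceAsStream`. This mirrors how sbt's layered test class loader hides `src/it/resources` from a system-class-loader lookup. The resource name below just reuses the schema file's name for illustration:

   ```scala
   import java.net.URLClassLoader
   import java.nio.charset.StandardCharsets
   import java.nio.file.Files

   object ClassLoaderDemo {
     // Hypothetical resource name, mirroring the schema file TestHiveMetastore reads.
     val resourceName = "hive-schema-3.1.0.derby.sql"

     // Returns (visible to the system loader, visible to the child loader).
     def check(): (Boolean, Boolean) = {
       // Put the resource in a temp directory that is NOT on the JVM's classpath.
       val dir = Files.createTempDirectory("res")
       Files.write(dir.resolve(resourceName), "-- derby schema".getBytes(StandardCharsets.UTF_8))

       // A child loader that can see the temp directory, playing the role of
       // sbt's layered test loader that can see src/it/resources.
       val child = new URLClassLoader(Array(dir.toUri.toURL), getClass.getClassLoader)

       val viaSystem = ClassLoader.getSystemResourceAsStream(resourceName) != null
       val viaChild  = child.getResourceAsStream(resourceName) != null
       (viaSystem, viaChild)
     }

     def main(args: Array[String]): Unit = {
       val (sys, child) = check()
       println(s"system loader sees it: $sys")  // false: the same lookup that NPEs above
       println(s"child loader sees it:  $child") // true
     }
   }
   ```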
   
   ### Willingness to contribute
   
   - [x] I can contribute a fix for this bug independently
   - [ ] I would be willing to contribute a fix for this bug with guidance from the Iceberg community
   - [ ] I cannot contribute a fix for this bug at this time
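
   A possible build-level workaround while the lookup itself is unchanged (a sketch, under the assumption that the `IntegrationTest` configuration from the build above is in use): forking the test JVM makes sbt launch tests with the full test classpath on the application class loader, so the system-class-loader lookup can find the schema file.

   ```scala
   // build.sbt -- workaround sketch, not a fix for the underlying lookup
   IntegrationTest / fork := true
   ```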


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

