vimalKeshu opened a new issue, #9326:
URL: https://github.com/apache/iceberg/issues/9326

   ### Apache Iceberg version
   
   1.4.2 (latest release)
   
   ### Query engine
   
   Spark
   
   ### Please describe the bug 🐞
   
   I am trying Spark with Iceberg's REST catalog in local mode.
   Below is my Spark code:
   ```
   import org.apache.spark.sql.SparkSession

   val catalogName = "datalake"
   val databaseName = "person"
   val tableName = "person_info_7"
   val restUrl = "http://127.0.0.1:8181"
   val s3Endpoint = "http://127.0.0.1:9000"
   val s3AccessKey = "..." // S3/MinIO access key (redacted)
   val s3SecretKey = "..." // S3/MinIO secret key (redacted)

   val spark = SparkSession
     .builder()
     .appName(this.getClass.getName)
     .master("local[*]")
     .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
     .config(s"spark.sql.catalog.$catalogName", "org.apache.iceberg.spark.SparkCatalog")
     .config(s"spark.sql.catalog.$catalogName.type", "rest")
     .config(s"spark.sql.catalog.$catalogName.uri", restUrl)
     .config("spark.sql.parquet.outputTimestampType", "TIMESTAMP_MICROS")
     .config(s"spark.sql.catalog.$catalogName.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
     .config(s"spark.sql.catalog.$catalogName.hadoop.fs.s3a.path.style.access", "true")
     .config(s"spark.sql.catalog.$catalogName.hadoop.fs.s3a.aws.credentials.provider", "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider")
     .config(s"spark.sql.catalog.$catalogName.hadoop.fs.s3a.access.key", s3AccessKey)
     .config(s"spark.sql.catalog.$catalogName.hadoop.fs.s3a.secret.key", s3SecretKey)
     .config(s"spark.sql.catalog.$catalogName.hadoop.fs.s3a.endpoint", s3Endpoint)
     .getOrCreate()

   spark.sql("SHOW CATALOGS").show()
   spark.sql("SHOW CURRENT NAMESPACE").show()
   ```
   
   The output is:
   ```
   +-------------+
   |      catalog|
   +-------------+
   |spark_catalog|
   +-------------+
   
   +-------------+---------+
   |      catalog|namespace|
   +-------------+---------+
   |spark_catalog|  default|
   +-------------+---------+
   ```
   
   Here are the version and library details:
   ```
   scalaVersion := "2.12.17"
   
   val sparkVersion = "3.4.0"
   val hadoopAwsVersion = "3.1.2"
   val awsJavaSdkBundleVersion = "1.11.199"
   
   "org.apache.iceberg" % "iceberg-spark-runtime-3.4_2.12" % "1.4.2"
   ```
   
   As I understand Spark with Iceberg, since I registered a standalone `SparkCatalog` under the name `datalake` (and did not use the `SparkSessionCatalog` config below, which would wrap the built-in `spark_catalog`), `SHOW CATALOGS` should report `datalake` instead of `spark_catalog`. Somehow, that is not happening.
   ```
   .config(s"spark.sql.catalog.${catalogName}", "org.apache.iceberg.spark.SparkSessionCatalog")
   ```
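   For reference, my understanding is that the session's starting catalog can also be set explicitly. This is only a sketch of what I would expect, assuming `spark.sql.defaultCatalog` is the relevant setting here:
   ```
   // Config fragment (not a tested fix): make the named Iceberg catalog
   // the session's default, so commands run against "datalake" rather
   // than the built-in "spark_catalog".
   .config("spark.sql.defaultCatalog", catalogName)
   ```
   With that in place I would expect `SHOW CURRENT NAMESPACE` to report `datalake` as the current catalog.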
   
   The expected output would be:
   ```
   +-------------+
   |      catalog|
   +-------------+
   |datalake     |
   +-------------+
   
   +-------------+---------+
   |      catalog|namespace|
   +-------------+---------+
   |datalake     |  default|
   +-------------+---------+
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@iceberg.apache.org.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

