djouallah opened a new issue, #2346:
URL: https://github.com/apache/iceberg-python/issues/2346
### Feature Request / Improvement
Using the latest dev release, I can now write to OneLake, and that's
awesome, but I have a niche feature request :). I use SQLite as a metastore,
and it is located on a blobfuse path; the db file gets flushed to the
persistent storage only when the connection to SQLite is closed. Is there a
way to have something like `catalog.close()` to explicitly close the
connection? Sorry if the request does not make sense :)
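For context, the flush-on-close behavior this request hinges on can be reproduced with plain stdlib `sqlite3` (a minimal sketch; the temp-file path and table contents below are stand-ins for the blobfuse mount, not the real metastore schema):

```python
import os
import sqlite3
import tempfile

# Stand-in for the blobfuse-mounted path from the report; on blobfuse, the
# file is only guaranteed to reach remote storage once the connection closes.
db_path = os.path.join(tempfile.mkdtemp(), "pyiceberg.db")

conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE iceberg_tables (table_name TEXT)")
conn.execute("INSERT INTO iceberg_tables VALUES ('first.new')")
conn.commit()
conn.close()  # the point at which blobfuse would flush the file to OneLake

# Reopen to confirm the row survived the close.
rows = sqlite3.connect(db_path).execute(
    "SELECT table_name FROM iceberg_tables").fetchall()
print(rows)  # [('first.new',)]
```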
```python
from pyiceberg.catalog.sql import SqlCatalog
import sqlite3
import duckdb
import os

ws = "iceberg"
lh = "data"
schema = "first"
tbl = "new"
warehouse_path = f"abfss://{ws}@onelake.dfs.fabric.microsoft.com/{lh}.Lakehouse/Tables"

catalog = SqlCatalog(
    "default",
    **{
        "uri": "sqlite://///lakehouse/default/Files/pyiceberg.db",
        "adls.account-name": "onelake",
        "adls.account-host": "onelake.blob.fabric.microsoft.com",
        "adls.token": os.environ.get("AZURE_STORAGE_TOKEN"),
        "warehouse": warehouse_path,
    },
)

catalog.create_namespace_if_not_exists(schema)
catalog.list_tables(schema)

# Write data
df = duckdb.sql("""
    SELECT cast(unnest(generate_series(cast('2018-04-01' as date),
                                       cast('2024-12-31' as date),
                                       interval 1 day)) as date) as date,
           EXTRACT(year from date) as year,
           EXTRACT(month from date) as month
""").arrow()

table = catalog.create_table_if_not_exists(f"{schema}.{tbl}", schema=df.schema)
table.overwrite(df)

# Close sqlite to make sure the db is stored in OneLake
sqlite3.connect("/lakehouse/default/Files/pyiceberg.db").close()
```
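A minimal sketch of what the requested API shape could look like, using a toy stdlib-only class (all names here — `ToySqlCatalog`, `register_table` — are hypothetical illustrations, not pyiceberg's API):

```python
import os
import sqlite3
import tempfile


class ToySqlCatalog:
    """Toy stand-in for a SQL-backed catalog with the requested
    close()/context-manager support. Not pyiceberg code."""

    def __init__(self, db_path: str):
        self._conn = sqlite3.connect(db_path)
        self._conn.execute(
            "CREATE TABLE IF NOT EXISTS iceberg_tables (table_name TEXT)")

    def register_table(self, name: str) -> None:
        self._conn.execute("INSERT INTO iceberg_tables VALUES (?)", (name,))
        self._conn.commit()

    def close(self) -> None:
        # Closing the connection is what forces blobfuse to flush the file.
        self._conn.close()

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.close()


db = os.path.join(tempfile.mkdtemp(), "pyiceberg.db")
with ToySqlCatalog(db) as catalog:
    catalog.register_table("first.new")
# The connection is closed here, so the db file is fully persisted.
```

With a `close()` like this, the trailing `sqlite3.connect(...).close()` workaround in the snippet above would become unnecessary.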
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]