WillAyd opened a new issue, #2173:
URL: https://github.com/apache/arrow-adbc/issues/2173

   ### What happened?
   
   When attempting to drop a table via `cursor.execute`, BigQuery throws:
   
   ```
   OperationalError: UNKNOWN: [BigQuery] bigquery: require storage read API to be enabled
   ```
   
   However, `bigquerystorage.googleapis.com` is enabled in my project. I even tried `cloudstorage.googleapis.com`, but the error persisted.
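
   For reference, this is how I verified the service was enabled. A sketch using the gcloud CLI (assumes an authenticated gcloud setup; the project id is the placeholder from the repro below):

   ```shell
   # List the BigQuery-related services currently enabled in the project.
   gcloud services list --enabled --project some-demo-project-1234 \
       --filter="name:bigquery"

   # Enable the BigQuery Storage Read API in case it is missing.
   gcloud services enable bigquerystorage.googleapis.com \
       --project some-demo-project-1234
   ```

   Even with `bigquerystorage.googleapis.com` listed as enabled, the driver still raises the error above.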
   
   ### Stack Trace
   
   ```
   OperationalError                          Traceback (most recent call last)
   Cell In[14], line 15
         7 tbl = pa.Table.from_pydict({"x": [0, 1, 2], "y": [3, 4, 5]})
         9 with adbc_driver_bigquery.dbapi.connect(db_kwargs) as conn, conn.cursor() as cur:
        10     # BigQuery adbc_ingest does not work, so we use traditional
        11     # prepare / bind / execute approach
        12 
        13     # DROP TABLE might also be broken - complains about
        14     # OperationalError: UNKNOWN: [BigQuery] bigquery: require storage read API to be enabled
   ---> 15     cur.execute("DROP TABLE IF EXISTS demo_dataset.foo")
   
   File ~/clones/arrow-adbc/python/adbc_driver_manager/adbc_driver_manager/dbapi.py:698, in Cursor.execute(self, operation, parameters)
       682 """
       683 Execute a query.
       684 
      (...)
       694     parameters, which will each be bound in turn).
       695 """
       696 self._prepare_execute(operation, parameters)
   --> 698 handle, self._rowcount = _blocking_call(
       699     self._stmt.execute_query, (), {}, self._stmt.cancel
       700 )
       701 self._results = _RowIterator(
       702     self._stmt, _reader.AdbcRecordBatchReader._import_from_c(handle.address)
       703 )
   
   File ~/clones/arrow-adbc/python/adbc_driver_manager/adbc_driver_manager/_lib.pyx:1569, in adbc_driver_manager._lib._blocking_call_impl()
   
   File ~/clones/arrow-adbc/python/adbc_driver_manager/adbc_driver_manager/_lib.pyx:1562, in adbc_driver_manager._lib._blocking_call_impl()
   
   File ~/clones/arrow-adbc/python/adbc_driver_manager/adbc_driver_manager/_lib.pyx:1213, in adbc_driver_manager._lib.AdbcStatement.execute_query()
   
   File ~/clones/arrow-adbc/python/adbc_driver_manager/adbc_driver_manager/_lib.pyx:260, in adbc_driver_manager._lib.check_error()
   
   OperationalError: UNKNOWN: [BigQuery] bigquery: require storage read API to be enabled
   ```
   
   ### How can we reproduce the bug?
   
   ```python
   import adbc_driver_bigquery.dbapi
   from adbc_driver_bigquery import DatabaseOptions
   import pyarrow as pa
   
   db_kwargs = {
       DatabaseOptions.PROJECT_ID.value: "some-demo-project-1234",
       DatabaseOptions.DATASET_ID.value: "demo_dataset",
       DatabaseOptions.TABLE_ID.value: "foo",
   }
   
   tbl = pa.Table.from_pydict({"x": [0, 1, 2], "y": [3, 4, 5]})
   
   with adbc_driver_bigquery.dbapi.connect(db_kwargs) as conn, conn.cursor() as cur:
       # BigQuery adbc_ingest does not work, so we use traditional
       # prepare / bind / execute approach
   
       # DROP TABLE might also be broken - complains about
       # OperationalError: UNKNOWN: [BigQuery] bigquery: require storage read API to be enabled
       cur.execute("DROP TABLE IF EXISTS demo_dataset.foo")
   ```
   
   ### Environment/Setup
   
   _No response_

