eric-maynard commented on code in PR #1759:
URL: https://github.com/apache/polaris/pull/1759#discussion_r2116172756
##########
regtests/t_pyspark/src/test_spark_sql_s3_with_privileges.py:
##########
@@ -1203,23 +1203,26 @@ def test_spark_credentials_s3_exception_on_metadata_file_deletion(root_client, s
assert metadata_contents['ContentLength'] > 0
# Delete metadata files
+ objects_to_delete = [{'Key': obj['Key']} for obj in objects['Contents']]
s3.delete_objects(Bucket=test_bucket,
- Delete={'Objects': objects})
+ Delete={'Objects': objects_to_delete})
try:
response = snowman_catalog_client.load_table(snowflake_catalog.name,
unquote('db1%1Fschema'),
"iceberg_table",
"vended-credentials")
except Exception as e:
- assert '404' in str(e)
+ # A 400 (Bad Request) error is thrown when the metadata file is missing
+ assert '400' in str(e)
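For context on the `objects_to_delete` change above: `list_objects_v2` returns `Contents` entries carrying extra fields (`LastModified`, `ETag`, `Size`, ...), while `delete_objects` accepts only `Key` (and optionally `VersionId`) in each `Delete.Objects` entry, so the listing must be stripped down first. A minimal sketch with illustrative listing data (the keys and values here are hypothetical, not from the test):

```python
# Example listing entries shaped like the 'Contents' of a list_objects_v2
# response (values are illustrative placeholders).
contents = [
    {"Key": "metadata/v1.metadata.json", "Size": 1024, "ETag": '"abc"'},
    {"Key": "metadata/v2.metadata.json", "Size": 2048, "ETag": '"def"'},
]

# delete_objects rejects extra fields: each entry may contain only 'Key'
# (and optionally 'VersionId'), so keep just the key from each listing.
objects_to_delete = [{"Key": obj["Key"]} for obj in contents]

# The reshaped payload is what the test passes to:
# s3.delete_objects(Bucket=test_bucket, Delete={"Objects": objects_to_delete})
print(objects_to_delete)
```

Passing the raw `Contents` entries directly would make boto3 raise a parameter validation error, which is why the original line failed.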
Review Comment:
I think we should just make the test pass; we can always change the response
code later and update the test accordingly.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]