singhpk234 commented on code in PR #1759:
URL: https://github.com/apache/polaris/pull/1759#discussion_r2115239846


##########
regtests/t_pyspark/src/test_spark_sql_s3_with_privileges.py:
##########
@@ -1203,23 +1203,26 @@ def test_spark_credentials_s3_exception_on_metadata_file_deletion(root_client, s
     assert metadata_contents['ContentLength'] > 0
 
     # Delete metadata files
+    objects_to_delete = [{'Key': obj['Key']} for obj in objects['Contents']]
     s3.delete_objects(Bucket=test_bucket,
-                      Delete={'Objects': objects})
+                      Delete={'Objects': objects_to_delete})
 
     try:
        response = snowman_catalog_client.load_table(snowflake_catalog.name, unquote('db1%1Fschema'),
                                                      "iceberg_table",
                                                      "vended-credentials")
     except Exception as e:
-        assert '404' in str(e)
+        # 400 error(BadRequest) is thrown when metadata file is missing
+        assert '400' in str(e)

Review Comment:
   I see — 400 sounds more reasonable than 404 here, considering the IRC spec says:
   
   400 : 
   
   Indicates a bad request error. It could be caused by an unexpected request body format or other forms of request validation failure, such as _**invalid json**_. Usually serves application/json content, although in some cases simple text/plain content might be returned by the server's middleware.
   
   404 : 
   Not Found - NoSuchTableException, table to load does not exist
   
   Though in this case the table does exist but its metadata path has been deleted, which to the best of my understanding qualifies as a 400.
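   On the diff itself: the `objects_to_delete` projection is needed because `list_objects_v2` returns `Contents` entries carrying extra fields (`Size`, `ETag`, `LastModified`, ...), while `delete_objects` accepts only `Key` (and optionally `VersionId`) per object, so passing the raw entries fails boto3's parameter validation. A minimal sketch of the projection, with hypothetical object keys standing in for the test's real metadata files:

   ```python
   # Hypothetical 'Contents' entries as returned by list_objects_v2
   # (fields trimmed; real entries also carry LastModified, StorageClass, ...).
   contents = [
       {'Key': 'metadata/00000-abc.metadata.json', 'Size': 1024, 'ETag': '"d41d8"'},
       {'Key': 'metadata/snap-123.avro', 'Size': 2048, 'ETag': '"98f13"'},
   ]

   # Project each entry down to the shape delete_objects expects:
   objects_to_delete = [{'Key': obj['Key']} for obj in contents]

   # The actual call in the test would then be:
   # s3.delete_objects(Bucket=test_bucket, Delete={'Objects': objects_to_delete})
   print(objects_to_delete)
   ```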
   
   
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
