[ https://issues.apache.org/jira/browse/HADOOP-19793?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18059973#comment-18059973 ]

ASF GitHub Bot commented on HADOOP-19793:
-----------------------------------------

ajfabbri commented on PR #8225:
URL: https://github.com/apache/hadoop/pull/8225#issuecomment-3937276224

   > a trivial checkstyle to fix

   I already fixed it, but CI has been stuck for over 24 hours:
   https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8225/8/




> S3A: Regression: maximum size of a single upload is now only 2GB
> ----------------------------------------------------------------
>
>                 Key: HADOOP-19793
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19793
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs/s3
>    Affects Versions: 3.5.0, 3.4.1, 3.4.2
>            Reporter: Steve Loughran
>            Assignee: Aaron Fabbri
>            Priority: Minor
>              Labels: pull-request-available
>
> In HADOOP-19221 the maximum size of a single block was narrowed to an int, 
> even when the source is a file more than 2 GB long. As a result, such files 
> can no longer be uploaded as a single block. This matters when working with 
> stores like GCS which don't support multipart uploads.
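
For context, here is a minimal, self-contained Java sketch (illustrative only, not the actual Hadoop code; the class name and file size are made up) of why narrowing a byte count to int caps a single-block upload at Integer.MAX_VALUE (~2 GiB), and why carrying the size as a long avoids the truncation:

    // Demonstrates the int-vs-long size truncation behind this regression.
    public class IntBlockSizeCap {
        public static void main(String[] args) {
            long fileLength = 3L * 1024 * 1024 * 1024; // a 3 GiB source file

            // Narrowing a long to int keeps only the low 32 bits:
            int narrowed = (int) fileLength;  // -1073741824, i.e. garbage

            // Even a careful clamp can never exceed ~2 GiB:
            int clamped = (int) Math.min(fileLength, Integer.MAX_VALUE);

            System.out.println("file length        = " + fileLength); // 3221225472
            System.out.println("narrowed to int    = " + narrowed);   // -1073741824
            System.out.println("clamped to int     = " + clamped);    // 2147483647

            // Keeping the size as a long preserves values > 2 GiB, so a
            // single-block upload (needed for stores like GCS that lack
            // multipart support) can cover the whole file.
            long blockSize = fileLength;
            System.out.println("block size as long = " + blockSize);  // 3221225472
        }
    }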



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

