[
https://issues.apache.org/jira/browse/HADOOP-12709?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Mingliang Liu updated HADOOP-12709:
-----------------------------------
Attachment: HADOOP-12709.000.patch
The v0 patch removes s3:// support from the {{trunk}} branch.
Specifically:
* {{S3Credentials}} and {{S3Exception}} are used by both s3 and s3n, so they
were moved to the s3n package instead of being deleted. The unit test for
{{S3Credentials}} was renamed accordingly.
* Updated
{{hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/index.md}}.
Did I miss anything?
> Deprecate s3:// in branch-2; cut from trunk
> -------------------------------------------
>
> Key: HADOOP-12709
> URL: https://issues.apache.org/jira/browse/HADOOP-12709
> Project: Hadoop Common
> Issue Type: Improvement
> Components: fs/s3
> Affects Versions: 2.8.0
> Reporter: Steve Loughran
> Assignee: Mingliang Liu
> Attachments: HADOOP-12709.000.patch
>
>
> The fact that s3:// was broken in Hadoop 2.7 *and nobody noticed until now*
> shows that it's not being used. While invaluable at the time, s3n and
> especially s3a render it obsolete except for reading existing data.
> I propose
> # Mark Java source as {{@deprecated}}
> # Warn the first time in a JVM that an S3 instance is created: "deprecated
> - will be removed in future releases"
> # In Hadoop trunk, cut it entirely. Maybe have an attic project (external?)
> which holds it for anyone who still wants it. Or: retain the code but remove
> the {{fs.s3.impl}} config option, so you have to explicitly add it for use.
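For step 2 of the proposal above (warn once per JVM), a minimal sketch of a warn-once guard using {{AtomicBoolean}}; the class and method names here are hypothetical, not taken from the patch:

```java
import java.util.concurrent.atomic.AtomicBoolean;

/**
 * Sketch of a once-per-JVM deprecation warning for the s3:// filesystem.
 * Class and method names are illustrative only.
 */
class S3DeprecationWarning {
  // Flipped to true the first time the warning is emitted in this JVM.
  private static final AtomicBoolean WARNED = new AtomicBoolean(false);

  /**
   * Emits the deprecation warning on first call; subsequent calls are no-ops.
   * @return true if this call actually printed the warning
   */
  static boolean warnOnce() {
    // compareAndSet is atomic, so only one thread ever wins the race
    // and prints the message, even under concurrent FS instantiation.
    if (WARNED.compareAndSet(false, true)) {
      System.err.println("s3:// is deprecated and will be removed in "
          + "future releases; consider s3a:// instead.");
      return true;
    }
    return false;
  }
}
```

The {{compareAndSet}} guard keeps the warning thread-safe without locking, which matters if several filesystem instances are created concurrently.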
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)