[ https://issues.apache.org/jira/browse/HADOOP-19733?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18032528#comment-18032528 ]
Brandon commented on HADOOP-19733:
----------------------------------
OK, I'll also update the custom signer loading to use the configuration, for
consistency.
> S3A: Credentials provider classes not found despite setting
> `fs.s3a.classloader.isolation` to `false`
> -----------------------------------------------------------------------------------------------------
>
> Key: HADOOP-19733
> URL: https://issues.apache.org/jira/browse/HADOOP-19733
> Project: Hadoop Common
> Issue Type: Bug
> Components: fs/s3
> Affects Versions: 3.4.2
> Reporter: Brandon
> Assignee: Brandon
> Priority: Minor
> Labels: pull-request-available
>
> HADOOP-18993 added the option `fs.s3a.classloader.isolation` to support, for
> example, a Spark job whose AWS credentials provider class is bundled into the
> Spark job JAR. When testing this, the credentials provider classes are still
> not found.
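> As a concrete illustration, a client might configure the filesystem roughly as
> below (plain Java rather than Spark; the provider class name is hypothetical):
>
>     import java.net.URI;
>     import org.apache.hadoop.conf.Configuration;
>     import org.apache.hadoop.fs.FileSystem;
>
>     public class IsolationOffExample {
>       public static void main(String[] args) throws Exception {
>         Configuration conf = new Configuration();
>         // Opt out of classloader isolation (the option added by HADOOP-18993).
>         conf.setBoolean("fs.s3a.classloader.isolation", false);
>         // Hypothetical credentials provider bundled in the job JAR.
>         conf.set("fs.s3a.aws.credentials.provider",
>             "com.example.MyCredentialsProvider");
>         // Expectation: S3A can load the provider through the job's classloader.
>         FileSystem fs = FileSystem.get(URI.create("s3a://example-bucket/"), conf);
>         System.out.println("Created " + fs.getUri());
>       }
>     }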
> I think the cause is:
> * `fs.s3a.classloader.isolation` is implemented by setting (or not setting)
> a classloader on the `Configuration`
> * However, the code paths that load the AWS credentials provider call
> `S3AUtils.getInstanceFromReflection`, which uses the classloader that loaded
> the S3AUtils class. That is likely to be the built-in application classloader,
> which cannot see classes that only exist in the Spark job JAR (see the sketch
> after this list).
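> A minimal sketch of the failing pattern (not the actual S3AUtils code;
> `className` stands for the configured provider class name):
>
>     // Class.forName(String) resolves against the classloader that defined the
>     // calling class -- usually the application classloader -- so a class that
>     // only exists in the job JAR is invisible here, no matter which classloader
>     // was set on the Configuration.
>     Class<?> clazz = Class.forName(className);
>     Object provider = clazz.getDeclaredConstructor().newInstance();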
> And the fix seems small:
> * Change `S3AUtils.getInstanceFromReflection` to load classes using the
> `Configuration`'s classloader. Conveniently, the `Configuration` is already
> available in this method, so the change is local (see the sketch below).
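> A sketch of the proposed direction, assuming `conf` and `className` are in
> scope (the exact shape of the change in S3AUtils may differ):
>
>     // Configuration.getClassByName() resolves against conf.getClassLoader(),
>     // which is the loader the isolation logic sets (or leaves as the default),
>     // so classes from the Spark job JAR become visible when isolation is off.
>     Class<?> clazz = conf.getClassByName(className);
>     // Equivalent, using the JDK API directly:
>     Class<?> same = Class.forName(className, true, conf.getClassLoader());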