Hi, this is a supported use case. Please make sure you configure the KMS proxy user correctly as well (it is configured separately from the HDFS proxy user settings): https://hadoop.apache.org/docs/current/hadoop-kms/index.html#KMS_Proxyuser_Configuration
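For reference, the KMS proxyuser settings described in the linked doc go in kms-site.xml. A minimal sketch, assuming the impersonating application runs as a user named `service-user` and the impersonated users belong to a group named `allowed-group` (both placeholders; substitute your actual names):

```xml
<!-- kms-site.xml: allow "service-user" to impersonate other users when
     calling the KMS. Names below are placeholders for illustration. -->
<property>
  <name>hadoop.kms.proxyuser.service-user.users</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.kms.proxyuser.service-user.groups</name>
  <value>allowed-group</value>
</property>
<property>
  <name>hadoop.kms.proxyuser.service-user.hosts</name>
  <value>*</value>
</property>
```

Without these, the KMS rejects the `doAs=` request with 403 Forbidden even though the equivalent HDFS proxyuser settings in core-site.xml are in place, which matches the error in your log.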
On Thu, Aug 2, 2018 at 12:30 PM Ashish Tadose <[email protected]> wrote:

> Hi,
>
> Does HDFS user impersonation work on an HDFS encrypted zone backed by
> Ranger KMS?
>
> Our Hadoop environment is configured with Kerberos and also supports
> creating an encrypted zone in HDFS via Ranger KMS.
>
> A specific application id has HDFS user impersonation access to
> impersonate users of a certain group, which works flawlessly on normal
> HDFS folders; however, the same is not working on encrypted zones.
>
> PFB - Masked log extract
>
> WARN kms.LoadBalancingKMSClientProvider: KMS provider at [<host>/kms/v1/]
> threw an IOException!! java.io.IOException:
> org.apache.hadoop.security.authentication.client.AuthenticationException:
> Authentication failed, URL:
> <host>/kms/v1/keyversion/<group>%400/_eek?eek_op=decrypt&doAs=<user1>&user.name=<service-user>,
> status: 403, message: Forbidden
> at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:551)
> at org.apache.hadoop.crypto.key.kms.KMSClientProvider.decryptEncryptedKey(KMSClientProvider.java:831)
> at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$5.call(LoadBalancingKMSClientProvider.java:207)
> at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$5.call(LoadBalancingKMSClientProvider.java:203)
> at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.doOp(LoadBalancingKMSClientProvider.java:95)
> at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.decryptEncryptedKey(LoadBalancingKMSClientProvider.java:203)
> at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.decryptEncryptedKey(KeyProviderCryptoExtension.java:388)
> at org.apache.hadoop.hdfs.DFSClient.decryptEncryptedDataEncryptionKey(DFSClient.java:1393)
> at org.apache.hadoop.hdfs.DFSClient.createWrappedInputStream(DFSClient.java:1463)
> at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:333)
> at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:327)
> at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:340)
> at com.wandisco.fs.client.ReplicatedFC.open(ReplicatedFC.java:752)
> at com.wandisco.fs.client.ReplicatedFC.xlateAndOpen(ReplicatedFC.java:377)
> at com.wandisco.fs.client.FusionHdfs.open(FusionHdfs.java:452)
> at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:786)
> at <masked-package>.EncryptFsTest.readFile(EncryptFsTest.java:118)
> at <masked-package>.EncryptFsTest$1.run(EncryptFsTest.java:71)
> at <masked-package>.kerberos.EncryptFsTest$1.run(EncryptFsTest.java:69)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
>
> Thanks in advance.
>
> Regards,
> Ashish
>
> --
> A very happy Hadoop contributor
