> On Feb. 10, 2017, 11:37 p.m., Greg Senia wrote:
> > Ship It!
>
> Greg Senia wrote:
> While using doAs this does not work:
>
> ...gint), stock_price_adj_close (type: float)"}}}}]}},"DagId:":"hive_20170210183719_81144261-1a2b-4159-9e62-dc7bad4ebfc7:1","DagName:":""}},"Stage-2":{"Dependency Collection":{}},"Stage-0":{"Move Operator":{"files:":{"hdfs directory:":"true","destination:":"hdfs://tech/apps/hive/warehouse/gss_test_gss_test"}}}}}, endTime=Fri Feb 10 18:37:40 EST 2017}}]] after 3 retries. Quitting
> org.apache.kafka.common.KafkaException: Failed to construct kafka producer
>     at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:335)
>     at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:188)
>     at org.apache.atlas.kafka.KafkaNotification.createProducer(KafkaNotification.java:311)
>     at org.apache.atlas.kafka.KafkaNotification.sendInternal(KafkaNotification.java:220)
>     at org.apache.atlas.notification.AbstractNotification.send(AbstractNotification.java:84)
>     at org.apache.atlas.hook.AtlasHook.notifyEntitiesInternal(AtlasHook.java:134)
>     at org.apache.atlas.hook.AtlasHook.notifyEntities(AtlasHook.java:119)
>     at org.apache.atlas.hook.AtlasHook.notifyEntities(AtlasHook.java:172)
>     at org.apache.atlas.hive.hook.HiveHook.access$300(HiveHook.java:85)
>     at org.apache.atlas.hive.hook.HiveHook$3.run(HiveHook.java:224)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1740)
>     at org.apache.atlas.hive.hook.HiveHook.notifyAsPrivilegedAction(HiveHook.java:233)
>     at org.apache.atlas.hive.hook.HiveHook$2.run(HiveHook.java:206)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
>     at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:86)
>     at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:71)
>     at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:83)
>     at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:277)
>     ... 19 more
> Caused by: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
>     at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:940)
>     at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760)
>     at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
>     at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
>     at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
>     at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
>     at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
>     at org.apache.kafka.common.security.authenticator.AbstractLogin.login(AbstractLogin.java:69)
>     at org.apache.kafka.common.security.kerberos.KerberosLogin.login(KerberosLogin.java:110)
>     at org.apache.kafka.common.security.authenticator.LoginManager.<init>(LoginManager.java:46)
>     at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:68)
>     at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:78)
Here are the changes to make it work correctly: a UGI context can have both a
ticket and a keytab. Check for the keytab first; only if there is no keytab,
fall back to the ticket cache if it exists.
diff --git a/notification/src/main/java/org/apache/atlas/hook/AtlasHook.java b/notification/src/main/java/org/apache/atlas/hook/AtlasHook.java
index a83db79..9ac88f4 100644
--- a/notification/src/main/java/org/apache/atlas/hook/AtlasHook.java
+++ b/notification/src/main/java/org/apache/atlas/hook/AtlasHook.java
@@ -26,6 +26,7 @@
 import org.apache.atlas.notification.NotificationInterface;
 import org.apache.atlas.notification.NotificationModule;
 import org.apache.atlas.notification.hook.HookNotification;
+import org.apache.atlas.security.InMemoryJAASConfiguration;
 import org.apache.atlas.typesystem.Referenceable;
 import org.apache.atlas.typesystem.json.InstanceSerialization;
 import org.apache.commons.configuration.Configuration;
@@ -76,6 +77,12 @@
         if (logFailedMessages) {
             failedMessagesLogger = new FailedMessagesLogger(failedMessageFile);
             failedMessagesLogger.init();
+        }
+
+        if (!isLoginKeytabBased()) {
+            if (isLoginTicketBased()) {
+                InMemoryJAASConfiguration.setConfigSectionRedirect("KafkaClient", "ticketBased-KafkaClient");
+            }
         }
         notificationRetryInterval = atlasProperties.getInt(ATLAS_NOTIFICATION_RETRY_INTERVAL, 1000);
@@ -210,4 +217,27 @@
         }
     }
+
+    private static boolean isLoginTicketBased() {
+        boolean ret = false;
+
+        try {
+            ret = UserGroupInformation.isLoginTicketBased();
+        } catch (Exception excp) {
+            LOG.error("error in determining whether to use ticket-cache or keytab for KafkaClient JAAS configuration", excp);
+        }
+
+        return ret;
+    }
+
+    private static boolean isLoginKeytabBased() {
+        boolean ret = false;
+
+        try {
+            ret = UserGroupInformation.isLoginKeytabBased();
+        } catch (Exception excp) {
+            LOG.error("error in determining whether to use ticket-cache or keytab for KafkaClient JAAS configuration", excp);
+        }
+
+        return ret;
+    }
 }
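The keytab-first selection in the patch above can be sketched as a small pure function (a hypothetical standalone illustration; the real code obtains the two booleans from Hadoop's UserGroupInformation and applies the redirect via InMemoryJAASConfiguration):

```java
// Sketch of the JAAS-section choice implied by the patch: prefer the
// keytab-backed "KafkaClient" section; redirect to the ticket-cache-backed
// "ticketBased-KafkaClient" section only when the login is ticket-based
// and no keytab login is in effect.
public class JaasSectionChooser {
    static String chooseSection(boolean keytabLogin, boolean ticketLogin) {
        if (!keytabLogin && ticketLogin) {
            return "ticketBased-KafkaClient"; // hook runs as a kinit'ed user (e.g. HiveCLI)
        }
        return "KafkaClient"; // keytab available, or no Kerberos login at all
    }

    public static void main(String[] args) {
        System.out.println(chooseSection(true, true));   // keytab wins even with a ticket
        System.out.println(chooseSection(false, true));  // ticket cache fallback
        System.out.println(chooseSection(false, false)); // default section
    }
}
```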
- Greg
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/56543/#review165205
-----------------------------------------------------------
On Feb. 10, 2017, 2:26 p.m., Nixon Rodrigues wrote:
>
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/56543/
> -----------------------------------------------------------
>
> (Updated Feb. 10, 2017, 2:26 p.m.)
>
>
> Review request for atlas, keval bhatt, Madhan Neethiraj, Suma Shivaprasad,
> and Vimal Sharma.
>
>
> Bugs: ATLAS-1546
> https://issues.apache.org/jira/browse/ATLAS-1546
>
>
> Repository: atlas
>
>
> Description
> -------
>
> In a kerberized environment, the Atlas hook uses a JAAS configuration section named
> "KafkaClient" to authenticate with the Kafka broker. In a typical Hive deployment,
> this configuration section is set to use the keytab and principal of the
> HiveServer2 process. The hook running in the Hive CLI may fail to authenticate
> with Kafka if the user cannot read the configured keytab.
> Given that Hive CLI users will have performed kinit, the hook in the Hive CLI
> should use the ticket cache generated by kinit. When a ticket cache is not
> available (for example, in HiveServer2), the hook should use the configuration
> provided in the KafkaClient JAAS section.
>
>
> Atlas JAAS properties:
>
> atlas.jaas.ticketBased-KafkaClient.loginModuleControlFlag=required
> atlas.jaas.ticketBased-KafkaClient.loginModuleName=com.sun.security.auth.module.Krb5LoginModule
> atlas.jaas.ticketBased-KafkaClient.option.useTicketCache=true
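>
> When loaded by InMemoryJAASConfiguration, these properties should be roughly equivalent to the following JAAS application section (a sketch for illustration; the exact in-memory representation may differ):
>
>     ticketBased-KafkaClient {
>         com.sun.security.auth.module.Krb5LoginModule required
>         useTicketCache=true;
>     };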
>
>
> Diffs
> -----
>
>
> common/src/main/java/org/apache/atlas/security/InMemoryJAASConfiguration.java
> ff80eca
>
> common/src/test/java/org/apache/atlas/security/InMemoryJAASConfigurationTicketBasedKafkaClientTest.java
> PRE-CREATION
> common/src/test/resources/atlas-jaas.properties 90a5682
> notification/src/main/java/org/apache/atlas/hook/AtlasHook.java 0534910
>
> Diff: https://reviews.apache.org/r/56543/diff/
>
>
> Testing
> -------
>
> Maven build completed without issues; executed mvn clean install and all the
> test cases pass except a few.
> Added a new unit test case for TicketBasedKafkaClient.
>
> Deployed the new jars (atlas-common and atlas-notification)
> in /usr/hdp/current/atlas-client/hook/hive/atlas-hive-plugin-impl/ and
> tested the Hive hook in both secure and simple-auth environments with the
> HiveCLI and Beeline clients. Entities are created in Atlas for tables
> created in Hive.
>
>
> Thanks,
>
> Nixon Rodrigues
>
>