[
https://issues.apache.org/jira/browse/HADOOP-12914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15199518#comment-15199518
]
Kihwal Lee commented on HADOOP-12914:
-------------------------------------
HDFS-8068 is an HDFS-specific solution that makes creation of the namenode RPC proxy
fail when an unresolved address is used.
HADOOP-12125 aims to be a more generic solution.
> RPC client should deal with the IP address change
> -------------------------------------------------
>
> Key: HADOOP-12914
> URL: https://issues.apache.org/jira/browse/HADOOP-12914
> Project: Hadoop Common
> Issue Type: Bug
> Components: ipc
> Affects Versions: 2.7.2
> Environment: CentOS 7
> Reporter: Michiel Vanderlee
>
> I'm seeing HADOOP-7472 again for the datanode in v2.7.2.
> If I start the datanode before the DNS entry for the namenode resolves, it never
> retries the resolution and keeps failing with an UnknownHostException.
> A restart of the datanode fixes this.
> TRACE ipc.ProtobufRpcEngine: 31: Exception <-
> nn1.hdfs-namenode-rpc.service.consul:8020: versionRequest
> {java.net.UnknownHostException: Invalid host name: local host is: (unknown);
> destination host is: "nn1.hdfs-namenode-rpc.service.consul":8020;
> java.net.UnknownHostException; For more details see:
> http://wiki.apache.org/hadoop/UnknownHost}
> The error comes from org.apache.hadoop.ipc.Client$Connection (Client.java, line 409):
> public Connection(ConnectionId remoteId, int serviceClass) throws IOException {
>   this.remoteId = remoteId;
>   this.server = remoteId.getAddress();
>   if (server.isUnresolved()) {
>     throw NetUtils.wrapException(server.getHostName(),
>         server.getPort(),
>         null,
>         0,
>         new UnknownHostException());
>   }
> The remoteId.address (InetSocketAddress) seems to resolve only on creation,
> never again unless done manually.
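The caching behavior described above follows from how java.net.InetSocketAddress works: it performs a DNS lookup once, in its constructor, and if that lookup fails the instance stays flagged unresolved forever. A minimal sketch of a re-resolution workaround (the `reResolve` helper is hypothetical, not Hadoop API; it simply constructs a fresh InetSocketAddress to force a new lookup):

```java
import java.net.InetSocketAddress;

public class ReResolveDemo {
    // Hypothetical helper: InetSocketAddress caches its resolution result at
    // construction time, so picking up a later DNS change requires building a
    // fresh instance rather than reusing the cached, unresolved one.
    static InetSocketAddress reResolve(InetSocketAddress addr) {
        if (addr.isUnresolved()) {
            // Constructing a new InetSocketAddress triggers a fresh DNS lookup.
            // If resolution fails again, the result is still flagged unresolved
            // (the constructor does not throw on lookup failure).
            return new InetSocketAddress(addr.getHostName(), addr.getPort());
        }
        return addr;
    }

    public static void main(String[] args) {
        // createUnresolved skips the DNS lookup entirely, mimicking the state
        // the datanode is stuck in after an initial failed resolution.
        InetSocketAddress stale =
            InetSocketAddress.createUnresolved("localhost", 8020);
        System.out.println(stale.isUnresolved());

        // Re-resolving yields a usable address once DNS can answer.
        System.out.println(reResolve(stale).isUnresolved());
    }
}
```

A fix along these lines would have the RPC client re-run this construction on connection setup or retry, instead of keeping the InetSocketAddress it resolved (or failed to resolve) at startup.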
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)