Hi Jian Feng,

Could you please check your code for any possibility of simultaneous
access to the same file? This situation mostly arises when multiple
clients try to write to the same file at the same time.

Code reference:
https://github.com/apache/hadoop/blob/branch-2.2/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/namenode/FSNamesystem.java#L2737
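
As an aside, a common way to avoid two clients racing on one path is to give
every writer its own file name. Below is a minimal plain-Java sketch of that
idea using the local filesystem (the real lease check lives in the
FSNamesystem code linked above); the class and method names here are
hypothetical, not Hadoop APIs:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.UUID;

public class UniqueWriterPaths {

    // Hypothetical helper: derive a per-writer output file under a base
    // directory. In HDFS this would be a path under the job's output
    // directory instead of a local directory.
    public static Path uniqueOutputFile(Path baseDir, String clientId) {
        // Each writer gets its own file, so no two clients ever compete
        // for the single-writer lease on the same path.
        return baseDir.resolve("part-" + clientId + "-" + UUID.randomUUID());
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("lease-demo");
        Path a = uniqueOutputFile(dir, "client1");
        Path b = uniqueOutputFile(dir, "client1");

        // CREATE_NEW fails if the file already exists -- a rough
        // local-filesystem analogue of HDFS rejecting a second writer.
        Files.write(a, "data-a".getBytes(), StandardOpenOption.CREATE_NEW);
        Files.write(b, "data-b".getBytes(), StandardOpenOption.CREATE_NEW);

        // Distinct paths, so both writes succeed without conflict.
        System.out.println(a.equals(b));
    }
}
```

If several tasks genuinely must contribute to one logical output, the usual
pattern is to have each write its own part file and merge afterwards, rather
than sharing one open stream.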

Best Regards,
Rakesh
Intel

On Mon, Oct 17, 2016 at 7:16 AM, Zhang Jianfeng <[email protected]> wrote:

> Hi,
>
>     I hit a weird error. On our Hadoop cluster (2.2.0), a
> LeaseExpiredException is occasionally thrown.
>
> The stacktrace is as below:
>
>
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException):
> No lease on /user/biadmin/analytic-root/SX5XPWPPDPQH/.executions/.
> at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:2737)
> at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFileInternal(FSNamesystem.java:2801)
> at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFile(FSNamesystem.java:2783)
> at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.complete(NameNodeRpcServer.java:611)
> at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.complete(ClientNamenodeProtocolServerSideTranslatorPB.java:428)
> at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59586)
> at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
> at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
> at java.security.AccessController.doPrivileged(AccessController.java:310)
> at javax.security.auth.Subject.doAs(Subject.java:573)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1502)
> at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
> at org.apache.hadoop.ipc.Client.call(Client.java:1347)
> at org.apache.hadoop.ipc.Client.call(Client.java:1300)
> at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> at $Proxy7.complete(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
> at java.lang.reflect.Method.invoke(Method.java:611)
> at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
> at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> at $Proxy7.complete(Unknown Source)
> at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.complete(ClientNamenodeProtocolTranslatorPB.java:371)
> at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:1894)
> at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:1881)
> at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:71)
> at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:104)
> at java.io.FilterOutputStream.close(FilterOutputStream.java:154)
>
> Any help will be appreciated!
>
> --
> Best Regards,
> Jian Feng
>
