I'm running a Flink job on a Hadoop YARN cluster, and the job failed.
In the DataNode log I found the error messages below:


2019-11-05 07:15:30,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-2071346120-10.25.70.111-1572862875273:blk_1073775555_34731 received exception java.io.IOException: Premature EOF from inputStream
2019-11-05 07:15:30,463 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: datanode2:50010:DataXceiver error processing WRITE_BLOCK operation  src: /10.26.133.116:36274 dst: /10.26.133.113:50010
java.io.IOException: Premature EOF from inputStream
    at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:211)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:211)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:528)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:968)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:867)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:166)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:103)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:288)
    at java.lang.Thread.run(Thread.java:748)
2019-11-05 07:15:30,463 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: datanode2:50010:DataXceiver error processing WRITE_BLOCK operation  src: /10.26.133.116:35736 dst: /10.26.133.113:50010
java.io.InterruptedIOException: Interrupted while waiting for IO on channel java.nio.channels.SocketChannel[connected local=/10.26.133.113:36864 remote=/10.26.130.117:50010]. 485000 millis timeout left.
    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:342)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
    at java.io.DataOutputStream.write(DataOutputStream.java:107)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.mirrorPacketTo(PacketReceiver.java:198)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:588)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:968)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:867)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:166)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:103)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:288)
    at java.lang.Thread.run(Thread.java:748)

2019-11-05 07:15:30,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-2071346120-10.25.70.111-1572862875273:blk_1073775652_34828 received exception java.io.IOException: Connection reset by peer
2019-11-05 07:15:30,584 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: datanode2:50010:DataXceiver error processing WRITE_BLOCK operation  src: /10.26.133.112:35622 dst: /10.26.133.113:50010
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:197)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
    at java.io.DataInputStream.read(DataInputStream.java:149)
    at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:209)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:211)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:528)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:968)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:867)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:166)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:103)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:288)
    at java.lang.Thread.run(Thread.java:748)

Can anyone help me?
Thanks
