- 2014-04-21 20:33:57,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.10.45:50010, dest: /192.168.10.45:40030, bytes: 396288, op: HDFS_READ, cliID: DFSClient_hb_rs_app-hbase-1,60020,1392084194869, offset: 64343552, srvID: DS-1676697306-192.168.10.45-50010-1392029190949, blockid: blk_-7583437257645631863_201976, duration: 388542879
- 2014-04-21 20:33:57,937 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Exiting DataBlockScanner thread.
- 2014-04-21 20:33:59,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):Finishing DataNode in: FSDataset{dirpath='/home/hadoop/dfsdir/hadoop-hadoop/dfs/data/current'}
- 2014-04-21 20:33:59,163 INFO org.mortbay.log: Stopped SelectChannelConnector@0.0.0.0:50075
- 2014-04-21 20:33:59,264 INFO org.apache.hadoop.ipc.Server: Stopping server on 50020
- 2014-04-21 20:33:59,264 INFO org.apache.hadoop.ipc.Server: IPC Server handler 98 on 50020: exiting
- 2014-04-21 20:33:59,264 INFO org.apache.hadoop.ipc.Server: IPC Server handler 96 on 50020: exiting
- 2014-04-21 20:33:59,264 INFO org.apache.hadoop.ipc.Server: IPC Server handler 95 on 50020: exiting
- 2014-04-21 20:33:59,265 INFO org.apache.hadoop.ipc.Server: IPC Server handler 97 on 50020: exiting
- 2014-04-21 20:33:59,265 INFO org.apache.hadoop.ipc.Server: IPC Server handler 99 on 50020: exiting
- 2014-04-21 20:33:59,265 INFO org.apache.hadoop.ipc.Server: IPC Server handler 93 on 50020: exiting
- 2014-04-21 20:33:59,265 INFO org.apache.hadoop.ipc.Server: IPC Server handler 3 on 50020: exiting
- 2014-04-21 20:33:59,265 INFO org.apache.hadoop.ipc.Server: IPC Server handler 42 on 50020: exiting
- 2014-04-21 20:33:59,265 INFO org.apache.hadoop.ipc.Server: IPC Server handler 44 on 50020: exiting
- 2014-04-21 20:33:59,266 INFO org.apache.hadoop.ipc.Server: IPC Server handler 94 on 50020: exiting
- 2014-04-21 20:33:59,266 INFO org.apache.hadoop.ipc.Server: IPC Server handler 58 on 50020: exiting
- 2014-04-21 20:33:59,266 INFO org.apache.hadoop.ipc.Server: IPC Server handler 43 on 50020: exiting
- 2014-04-21 20:33:59,266 INFO org.apache.hadoop.ipc.Server: IPC Server handler 41 on 50020: exiting
- 2014-04-21 20:33:59,266 INFO org.apache.hadoop.ipc.Server: IPC Server handler 50 on 50020: exiting
- 2014-04-21 20:33:59,266 INFO org.apache.hadoop.ipc.Server: IPC Server handler 49 on 50020: exiting
- 2014-04-21 20:33:59,266 INFO org.apache.hadoop.ipc.Server: IPC Server handler 51 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 52 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 59 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 61 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 60 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 64 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 46 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 53 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 66 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 92 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 63 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 56 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 62 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 47 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 45 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 73 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 69 on 50020: exiting
- 2014-04-21 20:33:59,267 INFO org.apache.hadoop.ipc.Server: IPC Server handler 57 on 50020: exiting
- 2014-04-21 20:33:59,268 INFO org.apache.hadoop.ipc.Server: IPC Server handler 83 on 50020: exiting
- 2014-04-21 20:33:59,269 INFO org.apache.hadoop.ipc.Server: IPC Server handler 71 on 50020: exiting
- 2014-04-21 20:33:59,269 INFO org.apache.hadoop.ipc.Server: IPC Server handler 65 on 50020: exiting
- 2014-04-21 20:33:59,269 INFO org.apache.hadoop.ipc.Server: IPC Server handler 75 on 50020: exiting
- 2014-04-21 20:33:59,269 INFO org.apache.hadoop.ipc.Server: IPC Server handler 48 on 50020: exiting
- 2014-04-21 20:33:59,269 INFO org.apache.hadoop.ipc.Server: IPC Server handler 67 on 50020: exiting
- 2014-04-21 20:33:59,269 INFO org.apache.hadoop.ipc.Server: IPC Server handler 76 on 50020: exiting
- 2014-04-21 20:33:59,269 INFO org.apache.hadoop.ipc.Server: IPC Server handler 68 on 50020: exiting
- 2014-04-21 20:33:59,270 INFO org.apache.hadoop.ipc.Server: IPC Server handler 70 on 50020: exiting
- 2014-04-21 20:33:59,270 INFO org.apache.hadoop.ipc.Server: IPC Server handler 74 on 50020: exiting
- 2014-04-21 20:33:59,270 INFO org.apache.hadoop.ipc.Server: IPC Server handler 72 on 50020: exiting
- 2014-04-21 20:33:59,270 INFO org.apache.hadoop.ipc.Server: IPC Server handler 54 on 50020: exiting
- 2014-04-21 20:33:59,270 INFO org.apache.hadoop.ipc.Server: IPC Server handler 84 on 50020: exiting
- 2014-04-21 20:33:59,270 INFO org.apache.hadoop.ipc.Server: IPC Server handler 78 on 50020: exiting
- 2014-04-21 20:33:59,270 INFO org.apache.hadoop.ipc.Server: IPC Server handler 77 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 82 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 91 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 55 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 79 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 80 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 81 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 88 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 89 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 90 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 87 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 85 on 50020: exiting
- 2014-04-21 20:33:59,271 INFO org.apache.hadoop.ipc.Server: IPC Server handler 86 on 50020: exiting
- 2014-04-21 20:33:59,274 INFO org.apache.hadoop.ipc.Server: IPC Server handler 28 on 50020: exiting
- 2014-04-21 20:33:59,274 INFO org.apache.hadoop.ipc.Server: IPC Server handler 27 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 35 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 29 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 24 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 38 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 34 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 36 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 30 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 20 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 25 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 18 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 39 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 33 on 50020: exiting
- 2014-04-21 20:33:59,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 37 on 50020: exiting
- 2014-04-21 20:33:59,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 26 on 50020: exiting
- 2014-04-21 20:33:59,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 40 on 50020: exiting
- 2014-04-21 20:33:59,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 21 on 50020: exiting
- 2014-04-21 20:33:59,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 32 on 50020: exiting
- 2014-04-21 20:33:59,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 50020: exiting
- 2014-04-21 20:33:59,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 22 on 50020: exiting
- 2014-04-21 20:33:59,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 23 on 50020: exiting
- 2014-04-21 20:33:59,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 31 on 50020: exiting
- 2014-04-21 20:33:59,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 19 on 50020: exiting
- 2014-04-21 20:33:59,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 50020: exiting
- 2014-04-21 20:33:59,281 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 50020: exiting
- 2014-04-21 20:33:59,282 INFO org.apache.hadoop.ipc.Server: IPC Server handler 2 on 50020: exiting
- 2014-04-21 20:33:59,288 INFO org.apache.hadoop.ipc.Server: IPC Server handler 5 on 50020: exiting
- 2014-04-21 20:33:59,290 INFO org.apache.hadoop.ipc.metrics.RpcInstrumentation: shut down
- 2014-04-21 20:33:59,290 INFO org.apache.hadoop.ipc.Server: Stopping IPC Server Responder
- 2014-04-21 20:33:59,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Waiting for threadgroup to exit, active threads is 12
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.10.45:50010, dest: /192.168.10.45:55921, bytes: 396288, op: HDFS_READ, cliID: DFSClient_hb_rs_app-hbase-1,60020,1392084194869, offset: 3016704, srvID: DS-1676697306-192.168.10.45-50010-1392029190949, blockid: blk_3871185520387912910_202079, duration: 262378069440
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.10.45:50010, dest: /192.168.10.45:43872, bytes: 792576, op: HDFS_READ, cliID: DFSClient_hb_rs_app-hbase-1,60020,1392084194869, offset: 10899968, srvID: DS-1676697306-192.168.10.45-50010-1392029190949, blockid: blk_-6046900746056663146_201678, duration: 422235830892
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_-8192027677874709918_202165 java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_5571989386432628923_202164 java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_-8192027677874709918_202165 received exception java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5571989386432628923_202164 received exception java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- 2014-04-21 20:33:59,291 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):Got exception while serving blk_-6046900746056663146_201678 to /192.168.10.45:
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 58279 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,291 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 58279 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_8522500554444365991_202108 java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- 2014-04-21 20:33:59,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_8522500554444365991_202108 received exception java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- 2014-04-21 20:33:59,305 INFO org.apache.hadoop.ipc.Server: IPC Server handler 6 on 50020: exiting
- 2014-04-21 20:33:59,305 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
- at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
- at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
- at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
- at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
- at java.io.DataInputStream.read(DataInputStream.java:149)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:312)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:376)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:532)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:398)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:107)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,305 INFO org.apache.hadoop.ipc.Server: IPC Server handler 8 on 50020: exiting
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.10.45:50010, dest: /192.168.10.45:42364, bytes: 1320960, op: HDFS_READ, cliID: DFSClient_hb_rs_app-hbase-1,60020,1392084194869, offset: 24394240, srvID: DS-1676697306-192.168.10.45-50010-1392029190949, blockid: blk_-3389679390839528994_202012, duration: 428336849389
- 2014-04-21 20:33:59,305 INFO org.apache.hadoop.ipc.Server: IPC Server handler 7 on 50020: exiting
- 2014-04-21 20:33:59,305 INFO org.apache.hadoop.ipc.Server: IPC Server handler 9 on 50020: exiting
- 2014-04-21 20:33:59,305 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):Got exception while serving blk_-3389679390839528994_202012 to /192.168.10.45:
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 51789 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,305 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 51789 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,305 INFO org.apache.hadoop.ipc.Server: IPC Server handler 10 on 50020: exiting
- 2014-04-21 20:33:59,306 INFO org.apache.hadoop.ipc.Server: IPC Server handler 11 on 50020: exiting
- 2014-04-21 20:33:59,305 INFO org.apache.hadoop.ipc.Server: Stopping IPC Server listener on 50020
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.10.45:50010, dest: /192.168.10.45:36414, bytes: 396288, op: HDFS_READ, cliID: DFSClient_hb_rs_app-hbase-1,60020,1392084194869, offset: 3738624, srvID: DS-1676697306-192.168.10.45-50010-1392029190949, blockid: blk_8924402542700605951_202124, duration: 13169565766
- 2014-04-21 20:33:59,306 INFO org.apache.hadoop.ipc.Server: IPC Server handler 12 on 50020: exiting
- 2014-04-21 20:33:59,306 INFO org.apache.hadoop.ipc.Server: IPC Server handler 14 on 50020: exiting
- 2014-04-21 20:33:59,306 INFO org.apache.hadoop.ipc.Server: IPC Server handler 15 on 50020: exiting
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.10.45:50010, dest: /192.168.10.45:50137, bytes: 396288, op: HDFS_READ, cliID: DFSClient_hb_rs_app-hbase-1,60020,1392084194869, offset: 25057280, srvID: DS-1676697306-192.168.10.45-50010-1392029190949, blockid: blk_5559451839447260838_202033, duration: 398200302864
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.10.45:50010, dest: /192.168.10.45:42385, bytes: 1188864, op: HDFS_READ, cliID: DFSClient_hb_rs_app-hbase-1,60020,1392084194869, offset: 65782784, srvID: DS-1676697306-192.168.10.45-50010-1392029190949, blockid: blk_-5065679070635531116_201679, duration: 428246087054
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.10.45:50010, dest: /192.168.10.45:50130, bytes: 1453056, op: HDFS_READ, cliID: DFSClient_hb_rs_app-hbase-1,60020,1392084194869, offset: 33374208, srvID: DS-1676697306-192.168.10.45-50010-1392029190949, blockid: blk_5966484054166966561_201940, duration: 398209370280
- 2014-04-21 20:33:59,291 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
- at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
- at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
- at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
- at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
- at java.io.DataInputStream.read(DataInputStream.java:149)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:312)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:376)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:532)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:398)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:107)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.10.45:50010, dest: /192.168.10.45:60253, bytes: 396288, op: HDFS_READ, cliID: DFSClient_hb_rs_app-hbase-1,60020,1392084194869, offset: 5377536, srvID: DS-1676697306-192.168.10.45-50010-1392029190949, blockid: blk_3395514677319327012_202081, duration: 251187021053
- 2014-04-21 20:33:59,307 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):Got exception while serving blk_5559451839447260838_202033 to /192.168.10.45:
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 81811 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,307 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 81811 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,291 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):Got exception while serving blk_3871185520387912910_202079 to /192.168.10.45:
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 217622 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,291 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
- at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
- at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
- at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
- at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
- at java.io.DataInputStream.read(DataInputStream.java:149)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:312)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:376)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:532)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:398)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:107)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_-7969006819959471805_202154 java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- 2014-04-21 20:33:59,308 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 217622 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,308 INFO org.apache.hadoop.ipc.Server: IPC Server handler 16 on 50020: exiting
- 2014-04-21 20:33:59,308 INFO org.apache.hadoop.ipc.Server: IPC Server handler 17 on 50020: exiting
- 2014-04-21 20:33:59,308 INFO org.apache.hadoop.ipc.Server: IPC Server handler 13 on 50020: exiting
- 2014-04-21 20:33:59,307 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):Got exception while serving blk_5966484054166966561_201940 to /192.168.10.45:
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 81862 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,307 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):Got exception while serving blk_3395514677319327012_202081 to /192.168.10.45:
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 228813 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,307 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):Got exception while serving blk_-5065679070635531116_201679 to /192.168.10.45:
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 51826 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,309 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 51826 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,306 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):Got exception while serving blk_8924402542700605951_202124 to /192.168.10.45:
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 466924 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,309 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 228813 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,309 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 81862 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_-7969006819959471805_202154 received exception java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- 2014-04-21 20:33:59,310 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
- at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
- at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
- at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
- at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
- at java.io.DataInputStream.read(DataInputStream.java:149)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:312)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:376)
- at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:532)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:398)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:107)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:33:59,310 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.10.45:50010, storageID=DS-1676697306-192.168.10.45-50010-1392029190949, infoPort=50075, ipcPort=50020):DataXceiver
- java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 466924 millis timeout left.
- at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
- at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:245)
- at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:159)
- at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:198)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:350)
- at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:436)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:197)
- at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:99)
- at java.lang.Thread.run(Thread.java:722)
- 2014-04-21 20:34:00,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
- 2014-04-21 20:34:00,404 INFO org.apache.hadoop.hdfs.server.datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
- 2014-04-21 20:34:00,405 INFO org.apache.hadoop.hdfs.server.datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
- 2014-04-21 20:34:00,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
- 2014-04-21 20:34:00,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
- /************************************************************
- SHUTDOWN_MSG: Shutting down DataNode at app-hbase-1/192.168.10.45
- ************************************************************/