Untitled
a guest, Feb 26th, 2014
2014-02-21 13:38:25,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: NameNode calls recoverBlock(block=blk_-6695300470410774365_837638, targets=[10.0.0.151:50010, 10.0.0.108:50010, 10.0.0.96:50010])
2014-02-21 13:38:26,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_3396426774893476207_837791 src: /10.0.0.91:48688 dest: /10.0.0.151:50010
2014-02-21 13:38:26,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: oldblock=blk_-6695300470410774365_837638(length=50386058), newblock=blk_-6695300470410774365_837793(length=50386058), datanode=10.0.0.151:50010
...
2014-02-21 13:39:27,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: oldblock=blk_-6695300470410774365_837638(length=50386058), newblock=blk_-6695300470410774365_837801(length=50386058), datanode=10.0.0.151:50010
2014-02-21 13:40:27,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: oldblock=blk_-6695300470410774365_837638(length=50386058), newblock=blk_-6695300470410774365_837802(length=50386058), datanode=10.0.0.151:50010
2014-02-21 13:40:39,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /10.0.0.91:48823, dest: /10.0.0.151:50010, bytes: 67108864, op: HDFS_WRITE, cliID: DFSClient_205598139, srvID: DS-1932679773-10.0.0.151-50010-1342621185881, blockid: blk_-4032627633308718992_837800
2014-02-21 13:40:39,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 1 for block blk_-4032627633308718992_837800 terminating
2014-02-21 13:41:00,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_-6695300470410774365_837638 java.nio.channels.ClosedByInterruptException
2014-02-21 13:41:00,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_-6695300470410774365_837638 received exception java.io.IOException: Interrupted receiveBlock
2014-02-21 13:41:00,273 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.0.0.151:50010, storageID=DS-1932679773-10.0.0.151-50010-1342621185881, infoPort=50075, ipcPort=50020):DataXceiver
java.io.IOException: Interrupted receiveBlock
        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:578)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:358)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
        at java.lang.Thread.run(Thread.java:662)
2014-02-21 13:41:00,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_-6695300470410774365_837638 2 : Thread is interrupted.
2014-02-21 13:41:00,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 2 for block blk_-6695300470410774365_837638 terminating
2014-02-21 13:41:00,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Received block blk_-6695300470410774365_837802 of size 50386058 as part of lease recovery.
2014-02-21 13:41:00,275 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 50020, call updateBlock(blk_-6695300470410774365_837638, blk_-6695300470410774365_837801, true) from 10.0.0.108:40317: error: java.io.IOException: Cannot update block (id=-6695300470410774365) generation stamp from 837802 to 837801
java.io.IOException: Cannot update block (id=-6695300470410774365) generation stamp from 837802 to 837801
        at org.apache.hadoop.hdfs.server.datanode.FSDataset.tryUpdateBlock(FSDataset.java:1098)
        at org.apache.hadoop.hdfs.server.datanode.FSDataset.updateBlock(FSDataset.java:1034)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.updateBlock(DataNode.java:1453)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:961)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:957)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:955)
2014-02-21 13:41:00,275 WARN org.apache.hadoop.hdfs.server.protocol.InterDatanodeProtocol: Failed to updateBlock (newblock=blk_-6695300470410774365_837793, datanode=10.0.0.151:50010)
java.io.IOException: Meta file not found, blockFile=/mnt/hadoop/hadoopdata/hdfs/data/current/subdir3/subdir29/blk_-6695300470410774365
        at org.apache.hadoop.hdfs.server.datanode.FSDataset.findMetaFile(FSDataset.java:771)
        at org.apache.hadoop.hdfs.server.datanode.FSDataset.tryUpdateBlock(FSDataset.java:1086)
        at org.apache.hadoop.hdfs.server.datanode.FSDataset.updateBlock(FSDataset.java:1034)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.updateBlock(DataNode.java:1453)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.syncBlock(DataNode.java:1582)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.recoverBlock(DataNode.java:1551)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.access$100(DataNode.java:127)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$1.run(DataNode.java:1437)
        at java.lang.Thread.run(Thread.java:662)