a guest
Jul 20th, 2015
2015-07-20 16:29:26,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2015-07-20 16:29:26,945 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/w13/hdfs should be specified as a URI in configuration files. Please update hdfs configuration.
2015-07-20 16:29:27,230 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2015-07-20 16:29:27,281 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2015-07-20 16:29:27,281 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2015-07-20 16:29:27,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is PLAN-A.UNI-MUENSTER.DE
2015-07-20 16:29:27,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2015-07-20 16:29:27,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:50010
2015-07-20 16:29:27,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is 1048576 bytes/s
2015-07-20 16:29:27,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 5
2015-07-20 16:29:27,366 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2015-07-20 16:29:27,369 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2015-07-20 16:29:27,376 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2015-07-20 16:29:27,377 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
2015-07-20 16:29:27,378 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2015-07-20 16:29:27,378 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2015-07-20 16:29:27,388 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.datanode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
2015-07-20 16:29:27,390 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50075
2015-07-20 16:29:27,390 INFO org.mortbay.log: jetty-6.1.26
2015-07-20 16:29:27,552 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50075
2015-07-20 16:29:27,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = w13
2015-07-20 16:29:27,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2015-07-20 16:29:27,682 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
2015-07-20 16:29:27,693 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 50020
2015-07-20 16:29:27,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:50020
2015-07-20 16:29:27,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2015-07-20 16:29:27,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: <default>
2015-07-20 16:29:27,744 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/w13/hdfs should be specified as a URI in configuration files. Please update hdfs configuration.
2015-07-20 16:29:27,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to a/192.168.0.1:9000 starting to offer service
2015-07-20 16:29:27,762 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2015-07-20 16:29:27,762 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 50020: starting
2015-07-20 16:29:27,902 INFO org.apache.hadoop.hdfs.server.common.Storage: DataNode version: -56 and NameNode layout version: -60
2015-07-20 16:29:27,935 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/w13/hdfs/in_use.lock acquired by nodename 16173@PLAN-A.UNI-MUENSTER.DE
2015-07-20 16:29:28,007 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1637439489-128.176.144.92-1437398004886
2015-07-20 16:29:28,007 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled
2015-07-20 16:29:28,008 INFO org.apache.hadoop.hdfs.server.common.Storage: Restored 0 block files from trash.
2015-07-20 16:29:28,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=782468146;bpid=BP-1637439489-128.176.144.92-1437398004886;lv=-56;nsInfo=lv=-60;cid=CID-5719f036-569f-40d5-82bd-7e4cc3af2de4;nsid=782468146;c=0;bpid=BP-1637439489-128.176.144.92-1437398004886;dnuuid=8e82d7d6-b6a1-4edd-ab32-03253ac50967
2015-07-20 16:29:28,035 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/w13/hdfs should be specified as a URI in configuration files. Please update hdfs configuration.
2015-07-20 16:29:28,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: /home/w13/hdfs/current
2015-07-20 16:29:28,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - /home/w13/hdfs/current, StorageType: DISK
2015-07-20 16:29:28,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Registered FSDatasetState MBean
2015-07-20 16:29:28,091 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 1437407500091 with interval 21600000
2015-07-20 16:29:28,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding block pool BP-1637439489-128.176.144.92-1437398004886
2015-07-20 16:29:28,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1637439489-128.176.144.92-1437398004886 on volume /home/w13/hdfs/current...
2015-07-20 16:29:28,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Cached dfsUsed found for /home/w13/hdfs/current/BP-1637439489-128.176.144.92-1437398004886/current: 7979216896
2015-07-20 16:29:28,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1637439489-128.176.144.92-1437398004886 on /home/w13/hdfs/current: 6ms
2015-07-20 16:29:28,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-1637439489-128.176.144.92-1437398004886: 6ms
2015-07-20 16:29:28,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1637439489-128.176.144.92-1437398004886 on volume /home/w13/hdfs/current...
2015-07-20 16:29:28,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1637439489-128.176.144.92-1437398004886 on volume /home/w13/hdfs/current: 9ms
2015-07-20 16:29:28,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to add all replicas to map: 10ms
2015-07-20 16:29:28,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-1637439489-128.176.144.92-1437398004886 (Datanode Uuid null) service to a/192.168.0.1:9000 beginning handshake with NN
2015-07-20 16:29:28,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool Block pool BP-1637439489-128.176.144.92-1437398004886 (Datanode Uuid null) service to a/192.168.0.1:9000 successfully registered with NN
2015-07-20 16:29:28,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: For namenode a/192.168.0.1:9000 using DELETEREPORT_INTERVAL of 300000 msec BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
2015-07-20 16:29:28,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Namenode Block pool BP-1637439489-128.176.144.92-1437398004886 (Datanode Uuid 8e82d7d6-b6a1-4edd-ab32-03253ac50967) service to a/192.168.0.1:9000 trying to claim ACTIVE state with txid=694
2015-07-20 16:29:28,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode Block pool BP-1637439489-128.176.144.92-1437398004886 (Datanode Uuid 8e82d7d6-b6a1-4edd-ab32-03253ac50967) service to a/192.168.0.1:9000
2015-07-20 16:29:28,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Sent 1 blockreports 71 blocks total. Took 2 msec to generate and 20 msecs for RPC and NN processing. Got back commands org.apache.hadoop.hdfs.server.protocol.FinalizeCommand@39253c7f
2015-07-20 16:29:28,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1637439489-128.176.144.92-1437398004886
2015-07-20 16:29:28,174 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlockMap
2015-07-20 16:29:28,174 INFO org.apache.hadoop.util.GSet: VM type = 64-bit
2015-07-20 16:29:28,175 INFO org.apache.hadoop.util.GSet: 0.5% max memory 889 MB = 4.4 MB
2015-07-20 16:29:28,175 INFO org.apache.hadoop.util.GSet: capacity = 2^19 = 524288 entries
2015-07-20 16:29:28,176 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Periodic Block Verification Scanner initialized with interval 504 hours for block pool BP-1637439489-128.176.144.92-1437398004886
2015-07-20 16:29:28,180 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Added bpid=BP-1637439489-128.176.144.92-1437398004886 to blockPoolScannerMap, new size=1
2015-07-20 16:30:10,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741953_1129 src: /192.168.0.1:56385 dest: /192.168.0.1:50010
2015-07-20 16:30:10,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56385, dest: /192.168.0.1:50010, bytes: 236, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741953_1129, duration: 34157479
2015-07-20 16:30:10,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741953_1129, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741954_1130 src: /192.168.0.1:56387 dest: /192.168.0.1:50010
2015-07-20 16:30:11,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56387, dest: /192.168.0.1:50010, bytes: 1027, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741954_1130, duration: 38575119
2015-07-20 16:30:11,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741954_1130, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741955_1131 src: /192.168.0.1:56389 dest: /192.168.0.1:50010
2015-07-20 16:30:11,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56389, dest: /192.168.0.1:50010, bytes: 436302, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741955_1131, duration: 26190548
2015-07-20 16:30:11,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741955_1131, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741956_1132 src: /192.168.0.1:56391 dest: /192.168.0.1:50010
2015-07-20 16:30:11,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56391, dest: /192.168.0.1:50010, bytes: 180738, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741956_1132, duration: 9305411
2015-07-20 16:30:11,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741956_1132, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741957_1133 src: /192.168.0.1:56393 dest: /192.168.0.1:50010
2015-07-20 16:30:11,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56393, dest: /192.168.0.1:50010, bytes: 436302, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741957_1133, duration: 10486899
2015-07-20 16:30:11,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741957_1133, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741958_1134 src: /192.168.0.1:56395 dest: /192.168.0.1:50010
2015-07-20 16:30:11,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56395, dest: /192.168.0.1:50010, bytes: 180738, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741958_1134, duration: 12085651
2015-07-20 16:30:11,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741958_1134, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741959_1135 src: /192.168.0.1:56398 dest: /192.168.0.1:50010
2015-07-20 16:30:11,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56398, dest: /192.168.0.1:50010, bytes: 104587, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741959_1135, duration: 6538199
2015-07-20 16:30:11,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741959_1135, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741960_1136 src: /192.168.0.1:56400 dest: /192.168.0.1:50010
2015-07-20 16:30:11,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56400, dest: /192.168.0.1:50010, bytes: 1162, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741960_1136, duration: 9468085
2015-07-20 16:30:11,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741960_1136, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741961_1137 src: /192.168.0.1:56402 dest: /192.168.0.1:50010
2015-07-20 16:30:11,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56402, dest: /192.168.0.1:50010, bytes: 544, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741961_1137, duration: 4387976
2015-07-20 16:30:11,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741961_1137, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:12,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741962_1138 src: /192.168.0.1:56405 dest: /192.168.0.1:50010
2015-07-20 16:30:12,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56405, dest: /192.168.0.1:50010, bytes: 90876, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741962_1138, duration: 8642087
2015-07-20 16:30:12,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741962_1138, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:13,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.0.1, datanodeUuid=8e82d7d6-b6a1-4edd-ab32-03253ac50967, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-5719f036-569f-40d5-82bd-7e4cc3af2de4;nsid=782468146;c=0) Starting thread to transfer BP-1637439489-128.176.144.92-1437398004886:blk_1073741955_1131 to 192.168.0.8:50010 192.168.0.4:50010
2015-07-20 16:30:13,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.0.1, datanodeUuid=8e82d7d6-b6a1-4edd-ab32-03253ac50967, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-5719f036-569f-40d5-82bd-7e4cc3af2de4;nsid=782468146;c=0) Starting thread to transfer BP-1637439489-128.176.144.92-1437398004886:blk_1073741959_1135 to 192.168.0.8:50010 192.168.0.4:50010
2015-07-20 16:30:13,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DataTransfer: Transmitted BP-1637439489-128.176.144.92-1437398004886:blk_1073741955_1131 (numBytes=436302) to /192.168.0.8:50010
2015-07-20 16:30:13,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DataTransfer: Transmitted BP-1637439489-128.176.144.92-1437398004886:blk_1073741959_1135 (numBytes=104587) to /192.168.0.8:50010
2015-07-20 16:30:23,164 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
    at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
    at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
    at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
    at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:30:23,224 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
    at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
    at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
    at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
    at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:30:23,689 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
    at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
    at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
    at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
    at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:30:23,778 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
    at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
    at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
    at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
    at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:31:40,530 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741877_1053
2015-07-20 16:33:03,982 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
    at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
    at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
    at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
    at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:33:04,017 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
    at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
    at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
    at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
    at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:33:49,730 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741859_1035
2015-07-20 16:35:19,347 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
    at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
    at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
    at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
    at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
    at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:35:58,730 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741828_1004
2015-07-20 16:38:07,930 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741842_1018
2015-07-20 16:40:17,130 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741876_1052
2015-07-20 16:42:26,130 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741882_1058
2015-07-20 16:43:18,730 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741839_1015
2015-07-20 16:43:18,731 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741950_1126
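Note on the repeated WARN from org.apache.hadoop.hdfs.server.common.Util: the storage directory is configured as a bare path (/home/w13/hdfs) rather than a URI, which the DataNode accepts but warns about. A minimal hdfs-site.xml sketch of the fix, assuming the path comes from the standard dfs.datanode.data.dir property (the actual property in use on this cluster is not shown in the log):

```xml
<configuration>
  <!-- Use a file:// URI instead of a bare filesystem path to silence
       the "should be specified as a URI" warning. -->
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///home/w13/hdfs</value>
  </property>
</configuration>
```

After editing, the effective value can be checked with `hdfs getconf -confKey dfs.datanode.data.dir`; the DataNode must be restarted for the change to take effect.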