2015-07-20 16:29:26,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2015-07-20 16:29:26,945 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/w13/hdfs should be specified as a URI in configuration files. Please update hdfs configuration.
2015-07-20 16:29:27,230 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2015-07-20 16:29:27,281 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2015-07-20 16:29:27,281 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2015-07-20 16:29:27,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is PLAN-A.UNI-MUENSTER.DE
2015-07-20 16:29:27,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2015-07-20 16:29:27,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:50010
2015-07-20 16:29:27,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 1048576 bytes/s
2015-07-20 16:29:27,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 5
2015-07-20 16:29:27,366 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2015-07-20 16:29:27,369 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2015-07-20 16:29:27,376 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2015-07-20 16:29:27,377 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
2015-07-20 16:29:27,378 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2015-07-20 16:29:27,378 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2015-07-20 16:29:27,388 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.datanode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
2015-07-20 16:29:27,390 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50075
2015-07-20 16:29:27,390 INFO org.mortbay.log: jetty-6.1.26
2015-07-20 16:29:27,552 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50075
2015-07-20 16:29:27,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = w13
2015-07-20 16:29:27,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2015-07-20 16:29:27,682 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
2015-07-20 16:29:27,693 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 50020
2015-07-20 16:29:27,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:50020
2015-07-20 16:29:27,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2015-07-20 16:29:27,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: <default>
2015-07-20 16:29:27,744 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/w13/hdfs should be specified as a URI in configuration files. Please update hdfs configuration.
2015-07-20 16:29:27,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to a/192.168.0.1:9000 starting to offer service
2015-07-20 16:29:27,762 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2015-07-20 16:29:27,762 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 50020: starting
2015-07-20 16:29:27,902 INFO org.apache.hadoop.hdfs.server.common.Storage: DataNode version: -56 and NameNode layout version: -60
2015-07-20 16:29:27,935 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/w13/hdfs/in_use.lock acquired by nodename 16173@PLAN-A.UNI-MUENSTER.DE
2015-07-20 16:29:28,007 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1637439489-128.176.144.92-1437398004886
2015-07-20 16:29:28,007 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled
2015-07-20 16:29:28,008 INFO org.apache.hadoop.hdfs.server.common.Storage: Restored 0 block files from trash.
2015-07-20 16:29:28,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=782468146;bpid=BP-1637439489-128.176.144.92-1437398004886;lv=-56;nsInfo=lv=-60;cid=CID-5719f036-569f-40d5-82bd-7e4cc3af2de4;nsid=782468146;c=0;bpid=BP-1637439489-128.176.144.92-1437398004886;dnuuid=8e82d7d6-b6a1-4edd-ab32-03253ac50967
2015-07-20 16:29:28,035 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/w13/hdfs should be specified as a URI in configuration files. Please update hdfs configuration.
2015-07-20 16:29:28,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: /home/w13/hdfs/current
2015-07-20 16:29:28,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - /home/w13/hdfs/current, StorageType: DISK
2015-07-20 16:29:28,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Registered FSDatasetState MBean
2015-07-20 16:29:28,091 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 1437407500091 with interval 21600000
2015-07-20 16:29:28,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding block pool BP-1637439489-128.176.144.92-1437398004886
2015-07-20 16:29:28,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1637439489-128.176.144.92-1437398004886 on volume /home/w13/hdfs/current...
2015-07-20 16:29:28,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Cached dfsUsed found for /home/w13/hdfs/current/BP-1637439489-128.176.144.92-1437398004886/current: 7979216896
2015-07-20 16:29:28,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1637439489-128.176.144.92-1437398004886 on /home/w13/hdfs/current: 6ms
2015-07-20 16:29:28,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-1637439489-128.176.144.92-1437398004886: 6ms
2015-07-20 16:29:28,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1637439489-128.176.144.92-1437398004886 on volume /home/w13/hdfs/current...
2015-07-20 16:29:28,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1637439489-128.176.144.92-1437398004886 on volume /home/w13/hdfs/current: 9ms
2015-07-20 16:29:28,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to add all replicas to map: 10ms
2015-07-20 16:29:28,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-1637439489-128.176.144.92-1437398004886 (Datanode Uuid null) service to a/192.168.0.1:9000 beginning handshake with NN
2015-07-20 16:29:28,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-1637439489-128.176.144.92-1437398004886 (Datanode Uuid null) service to a/192.168.0.1:9000 successfully registered with NN
2015-07-20 16:29:28,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: For namenode a/192.168.0.1:9000 using DELETEREPORT_INTERVAL of 300000 msec BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
2015-07-20 16:29:28,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Namenode Block pool BP-1637439489-128.176.144.92-1437398004886 (Datanode Uuid 8e82d7d6-b6a1-4edd-ab32-03253ac50967) service to a/192.168.0.1:9000 trying to claim ACTIVE state with txid=694
2015-07-20 16:29:28,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode Block pool BP-1637439489-128.176.144.92-1437398004886 (Datanode Uuid 8e82d7d6-b6a1-4edd-ab32-03253ac50967) service to a/192.168.0.1:9000
2015-07-20 16:29:28,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Sent 1 blockreports 71 blocks total. Took 2 msec to generate and 20 msecs for RPC and NN processing. Got back commands org.apache.hadoop.hdfs.server.protocol.FinalizeCommand@39253c7f
2015-07-20 16:29:28,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1637439489-128.176.144.92-1437398004886
2015-07-20 16:29:28,174 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlockMap
2015-07-20 16:29:28,174 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
2015-07-20 16:29:28,175 INFO org.apache.hadoop.util.GSet: 0.5% max memory 889 MB = 4.4 MB
2015-07-20 16:29:28,175 INFO org.apache.hadoop.util.GSet: capacity      = 2^19 = 524288 entries
2015-07-20 16:29:28,176 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Periodic Block Verification Scanner initialized with interval 504 hours for block pool BP-1637439489-128.176.144.92-1437398004886
2015-07-20 16:29:28,180 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Added bpid=BP-1637439489-128.176.144.92-1437398004886 to blockPoolScannerMap, new size=1
2015-07-20 16:30:10,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741953_1129 src: /192.168.0.1:56385 dest: /192.168.0.1:50010
2015-07-20 16:30:10,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56385, dest: /192.168.0.1:50010, bytes: 236, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741953_1129, duration: 34157479
2015-07-20 16:30:10,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741953_1129, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741954_1130 src: /192.168.0.1:56387 dest: /192.168.0.1:50010
2015-07-20 16:30:11,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56387, dest: /192.168.0.1:50010, bytes: 1027, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741954_1130, duration: 38575119
2015-07-20 16:30:11,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741954_1130, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741955_1131 src: /192.168.0.1:56389 dest: /192.168.0.1:50010
2015-07-20 16:30:11,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56389, dest: /192.168.0.1:50010, bytes: 436302, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741955_1131, duration: 26190548
2015-07-20 16:30:11,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741955_1131, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741956_1132 src: /192.168.0.1:56391 dest: /192.168.0.1:50010
2015-07-20 16:30:11,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56391, dest: /192.168.0.1:50010, bytes: 180738, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741956_1132, duration: 9305411
2015-07-20 16:30:11,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741956_1132, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741957_1133 src: /192.168.0.1:56393 dest: /192.168.0.1:50010
2015-07-20 16:30:11,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56393, dest: /192.168.0.1:50010, bytes: 436302, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741957_1133, duration: 10486899
2015-07-20 16:30:11,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741957_1133, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741958_1134 src: /192.168.0.1:56395 dest: /192.168.0.1:50010
2015-07-20 16:30:11,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56395, dest: /192.168.0.1:50010, bytes: 180738, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741958_1134, duration: 12085651
2015-07-20 16:30:11,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741958_1134, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741959_1135 src: /192.168.0.1:56398 dest: /192.168.0.1:50010
2015-07-20 16:30:11,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56398, dest: /192.168.0.1:50010, bytes: 104587, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741959_1135, duration: 6538199
2015-07-20 16:30:11,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741959_1135, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741960_1136 src: /192.168.0.1:56400 dest: /192.168.0.1:50010
2015-07-20 16:30:11,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56400, dest: /192.168.0.1:50010, bytes: 1162, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741960_1136, duration: 9468085
2015-07-20 16:30:11,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741960_1136, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:11,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741961_1137 src: /192.168.0.1:56402 dest: /192.168.0.1:50010
2015-07-20 16:30:11,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56402, dest: /192.168.0.1:50010, bytes: 544, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741961_1137, duration: 4387976
2015-07-20 16:30:11,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741961_1137, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:12,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1637439489-128.176.144.92-1437398004886:blk_1073741962_1138 src: /192.168.0.1:56405 dest: /192.168.0.1:50010
2015-07-20 16:30:12,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.0.1:56405, dest: /192.168.0.1:50010, bytes: 90876, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939189663_1, offset: 0, srvID: 8e82d7d6-b6a1-4edd-ab32-03253ac50967, blockid: BP-1637439489-128.176.144.92-1437398004886:blk_1073741962_1138, duration: 8642087
2015-07-20 16:30:12,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1637439489-128.176.144.92-1437398004886:blk_1073741962_1138, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2015-07-20 16:30:13,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.0.1, datanodeUuid=8e82d7d6-b6a1-4edd-ab32-03253ac50967, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-5719f036-569f-40d5-82bd-7e4cc3af2de4;nsid=782468146;c=0) Starting thread to transfer BP-1637439489-128.176.144.92-1437398004886:blk_1073741955_1131 to 192.168.0.8:50010 192.168.0.4:50010
2015-07-20 16:30:13,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.0.1, datanodeUuid=8e82d7d6-b6a1-4edd-ab32-03253ac50967, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-5719f036-569f-40d5-82bd-7e4cc3af2de4;nsid=782468146;c=0) Starting thread to transfer BP-1637439489-128.176.144.92-1437398004886:blk_1073741959_1135 to 192.168.0.8:50010 192.168.0.4:50010
2015-07-20 16:30:13,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DataTransfer: Transmitted BP-1637439489-128.176.144.92-1437398004886:blk_1073741955_1131 (numBytes=436302) to /192.168.0.8:50010
2015-07-20 16:30:13,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DataTransfer: Transmitted BP-1637439489-128.176.144.92-1437398004886:blk_1073741959_1135 (numBytes=104587) to /192.168.0.8:50010
2015-07-20 16:30:23,164 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
	at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
	at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
	at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
	at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:30:23,224 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
	at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
	at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
	at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
	at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:30:23,689 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
	at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
	at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
	at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
	at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:30:23,778 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
	at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
	at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
	at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
	at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:31:40,530 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741877_1053
2015-07-20 16:33:03,982 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
	at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
	at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
	at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
	at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:33:04,017 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
	at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
	at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
	at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
	at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:33:49,730 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741859_1035
2015-07-20 16:35:19,347 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileChannelImpl.transferTo0(Native Method)
	at sun.nio.ch.FileChannelImpl.transferToDirectly(FileChannelImpl.java:433)
	at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:565)
	at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:559)
	at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:728)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:496)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:116)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
	at java.lang.Thread.run(Thread.java:745)
2015-07-20 16:35:58,730 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741828_1004
2015-07-20 16:38:07,930 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741842_1018
2015-07-20 16:40:17,130 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741876_1052
2015-07-20 16:42:26,130 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741882_1058
2015-07-20 16:43:18,730 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741839_1015
2015-07-20 16:43:18,731 INFO org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceScanner: Verification succeeded for BP-1637439489-128.176.144.92-1437398004886:blk_1073741950_1126