  1. 2014-04-21 20:32:42,125 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLog: cleanupCurrentWriter waiting for transactions to get synced total 29244193 synced till here 29244119
  2. 2014-04-21 20:32:42,142 INFO org.apache.hadoop.hbase.regionserver.wal.HLog: Roll /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083514243, entries=234905, filesize=63811423. for /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083562116
  3. 2014-04-21 20:32:53,131 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.24 MB of total=2.31 GB
  4. 2014-04-21 20:32:53,172 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.28 MB, total=2.04 GB, single=409.15 MB, multi=1.89 GB, memory=0.75 KB
  5. 2014-04-21 20:33:03,989 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.27 MB of total=2.31 GB
  6. 2014-04-21 20:33:04,062 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.29 MB, total=2.04 GB, single=406.55 MB, multi=1.89 GB, memory=0.75 KB
  7. 2014-04-21 20:33:15,839 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.25 MB of total=2.31 GB
  8. 2014-04-21 20:33:15,879 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.26 MB, total=2.04 GB, single=415.59 MB, multi=1.88 GB, memory=0.75 KB
  9. 2014-04-21 20:33:28,443 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.24 MB of total=2.31 GB
  10. 2014-04-21 20:33:28,481 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.29 MB, total=2.04 GB, single=408.95 MB, multi=1.89 GB, memory=0.75 KB
  11. 2014-04-21 20:33:31,133 DEBUG org.apache.hadoop.hbase.regionserver.LogRoller: HLog roll requested
  12. 2014-04-21 20:33:31,133 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  13. 2014-04-21 20:33:31,133 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  14. 2014-04-21 20:33:31,145 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: using new createWriter -- HADOOP-6840
  15. 2014-04-21 20:33:31,145 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: Path=hdfs://192.168.10.48:8020/hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083611133, syncFs=true, hflush=false, compression=false
  16. 2014-04-21 20:33:31,145 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLog: cleanupCurrentWriter waiting for transactions to get synced total 29479976 synced till here 29479938
  17. 2014-04-21 20:33:31,168 INFO org.apache.hadoop.hbase.regionserver.wal.HLog: Roll /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083562116, entries=235783, filesize=63762282. for /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083611133
  18. 2014-04-21 20:33:34,983 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Received request to open region: vc2.host_stat,99999wgaxlm.cn.roowei.com,1398083567973.2e358369f483855b58c8ea1355fbe750.
  19. 2014-04-21 20:33:34,992 DEBUG org.apache.hadoop.hbase.zookeeper.ZKAssign: regionserver:60020-0x1441eaf4e9d0002-0x1441eaf4e9d0002-0x1441eaf4e9d0002 Attempting to transition node 2e358369f483855b58c8ea1355fbe750 from M_ZK_REGION_OFFLINE to RS_ZK_REGION_OPENING
  20. 2014-04-21 20:33:34,997 DEBUG org.apache.hadoop.hbase.zookeeper.ZKAssign: regionserver:60020-0x1441eaf4e9d0002-0x1441eaf4e9d0002-0x1441eaf4e9d0002 Successfully transitioned node 2e358369f483855b58c8ea1355fbe750 from M_ZK_REGION_OFFLINE to RS_ZK_REGION_OPENING
  21. 2014-04-21 20:33:34,997 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Opening region: {NAME => 'vc2.host_stat,99999wgaxlm.cn.roowei.com,1398083567973.2e358369f483855b58c8ea1355fbe750.', STARTKEY => '99999wgaxlm.cn.roowei.com', ENDKEY => '', ENCODED => 2e358369f483855b58c8ea1355fbe750,}
  22. 2014-04-21 20:33:34,997 INFO org.apache.hadoop.hbase.regionserver.HRegion: Setting up tabledescriptor config now ...
  23. 2014-04-21 20:33:34,997 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Instantiated vc2.host_stat,99999wgaxlm.cn.roowei.com,1398083567973.2e358369f483855b58c8ea1355fbe750.
  24. 2014-04-21 20:33:35,004 INFO org.apache.hadoop.hbase.regionserver.Store: time to purge deletes set to 0ms in store cf
  25. 2014-04-21 20:33:35,004 INFO org.apache.hadoop.hbase.regionserver.Store: hbase.hstore.compaction.min = 3
  26. 2014-04-21 20:33:35,042 DEBUG org.apache.hadoop.hbase.regionserver.Store: loaded hdfs://192.168.10.48:8020/hbase/vc2.host_stat/2e358369f483855b58c8ea1355fbe750/cf/6bf7f4c6a1734f62abcaec8094ae05ab, isReference=false, isBulkLoadResult=false, seqid=20553172, majorCompaction=true
  27. 2014-04-21 20:33:35,065 DEBUG org.apache.hadoop.hbase.regionserver.Store: loaded hdfs://192.168.10.48:8020/hbase/vc2.host_stat/2e358369f483855b58c8ea1355fbe750/cf/86ec627964b54085b6ee04a67652dc36, isReference=false, isBulkLoadResult=false, seqid=20632686, majorCompaction=false
  28. 2014-04-21 20:33:35,070 INFO org.apache.hadoop.hbase.regionserver.HRegion: Onlined vc2.host_stat,99999wgaxlm.cn.roowei.com,1398083567973.2e358369f483855b58c8ea1355fbe750.; next sequenceid=20632687
  29. 2014-04-21 20:33:35,070 DEBUG org.apache.hadoop.hbase.zookeeper.ZKAssign: regionserver:60020-0x1441eaf4e9d0002-0x1441eaf4e9d0002-0x1441eaf4e9d0002 Attempting to transition node 2e358369f483855b58c8ea1355fbe750 from RS_ZK_REGION_OPENING to RS_ZK_REGION_OPENING
  30. 2014-04-21 20:33:35,072 DEBUG org.apache.hadoop.hbase.zookeeper.ZKAssign: regionserver:60020-0x1441eaf4e9d0002-0x1441eaf4e9d0002-0x1441eaf4e9d0002 Successfully transitioned node 2e358369f483855b58c8ea1355fbe750 from RS_ZK_REGION_OPENING to RS_ZK_REGION_OPENING
  31. 2014-04-21 20:33:35,073 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Post open deploy tasks for region=vc2.host_stat,99999wgaxlm.cn.roowei.com,1398083567973.2e358369f483855b58c8ea1355fbe750., daughter=false
  32. 2014-04-21 20:33:35,106 INFO org.apache.hadoop.hbase.catalog.MetaEditor: Updated row vc2.host_stat,99999wgaxlm.cn.roowei.com,1398083567973.2e358369f483855b58c8ea1355fbe750. with server=app-hbase-1,60020,1392084194869
  33. 2014-04-21 20:33:35,107 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Done with post open deploy task for region=vc2.host_stat,99999wgaxlm.cn.roowei.com,1398083567973.2e358369f483855b58c8ea1355fbe750., daughter=false
  34. 2014-04-21 20:33:35,107 DEBUG org.apache.hadoop.hbase.zookeeper.ZKAssign: regionserver:60020-0x1441eaf4e9d0002-0x1441eaf4e9d0002-0x1441eaf4e9d0002 Attempting to transition node 2e358369f483855b58c8ea1355fbe750 from RS_ZK_REGION_OPENING to RS_ZK_REGION_OPENED
  35. 2014-04-21 20:33:35,110 DEBUG org.apache.hadoop.hbase.zookeeper.ZKAssign: regionserver:60020-0x1441eaf4e9d0002-0x1441eaf4e9d0002-0x1441eaf4e9d0002 Successfully transitioned node 2e358369f483855b58c8ea1355fbe750 from RS_ZK_REGION_OPENING to RS_ZK_REGION_OPENED
  36. 2014-04-21 20:33:35,110 DEBUG org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler: region transitioned to opened in zookeeper: {NAME => 'vc2.host_stat,99999wgaxlm.cn.roowei.com,1398083567973.2e358369f483855b58c8ea1355fbe750.', STARTKEY => '99999wgaxlm.cn.roowei.com', ENDKEY => '', ENCODED => 2e358369f483855b58c8ea1355fbe750,}, server: app-hbase-1,60020,1392084194869
  37. 2014-04-21 20:33:35,110 DEBUG org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler: Opened vc2.host_stat,99999wgaxlm.cn.roowei.com,1398083567973.2e358369f483855b58c8ea1355fbe750. on server:app-hbase-1,60020,1392084194869
  38. 2014-04-21 20:33:45,219 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.24 MB of total=2.31 GB
  39. 2014-04-21 20:33:45,372 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.25 MB, total=2.04 GB, single=392.77 MB, multi=1.9 GB, memory=0.75 KB
  40. 2014-04-21 20:33:47,457 WARN org.apache.hadoop.ipc.HBaseServer: (responseTooSlow): {"processingtimems":15025,"call":"checkAndPut([B@50ddf64e, [B@1c55a45c, [B@3a99a28c, [B@3c7eb362, null, {\"totalColumns\":1,\"families\":{\"cf\":[{\"timestamp\":1398083627445,\"qualifier\":\"an\",\"vlen\":21}]},\"row\":\"\\\\xE4\\\\x11.\\\\x19 \\\\xB18\\\\xC7\\\\x16\\\\xE0y\\\\xFF\\\\x5COFsN\\\\x16\\\\xAC\\\\xFC\\\\x0E\\\\xB2\\\\xA9H\\\\xC3Q+\\\\xCF\\\\xF5Z\\\\xB9\\\\x12\"}), rpc version=1, client version=29, methodsFingerPrint=-1368823753","client":"192.168.11.171:48657","starttimems":1398083612424,"queuetimems":0,"class":"HRegionServer","responsesize":0,"method":"checkAndPut"}
  41. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/da9fb7beadc74f13b08f667a22de6e56 for block blk_-8527841330504349016_201974:java.io.IOException: Connection reset by peer
  42. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/da9fb7beadc74f13b08f667a22de6e56 for block blk_-8527841330504349016_201974:java.io.IOException: Connection reset by peer
  43. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-493518709885844878_201977:java.io.IOException: Connection reset by peer
  44. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010, add to deadNodes and continuejava.io.IOException: Connection reset by peer
  45. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-5175547038218036928_201976:java.io.IOException: Connection reset by peer
  46. 2014-04-21 20:33:57,181 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/c0bc2c68b2a34de1baf5197c77b1e3ca for block blk_-2696728735676053699_202090:java.io.IOException: Connection reset by peer
  47. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/7c29412097b6465dbd0a6c3d0823c973 for block blk_5966484054166966561_201940:java.io.IOException: Connection reset by peer
  48. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-5175547038218036928_201976:java.io.IOException: Connection reset by peer
  49. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/cf/73cecad00bb043eeb47fcee057b6cafd for block blk_-1508532205064627970_201944:java.io.IOException: Connection reset by peer
  50. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/cf/73cecad00bb043eeb47fcee057b6cafd for block blk_712846744476001848_201944:java.io.IOException: Connection reset by peer
  51. 2014-04-21 20:33:57,182 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-493518709885844878_201977:java.io.IOException: Connection reset by peer
  52. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-5175547038218036928_201976:java.io.IOException: Connection reset by peer
  53. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/cf/73cecad00bb043eeb47fcee057b6cafd for block blk_-5268019645083989093_201945:java.io.IOException: Connection reset by peer
  54. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_3245402824360128594_201977:java.io.IOException: Connection reset by peer
  55. 2014-04-21 20:33:57,182 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/7c29412097b6465dbd0a6c3d0823c973 for block blk_4237358573555752184_201940:java.io.IOException: Connection reset by peer
  56. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010, add to deadNodes and continuejava.io.IOException: Connection reset by peer
  57. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/da9fb7beadc74f13b08f667a22de6e56 for block blk_4065523496808289587_201975:java.io.IOException: Connection reset by peer
  58. 2014-04-21 20:33:57,182 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-107935321851451513_201978:java.io.IOException: Connection reset by peer
  59. 2014-04-21 20:33:57,179 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/cf/73cecad00bb043eeb47fcee057b6cafd for block blk_712846744476001848_201944:java.io.IOException: Connection reset by peer
  60. 2014-04-21 20:33:57,181 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_9153335434868484732_201977:java.io.IOException: Connection reset by peer
  61. 2014-04-21 20:33:57,183 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_3245402824360128594_201977:java.io.IOException: Connection reset by peer
  62. 2014-04-21 20:33:57,188 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/1c7fde4e0ecc4b6cb5b8d75a1d337f57 for block blk_5559451839447260838_202033:java.net.ConnectException: Connection refused
  63. 2014-04-21 20:33:57,237 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-107935321851451513_201978:java.net.ConnectException: Connection refused
  64. 2014-04-21 20:33:57,237 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-493518709885844878_201977:java.net.ConnectException: Connection refused
  65. 2014-04-21 20:33:57,237 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_3245402824360128594_201977:java.net.ConnectException: Connection refused
  66. 2014-04-21 20:33:57,237 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_3245402824360128594_201977:java.net.ConnectException: Connection refused
  67. 2014-04-21 20:33:57,237 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-493518709885844878_201977:java.net.ConnectException: Connection refused
  68. 2014-04-21 20:33:57,237 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-7583437257645631863_201976:java.net.ConnectException: Connection refused
  69. 2014-04-21 20:33:57,237 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-107935321851451513_201978:java.net.ConnectException: Connection refused
  70. 2014-04-21 20:33:57,237 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_3245402824360128594_201977:java.net.ConnectException: Connection refused
  71. 2014-04-21 20:33:57,238 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_3245402824360128594_201977:java.net.ConnectException: Connection refused
  72. 2014-04-21 20:33:57,239 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/7c29412097b6465dbd0a6c3d0823c973 for block blk_4237358573555752184_201940:java.net.ConnectException: Connection refused
  73. 2014-04-21 20:33:57,241 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/1a86a0a7426049c48576e7a5ac5e3d26 for block blk_-7583437257645631863_201976:java.net.ConnectException: Connection refused
  74. 2014-04-21 20:33:57,251 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/da9fb7beadc74f13b08f667a22de6e56 for block blk_-8641742872607680877_201975:java.net.ConnectException: Connection refused
  75. 2014-04-21 20:33:57,252 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/da9fb7beadc74f13b08f667a22de6e56 for block blk_243732918380266339_201975:java.net.ConnectException: Connection refused
  76. 2014-04-21 20:33:59,290 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/da9fb7beadc74f13b08f667a22de6e56 for block blk_-4981931574498305895_201974:java.io.IOException: Connection reset by peer
  77. 2014-04-21 20:33:59,291 WARN org.apache.hadoop.hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block blk_-8192027677874709918_202165java.io.EOFException
  78. at java.io.DataInputStream.readFully(DataInputStream.java:197)
  79. at java.io.DataInputStream.readLong(DataInputStream.java:416)
  80. at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:124)
  81. at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(DFSClient.java:2967)
  82.  
  83. 2014-04-21 20:33:59,291 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_-8192027677874709918_202165 bad datanode[0] 192.168.10.45:50010
  84. 2014-04-21 20:33:59,291 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_-8192027677874709918_202165 in pipeline 192.168.10.45:50010, 192.168.10.50:50010, 192.168.10.49:50010: bad datanode 192.168.10.45:50010
  85. 2014-04-21 20:33:59,363 WARN org.apache.hadoop.hbase.regionserver.wal.HLog: HDFS pipeline error detected. Found 2 replicas but expecting no less than 3 replicas. Requesting close of hlog.
  86. 2014-04-21 20:33:59,363 WARN org.apache.hadoop.hbase.regionserver.wal.HLog: HDFS pipeline error detected. Found 2 replicas but expecting no less than 3 replicas. Requesting close of hlog.
  87. 2014-04-21 20:33:59,363 DEBUG org.apache.hadoop.hbase.regionserver.LogRoller: HLog roll requested
  88. 2014-04-21 20:33:59,363 WARN org.apache.hadoop.hbase.regionserver.wal.HLog: HDFS pipeline error detected. Found 2 replicas but expecting no less than 3 replicas. Requesting close of hlog.
  89. 2014-04-21 20:33:59,363 WARN org.apache.hadoop.hbase.regionserver.wal.HLog: HDFS pipeline error detected. Found 2 replicas but expecting no less than 3 replicas. Requesting close of hlog.
  90. 2014-04-21 20:33:59,363 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  91. 2014-04-21 20:33:59,364 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  92. 2014-04-21 20:33:59,365 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: using new createWriter -- HADOOP-6840
  93. 2014-04-21 20:33:59,365 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: Path=hdfs://192.168.10.48:8020/hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083639363, syncFs=true, hflush=false, compression=false
  94. 2014-04-21 20:33:59,365 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLog: cleanupCurrentWriter waiting for transactions to get synced total 29570797 synced till here 29570687
  95. 2014-04-21 20:33:59,384 INFO org.apache.hadoop.hbase.regionserver.wal.HLog: Roll /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083611133, entries=90821, filesize=24738597. for /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083639363
  96. 2014-04-21 20:33:59,392 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  97. 2014-04-21 20:33:59,392 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_-5175028256030644731_202175
  98. 2014-04-21 20:33:59,393 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  99. 2014-04-21 20:34:00,017 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.out_link/0ff263f54b25235c7d8a6fc20952540e/cf/5dcfe1e774f8485ea536164d4bd085ec for block blk_-5388257311454664462_201680:java.net.ConnectException: Connection refused
  100. 2014-04-21 20:34:00,129 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/cf/4cc6cd68f00242308a8f2d1d38d1bd80 for block blk_8924402542700605951_202124:java.net.ConnectException: Connection refused
  101. 2014-04-21 20:34:00,287 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.in_link/e3febfabc441ced238d12864b1dc8b98/cf/5e706dfd049f4831a5e38a9205e3bdad for block blk_-8287796395462447380_201603:java.net.ConnectException: Connection refused
  102. 2014-04-21 20:34:00,346 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.out_link/0ff263f54b25235c7d8a6fc20952540e/cf/e03da2e094ff46628596ae9766824f58 for block blk_6699617513659970116_202012:java.net.ConnectException: Connection refused
  103. 2014-04-21 20:34:00,402 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.out_link/0ff263f54b25235c7d8a6fc20952540e/cf/fa5fdc76920c4faaa80e1c903945257b for block blk_-5764442240928257544_202115:java.net.ConnectException: Connection refused
  104. 2014-04-21 20:34:00,503 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.out_link/cbebd7c0e62b58879caf08915a66a8be/cf/6023cf0466d142c3ad36eadfb041fd08 for block blk_5756763453034850966_202083:java.net.ConnectException: Connection refused
  105. 2014-04-21 20:34:00,554 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.out_link/cbebd7c0e62b58879caf08915a66a8be/cf/5adb685d0c2245b0bf0f4ce03fa1b9c7 for block blk_7292601947361833838_201678:java.net.ConnectException: Connection refused
  106. 2014-04-21 20:34:00,678 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.in_link/3ba99c07e796ee31f9e13b4f9728b7a7/cf/04ec7638fceb4e4c8129c6e278408b63 for block blk_6889886357204153413_201591:java.net.ConnectException: Connection refused
  107. 2014-04-21 20:34:00,840 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/3b678802fd004e7fbe7fb8c00a3cee0e for block blk_8260506063834175013_202011:java.net.ConnectException: Connection refused
  108. 2014-04-21 20:34:00,880 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.25 MB of total=2.31 GB
  109. 2014-04-21 20:34:00,926 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.28 MB, total=2.04 GB, single=359.7 MB, multi=1.94 GB, memory=0.75 KB
  110. 2014-04-21 20:34:00,978 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.in_link/e3febfabc441ced238d12864b1dc8b98/cf/9d77fad258cd446a9638531500ff2052 for block blk_-1193692318886338237_201972:java.net.ConnectException: Connection refused
  111. 2014-04-21 20:34:01,112 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.in_link/3ba99c07e796ee31f9e13b4f9728b7a7/cf/f6f2eddd560949289739983cdd0f1154 for block blk_5023399031920214940_201968:java.net.ConnectException: Connection refused
  112. 2014-04-21 20:34:01,112 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.host_spd/d18ef8782fdb452c7b8286287e967385/cf/5a41f379062645afb0eaba07dc97ea71 for block blk_-2366095068979481404_201989:java.net.ConnectException: Connection refused
  113. 2014-04-21 20:34:01,944 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010 for file /hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/a1c301429c434082a26940aa2aace319 for block blk_4013334348303759524_202014:java.net.ConnectException: Connection refused
  114. 2014-04-21 20:34:12,726 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.26 MB of total=2.31 GB
  115. 2014-04-21 20:34:12,770 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.28 MB, total=2.04 GB, single=324.75 MB, multi=1.97 GB, memory=0.75 KB
  116. 2014-04-21 20:34:22,352 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.21 MB of total=2.31 GB
  117. 2014-04-21 20:34:22,375 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=319.99 MB, multi=1.97 GB, memory=0.75 KB
  118. 2014-04-21 20:34:32,002 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Flush requested on vc2.out_link,=^3\xF7\xE1\x0F\xE1V\xA1K\x09\xC9\xB2\xA0\x98 t "\xC5up\x92XW\x8C\xE3y\xA3p\x17P,1398077158522.f609e3b786a28a99a2d0eb2c153c340a.
  119. 2014-04-21 20:34:32,002 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Started memstore flush for vc2.out_link,=^3\xF7\xE1\x0F\xE1V\xA1K\x09\xC9\xB2\xA0\x98 t "\xC5up\x92XW\x8C\xE3y\xA3p\x17P,1398077158522.f609e3b786a28a99a2d0eb2c153c340a., current region memstore size 128.0m
  120. 2014-04-21 20:34:32,025 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Finished snapshotting vc2.out_link,=^3\xF7\xE1\x0F\xE1V\xA1K\x09\xC9\xB2\xA0\x98 t "\xC5up\x92XW\x8C\xE3y\xA3p\x17P,1398077158522.f609e3b786a28a99a2d0eb2c153c340a., commencing wait for mvcc, flushsize=134232816
  121. 2014-04-21 20:34:32,025 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Finished snapshotting, commencing flushing stores
  122. 2014-04-21 20:34:32,479 DEBUG org.apache.hadoop.hbase.util.FSUtils: Creating file=hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/.tmp/5646191f7e034e178d6d5fb2dc790a95 with permission=rwxrwxrwx
  123. 2014-04-21 20:34:32,480 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.24 MB of total=2.31 GB
  124. 2014-04-21 20:34:32,490 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  125. 2014-04-21 20:34:32,490 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  126. 2014-04-21 20:34:32,505 DEBUG org.apache.hadoop.hbase.io.hfile.HFileWriterV2: Initialized with CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false]
  127. 2014-04-21 20:34:32,505 INFO org.apache.hadoop.hbase.regionserver.StoreFile: Delete Family Bloom filter type for hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/.tmp/5646191f7e034e178d6d5fb2dc790a95: CompoundBloomFilterWriter
  128. 2014-04-21 20:34:32,522 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  129. 2014-04-21 20:34:32,522 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_5036929755589957887_202176
  130. 2014-04-21 20:34:32,527 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  131. 2014-04-21 20:34:32,611 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=293.97 MB, multi=2 GB, memory=0.75 KB
  132. 2014-04-21 20:34:33,930 INFO org.apache.hadoop.hbase.regionserver.StoreFile: NO General Bloom and NO DeleteFamily was added to HFile (hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/.tmp/5646191f7e034e178d6d5fb2dc790a95)
  133. 2014-04-21 20:34:33,930 INFO org.apache.hadoop.hbase.regionserver.Store: Flushed , sequenceid=31319458, memsize=128.0m, into tmp file hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/.tmp/5646191f7e034e178d6d5fb2dc790a95
  134. 2014-04-21 20:34:33,963 DEBUG org.apache.hadoop.hbase.regionserver.Store: Renaming flushed file at hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/.tmp/5646191f7e034e178d6d5fb2dc790a95 to hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/5646191f7e034e178d6d5fb2dc790a95
  135. 2014-04-21 20:34:33,985 INFO org.apache.hadoop.hbase.regionserver.Store: Added hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/5646191f7e034e178d6d5fb2dc790a95, entries=659801, sequenceid=31319458, filesize=40.7m
  136. 2014-04-21 20:34:34,003 INFO org.apache.hadoop.hbase.regionserver.HRegion: Finished memstore flush of ~128.0m/134232816, currentsize=806.3k/825664 for region vc2.out_link,=^3\xF7\xE1\x0F\xE1V\xA1K\x09\xC9\xB2\xA0\x98 t "\xC5up\x92XW\x8C\xE3y\xA3p\x17P,1398077158522.f609e3b786a28a99a2d0eb2c153c340a. in 2001ms, sequenceid=31319458, compaction requested=true
  137. 2014-04-21 20:34:34,004 DEBUG org.apache.hadoop.hbase.regionserver.Store: f609e3b786a28a99a2d0eb2c153c340a - cf: Initiating minorcompaction
  138. 2014-04-21 20:34:34,004 INFO org.apache.hadoop.hbase.regionserver.HRegion: Starting compaction on cf in region vc2.out_link,=^3\xF7\xE1\x0F\xE1V\xA1K\x09\xC9\xB2\xA0\x98 t "\xC5up\x92XW\x8C\xE3y\xA3p\x17P,1398077158522.f609e3b786a28a99a2d0eb2c153c340a.
  139. 2014-04-21 20:34:34,005 INFO org.apache.hadoop.hbase.regionserver.Store: Starting compaction of 3 file(s) in cf of vc2.out_link,=^3\xF7\xE1\x0F\xE1V\xA1K\x09\xC9\xB2\xA0\x98 t "\xC5up\x92XW\x8C\xE3y\xA3p\x17P,1398077158522.f609e3b786a28a99a2d0eb2c153c340a. into tmpdir=hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/.tmp, seqid=31319458, totalSize=205.5m
  140. 2014-04-21 20:34:34,005 DEBUG org.apache.hadoop.hbase.regionserver.Compactor: Compacting hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/1c7fde4e0ecc4b6cb5b8d75a1d337f57, keycount=1969130, bloomtype=NONE, size=123.7m, encoding=NONE
  141. 2014-04-21 20:34:34,005 DEBUG org.apache.hadoop.hbase.regionserver.Compactor: Compacting hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/c0bc2c68b2a34de1baf5197c77b1e3ca, keycount=657177, bloomtype=NONE, size=41.1m, encoding=NONE
  142. 2014-04-21 20:34:34,005 DEBUG org.apache.hadoop.hbase.regionserver.Compactor: Compacting hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/5646191f7e034e178d6d5fb2dc790a95, keycount=659801, bloomtype=NONE, size=40.7m, encoding=NONE
  143. 2014-04-21 20:34:34,005 DEBUG org.apache.hadoop.hbase.regionserver.CompactSplitThread: Small Compaction requested: regionName=vc2.out_link,=^3\xF7\xE1\x0F\xE1V\xA1K\x09\xC9\xB2\xA0\x98 t "\xC5up\x92XW\x8C\xE3y\xA3p\x17P,1398077158522.f609e3b786a28a99a2d0eb2c153c340a., storeName=cf, fileCount=3, fileSize=205.5m (123.7m, 41.1m, 40.7m), priority=3, time=6071741458034492; Because: regionserver60020.cacheFlusher; compaction_queue=(0:0), split_queue=0
  144. 2014-04-21 20:34:34,026 DEBUG org.apache.hadoop.hbase.util.FSUtils: Creating file=hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/.tmp/bdcde88198fb44769314a3b11e7f9ad5 with permission=rwxrwxrwx
  145. 2014-04-21 20:34:34,027 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  146. 2014-04-21 20:34:34,027 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  147. 2014-04-21 20:34:34,034 DEBUG org.apache.hadoop.hbase.io.hfile.HFileWriterV2: Initialized with CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false]
  148. 2014-04-21 20:34:34,034 INFO org.apache.hadoop.hbase.regionserver.StoreFile: Delete Family Bloom filter type for hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/.tmp/bdcde88198fb44769314a3b11e7f9ad5: CompoundBloomFilterWriter
  149. 2014-04-21 20:34:34,042 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  150. 2014-04-21 20:34:34,043 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_6058106771799426655_202177
  151. 2014-04-21 20:34:34,044 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  152. 2014-04-21 20:34:34,661 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Scanner -1091278424590325457 lease expired on region vc2.url_db,\x1F\xF8Ti?\x95\xF4\xFA\x11\x7Fm\x97\xD9\xC6\xFF\x88,1398079780746.c156f751154ba579f17f9fbb20aa88e5.
  153. 2014-04-21 20:34:34,681 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Scanner -1046818754624871045 lease expired on region vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8.
  154. 2014-04-21 20:34:37,261 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Scanner 3487677922408117083 lease expired on region vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060.
  155. 2014-04-21 20:34:44,970 INFO org.apache.hadoop.hbase.regionserver.StoreFile: NO General Bloom and NO DeleteFamily was added to HFile (hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/.tmp/bdcde88198fb44769314a3b11e7f9ad5)
  156. 2014-04-21 20:34:45,006 INFO org.apache.hadoop.hbase.regionserver.Store: Renaming compacted file at hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/.tmp/bdcde88198fb44769314a3b11e7f9ad5 to hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/bdcde88198fb44769314a3b11e7f9ad5
  157. 2014-04-21 20:34:45,028 DEBUG org.apache.hadoop.hbase.regionserver.Store: Removing store files after compaction...
  158. 2014-04-21 20:34:45,030 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving compacted store files.
  159. 2014-04-21 20:34:45,030 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Starting to archive files:[class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/1c7fde4e0ecc4b6cb5b8d75a1d337f57, class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/c0bc2c68b2a34de1baf5197c77b1e3ca, class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/5646191f7e034e178d6d5fb2dc790a95]
  160. 2014-04-21 20:34:45,030 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: moving files to the archive directory: hdfs://192.168.10.48:8020/hbase/.archive/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf
  161. 2014-04-21 20:34:45,032 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving:class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/1c7fde4e0ecc4b6cb5b8d75a1d337f57
  162. 2014-04-21 20:34:45,033 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: No existing file in archive for:hdfs://192.168.10.48:8020/hbase/.archive/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/1c7fde4e0ecc4b6cb5b8d75a1d337f57, free to archive original file.
  163. 2014-04-21 20:34:45,060 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Finished archiving file from: class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/1c7fde4e0ecc4b6cb5b8d75a1d337f57, to: hdfs://192.168.10.48:8020/hbase/.archive/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/1c7fde4e0ecc4b6cb5b8d75a1d337f57
  164. 2014-04-21 20:34:45,060 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving:class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/c0bc2c68b2a34de1baf5197c77b1e3ca
  165. 2014-04-21 20:34:45,061 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: No existing file in archive for:hdfs://192.168.10.48:8020/hbase/.archive/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/c0bc2c68b2a34de1baf5197c77b1e3ca, free to archive original file.
  166. 2014-04-21 20:34:45,119 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Finished archiving file from: class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/c0bc2c68b2a34de1baf5197c77b1e3ca, to: hdfs://192.168.10.48:8020/hbase/.archive/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/c0bc2c68b2a34de1baf5197c77b1e3ca
  167. 2014-04-21 20:34:45,120 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving:class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/5646191f7e034e178d6d5fb2dc790a95
  168. 2014-04-21 20:34:45,121 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: No existing file in archive for:hdfs://192.168.10.48:8020/hbase/.archive/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/5646191f7e034e178d6d5fb2dc790a95, free to archive original file.
  169. 2014-04-21 20:34:45,136 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Finished archiving file from: class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/5646191f7e034e178d6d5fb2dc790a95, to: hdfs://192.168.10.48:8020/hbase/.archive/vc2.out_link/f609e3b786a28a99a2d0eb2c153c340a/cf/5646191f7e034e178d6d5fb2dc790a95
  170. 2014-04-21 20:34:45,136 INFO org.apache.hadoop.hbase.regionserver.Store: Completed compaction of 3 file(s) in cf of vc2.out_link,=^3\xF7\xE1\x0F\xE1V\xA1K\x09\xC9\xB2\xA0\x98 t "\xC5up\x92XW\x8C\xE3y\xA3p\x17P,1398077158522.f609e3b786a28a99a2d0eb2c153c340a. into bdcde88198fb44769314a3b11e7f9ad5, size=202.2m; total size for store is 970.8m
  171. 2014-04-21 20:34:45,136 INFO org.apache.hadoop.hbase.regionserver.compactions.CompactionRequest: completed compaction: regionName=vc2.out_link,=^3\xF7\xE1\x0F\xE1V\xA1K\x09\xC9\xB2\xA0\x98 t "\xC5up\x92XW\x8C\xE3y\xA3p\x17P,1398077158522.f609e3b786a28a99a2d0eb2c153c340a., storeName=cf, fileCount=3, fileSize=205.5m, priority=3, time=6071741458034492; duration=11sec
  172. 2014-04-21 20:34:45,137 DEBUG org.apache.hadoop.hbase.regionserver.compactions.CompactionRequest: CompactSplitThread status: compaction_queue=(0:0), split_queue=0
  173. 2014-04-21 20:34:46,596 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.26 MB of total=2.31 GB
  174. 2014-04-21 20:34:46,619 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=261.62 MB, multi=2.03 GB, memory=0.75 KB
  175. 2014-04-21 20:34:49,853 DEBUG org.apache.hadoop.hbase.regionserver.LogRoller: HLog roll requested
  176. 2014-04-21 20:34:49,853 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  177. 2014-04-21 20:34:49,853 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  178. 2014-04-21 20:34:49,859 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: using new createWriter -- HADOOP-6840
  179. 2014-04-21 20:34:49,860 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: Path=hdfs://192.168.10.48:8020/hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083689853, syncFs=true, hflush=false, compression=false
  180. 2014-04-21 20:34:49,860 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLog: cleanupCurrentWriter waiting for transactions to get synced total 29804704 synced till here 29804616
  181. 2014-04-21 20:34:49,879 INFO org.apache.hadoop.hbase.regionserver.wal.HLog: Roll /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083639363, entries=233907, filesize=63773457. for /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083689853
  182. 2014-04-21 20:34:49,879 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLog: Log roll failed and will be retried. (This is not an error)
  183. 2014-04-21 20:34:49,889 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  184. 2014-04-21 20:34:49,889 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_4132953154887902164_202178
  185. 2014-04-21 20:34:49,894 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  186. 2014-04-21 20:34:57,693 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.26 MB of total=2.31 GB
  187. 2014-04-21 20:34:57,750 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=261.65 MB, multi=2.03 GB, memory=0.75 KB
  188. 2014-04-21 20:35:00,953 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server listener on 60020: readAndProcess threw exception java.io.IOException: Connection reset by peer. Count of bytes read: 0
  189. java.io.IOException: Connection reset by peer
  190. at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
  191. at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
  192. at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:218)
  193. at sun.nio.ch.IOUtil.read(IOUtil.java:191)
  194. at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:359)
  195. at org.apache.hadoop.hbase.ipc.HBaseServer.channelRead(HBaseServer.java:1796)
  196. at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1179)
  197. at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:748)
  198. at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:539)
  199. at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:514)
  200. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
  201. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
  202. at java.lang.Thread.run(Thread.java:722)
  203. 2014-04-21 20:35:06,996 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server listener on 60020: readAndProcess threw exception java.io.IOException: Connection reset by peer. Count of bytes read: 0
  204. java.io.IOException: Connection reset by peer
  205. at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
  206. at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
  207. at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:218)
  208. at sun.nio.ch.IOUtil.read(IOUtil.java:191)
  209. at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:359)
  210. at org.apache.hadoop.hbase.ipc.HBaseServer.channelRead(HBaseServer.java:1796)
  211. at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1179)
  212. at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:748)
  213. at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:539)
  214. at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:514)
  215. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
  216. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
  217. at java.lang.Thread.run(Thread.java:722)
  218. 2014-04-21 20:35:08,273 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.26 MB of total=2.31 GB
  219. 2014-04-21 20:35:08,307 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.31 MB, total=2.04 GB, single=298.71 MB, multi=2 GB, memory=0.75 KB
  220. 2014-04-21 20:35:11,685 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server Responder, call next(-2550130886002892068, 5000), rpc version=1, client version=29, methodsFingerPrint=-1368823753 from 192.168.10.45:50204: output error
  221. 2014-04-21 20:35:11,687 INFO org.apache.hadoop.ipc.HBaseServer: IPC Server Responder: doAsyncWrite threw exception java.nio.channels.AsynchronousCloseException
  222. 2014-04-21 20:35:13,689 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Flush requested on vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8.
  223. 2014-04-21 20:35:13,689 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Started memstore flush for vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8., current region memstore size 128.0m
  224. 2014-04-21 20:35:13,695 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Finished snapshotting vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8., commencing wait for mvcc, flushsize=134226616
  225. 2014-04-21 20:35:13,695 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Finished snapshotting, commencing flushing stores
  226. 2014-04-21 20:35:13,864 DEBUG org.apache.hadoop.hbase.util.FSUtils: Creating file=hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/.tmp/a142d605347f4438904e604806dfcd2a with permission=rwxrwxrwx
  227. 2014-04-21 20:35:13,865 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  228. 2014-04-21 20:35:13,865 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  229. 2014-04-21 20:35:13,867 DEBUG org.apache.hadoop.hbase.io.hfile.HFileWriterV2: Initialized with CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false]
  230. 2014-04-21 20:35:13,868 INFO org.apache.hadoop.hbase.regionserver.StoreFile: Delete Family Bloom filter type for hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/.tmp/a142d605347f4438904e604806dfcd2a: CompoundBloomFilterWriter
  231. 2014-04-21 20:35:13,876 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  232. 2014-04-21 20:35:13,876 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_-6794611716757933506_202182
  233. 2014-04-21 20:35:13,877 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  234. 2014-04-21 20:35:14,893 INFO org.apache.hadoop.hbase.regionserver.StoreFile: NO General Bloom and NO DeleteFamily was added to HFile (hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/.tmp/a142d605347f4438904e604806dfcd2a)
  235. 2014-04-21 20:35:14,894 INFO org.apache.hadoop.hbase.regionserver.Store: Flushed , sequenceid=31516135, memsize=128.0m, into tmp file hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/.tmp/a142d605347f4438904e604806dfcd2a
  236. 2014-04-21 20:35:14,913 DEBUG org.apache.hadoop.hbase.regionserver.Store: Renaming flushed file at hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/.tmp/a142d605347f4438904e604806dfcd2a to hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/a142d605347f4438904e604806dfcd2a
  237. 2014-04-21 20:35:14,934 INFO org.apache.hadoop.hbase.regionserver.Store: Added hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/a142d605347f4438904e604806dfcd2a, entries=704318, sequenceid=31516135, filesize=35.4m
  238. 2014-04-21 20:35:14,964 INFO org.apache.hadoop.hbase.regionserver.HRegion: Finished memstore flush of ~128.0m/134226616, currentsize=693.8k/710464 for region vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8. in 1275ms, sequenceid=31516135, compaction requested=true
  239. 2014-04-21 20:35:14,964 DEBUG org.apache.hadoop.hbase.regionserver.Store: f78b68b39dbcf1cb0cb739e4e245abc8 - cf: Initiating minorcompaction
  240. 2014-04-21 20:35:14,964 INFO org.apache.hadoop.hbase.regionserver.HRegion: Starting compaction on cf in region vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8.
  241. 2014-04-21 20:35:14,964 INFO org.apache.hadoop.hbase.regionserver.Store: Starting compaction of 3 file(s) in cf of vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8. into tmpdir=hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/.tmp, seqid=31516135, totalSize=106.0m
  242. 2014-04-21 20:35:14,965 DEBUG org.apache.hadoop.hbase.regionserver.Compactor: Compacting hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/3b678802fd004e7fbe7fb8c00a3cee0e, keycount=705231, bloomtype=NONE, size=35.2m, encoding=NONE
  243. 2014-04-21 20:35:14,965 DEBUG org.apache.hadoop.hbase.regionserver.Compactor: Compacting hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/2ad0f7b5e549430fb1efef0267df3b85, keycount=703865, bloomtype=NONE, size=35.3m, encoding=NONE
  244. 2014-04-21 20:35:14,965 DEBUG org.apache.hadoop.hbase.regionserver.Compactor: Compacting hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/a142d605347f4438904e604806dfcd2a, keycount=704318, bloomtype=NONE, size=35.4m, encoding=NONE
  245. 2014-04-21 20:35:14,965 DEBUG org.apache.hadoop.hbase.regionserver.CompactSplitThread: Small Compaction requested: regionName=vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8., storeName=cf, fileCount=3, fileSize=106.0m (35.2m, 35.3m, 35.4m), priority=3, time=6071782418024169; Because: regionserver60020.cacheFlusher; compaction_queue=(0:0), split_queue=0
  246. 2014-04-21 20:35:14,977 DEBUG org.apache.hadoop.hbase.util.FSUtils: Creating file=hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/.tmp/3f6019addd3342d590e3cd1cc6717b46 with permission=rwxrwxrwx
  247. 2014-04-21 20:35:14,978 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  248. 2014-04-21 20:35:14,978 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  249. 2014-04-21 20:35:14,980 DEBUG org.apache.hadoop.hbase.io.hfile.HFileWriterV2: Initialized with CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false]
  250. 2014-04-21 20:35:14,980 INFO org.apache.hadoop.hbase.regionserver.StoreFile: Delete Family Bloom filter type for hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/.tmp/3f6019addd3342d590e3cd1cc6717b46: CompoundBloomFilterWriter
  251. 2014-04-21 20:35:14,985 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  252. 2014-04-21 20:35:14,985 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_4389195637310675763_202183
  253. 2014-04-21 20:35:14,986 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  254. 2014-04-21 20:35:16,586 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.26 MB of total=2.31 GB
  255. 2014-04-21 20:35:16,659 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.28 MB, total=2.04 GB, single=366.31 MB, multi=1.93 GB, memory=0.75 KB
  256. 2014-04-21 20:35:18,437 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Scanner 2784613815677670853 lease expired on region vc2.url_db,\x1F\xF8Ti?\x95\xF4\xFA\x11\x7Fm\x97\xD9\xC6\xFF\x88,1398079780746.c156f751154ba579f17f9fbb20aa88e5.
  257. 2014-04-21 20:35:21,362 INFO org.apache.hadoop.hbase.regionserver.StoreFile: NO General Bloom and NO DeleteFamily was added to HFile (hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/.tmp/3f6019addd3342d590e3cd1cc6717b46)
  258. 2014-04-21 20:35:21,374 INFO org.apache.hadoop.hbase.regionserver.Store: Renaming compacted file at hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/.tmp/3f6019addd3342d590e3cd1cc6717b46 to hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/3f6019addd3342d590e3cd1cc6717b46
  259. 2014-04-21 20:35:21,398 DEBUG org.apache.hadoop.hbase.regionserver.Store: Removing store files after compaction...
  260. 2014-04-21 20:35:21,400 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving compacted store files.
  261. 2014-04-21 20:35:21,400 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Starting to archive files:[class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/3b678802fd004e7fbe7fb8c00a3cee0e, class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/2ad0f7b5e549430fb1efef0267df3b85, class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/a142d605347f4438904e604806dfcd2a]
  262. 2014-04-21 20:35:21,400 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: moving files to the archive directory: hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf
  263. 2014-04-21 20:35:21,401 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving:class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/3b678802fd004e7fbe7fb8c00a3cee0e
  264. 2014-04-21 20:35:21,401 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: No existing file in archive for:hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/3b678802fd004e7fbe7fb8c00a3cee0e, free to archive original file.
  265. 2014-04-21 20:35:21,416 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Finished archiving file from: class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/3b678802fd004e7fbe7fb8c00a3cee0e, to: hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/3b678802fd004e7fbe7fb8c00a3cee0e
  266. 2014-04-21 20:35:21,417 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving:class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/2ad0f7b5e549430fb1efef0267df3b85
  267. 2014-04-21 20:35:21,419 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: No existing file in archive for:hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/2ad0f7b5e549430fb1efef0267df3b85, free to archive original file.
  268. 2014-04-21 20:35:21,455 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Finished archiving file from: class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/2ad0f7b5e549430fb1efef0267df3b85, to: hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/2ad0f7b5e549430fb1efef0267df3b85
  269. 2014-04-21 20:35:21,455 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving:class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/a142d605347f4438904e604806dfcd2a
  270. 2014-04-21 20:35:21,456 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: No existing file in archive for:hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/a142d605347f4438904e604806dfcd2a, free to archive original file.
  271. 2014-04-21 20:35:21,475 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Finished archiving file from: class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/a142d605347f4438904e604806dfcd2a, to: hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/f78b68b39dbcf1cb0cb739e4e245abc8/cf/a142d605347f4438904e604806dfcd2a
  272. 2014-04-21 20:35:21,475 INFO org.apache.hadoop.hbase.regionserver.Store: Completed compaction of 3 file(s) in cf of vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8. into 3f6019addd3342d590e3cd1cc6717b46, size=105.9m; total size for store is 626.0m
  273. 2014-04-21 20:35:21,475 INFO org.apache.hadoop.hbase.regionserver.compactions.CompactionRequest: completed compaction: regionName=vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8., storeName=cf, fileCount=3, fileSize=106.0m, priority=3, time=6071782418024169; duration=6sec
  274. 2014-04-21 20:35:21,475 DEBUG org.apache.hadoop.hbase.regionserver.compactions.CompactionRequest: CompactSplitThread status: compaction_queue=(0:0), split_queue=0
  275. 2014-04-21 20:35:23,630 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Flush requested on vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060.
  276. 2014-04-21 20:35:23,631 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Started memstore flush for vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060., current region memstore size 128.0m
  277. 2014-04-21 20:35:23,651 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Finished snapshotting vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060., commencing wait for mvcc, flushsize=134234792
  278. 2014-04-21 20:35:23,651 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Finished snapshotting, commencing flushing stores
  279. 2014-04-21 20:35:23,799 DEBUG org.apache.hadoop.hbase.util.FSUtils: Creating file=hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/.tmp/46ab9b9a2a714065811d53b711f944b6 with permission=rwxrwxrwx
  280. 2014-04-21 20:35:23,801 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  281. 2014-04-21 20:35:23,801 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  282. 2014-04-21 20:35:23,803 DEBUG org.apache.hadoop.hbase.io.hfile.HFileWriterV2: Initialized with CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false]
  283. 2014-04-21 20:35:23,804 INFO org.apache.hadoop.hbase.regionserver.StoreFile: Delete Family Bloom filter type for hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/.tmp/46ab9b9a2a714065811d53b711f944b6: CompoundBloomFilterWriter
  284. 2014-04-21 20:35:23,817 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  285. 2014-04-21 20:35:23,817 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_1175398292422239128_202189
  286. 2014-04-21 20:35:23,818 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  287. 2014-04-21 20:35:24,755 INFO org.apache.hadoop.hbase.regionserver.StoreFile: NO General Bloom and NO DeleteFamily was added to HFile (hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/.tmp/46ab9b9a2a714065811d53b711f944b6)
  288. 2014-04-21 20:35:24,756 INFO org.apache.hadoop.hbase.regionserver.Store: Flushed , sequenceid=31566401, memsize=128.0m, into tmp file hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/.tmp/46ab9b9a2a714065811d53b711f944b6
  289. 2014-04-21 20:35:24,769 DEBUG org.apache.hadoop.hbase.regionserver.Store: Renaming flushed file at hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/.tmp/46ab9b9a2a714065811d53b711f944b6 to hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/46ab9b9a2a714065811d53b711f944b6
  290. 2014-04-21 20:35:24,787 INFO org.apache.hadoop.hbase.regionserver.Store: Added hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/46ab9b9a2a714065811d53b711f944b6, entries=704372, sequenceid=31566401, filesize=35.3m
  291. 2014-04-21 20:35:24,812 INFO org.apache.hadoop.hbase.regionserver.HRegion: Finished memstore flush of ~128.0m/134234792, currentsize=530.4k/543176 for region vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060. in 1180ms, sequenceid=31566401, compaction requested=true
  292. 2014-04-21 20:35:24,815 DEBUG org.apache.hadoop.hbase.regionserver.Store: 591173c70c8d236b40c9325aaedd0060 - cf: Initiating minorcompaction
  293. 2014-04-21 20:35:24,815 DEBUG org.apache.hadoop.hbase.regionserver.CompactSplitThread: Small Compaction requested: regionName=vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060., storeName=cf, fileCount=3, fileSize=105.9m (35.2m, 35.3m, 35.3m), priority=3, time=6071792268986091; Because: regionserver60020.cacheFlusher; compaction_queue=(0:0), split_queue=0
  294. 2014-04-21 20:35:24,815 INFO org.apache.hadoop.hbase.regionserver.HRegion: Starting compaction on cf in region vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060.
  295. 2014-04-21 20:35:24,816 INFO org.apache.hadoop.hbase.regionserver.Store: Starting compaction of 3 file(s) in cf of vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060. into tmpdir=hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/.tmp, seqid=31566401, totalSize=105.9m
  296. 2014-04-21 20:35:24,816 DEBUG org.apache.hadoop.hbase.regionserver.Compactor: Compacting hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/a1c301429c434082a26940aa2aace319, keycount=704323, bloomtype=NONE, size=35.2m, encoding=NONE
  297. 2014-04-21 20:35:24,816 DEBUG org.apache.hadoop.hbase.regionserver.Compactor: Compacting hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/f6c28df82b4c45df857fdf75c62db568, keycount=703834, bloomtype=NONE, size=35.3m, encoding=NONE
  298. 2014-04-21 20:35:24,816 DEBUG org.apache.hadoop.hbase.regionserver.Compactor: Compacting hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/46ab9b9a2a714065811d53b711f944b6, keycount=704372, bloomtype=NONE, size=35.3m, encoding=NONE
  299. 2014-04-21 20:35:24,826 DEBUG org.apache.hadoop.hbase.util.FSUtils: Creating file=hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/.tmp/27512b6235064a749061fc225b36cad0 with permission=rwxrwxrwx
  300. 2014-04-21 20:35:24,827 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  301. 2014-04-21 20:35:24,827 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  302. 2014-04-21 20:35:24,829 DEBUG org.apache.hadoop.hbase.io.hfile.HFileWriterV2: Initialized with CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false]
  303. 2014-04-21 20:35:24,829 INFO org.apache.hadoop.hbase.regionserver.StoreFile: Delete Family Bloom filter type for hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/.tmp/27512b6235064a749061fc225b36cad0: CompoundBloomFilterWriter
  304. 2014-04-21 20:35:24,833 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  305. 2014-04-21 20:35:24,833 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_-4752808618326281576_202190
  306. 2014-04-21 20:35:24,834 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  307. 2014-04-21 20:35:26,051 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.26 MB of total=2.31 GB
  308. 2014-04-21 20:35:26,092 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=433.39 MB, multi=1.86 GB, memory=0.75 KB
  309. 2014-04-21 20:35:29,402 INFO org.apache.hadoop.hbase.regionserver.StoreFile: NO General Bloom and NO DeleteFamily was added to HFile (hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/.tmp/27512b6235064a749061fc225b36cad0)
  310. 2014-04-21 20:35:29,428 INFO org.apache.hadoop.hbase.regionserver.Store: Renaming compacted file at hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/.tmp/27512b6235064a749061fc225b36cad0 to hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/27512b6235064a749061fc225b36cad0
  311. 2014-04-21 20:35:29,451 DEBUG org.apache.hadoop.hbase.regionserver.Store: Removing store files after compaction...
  312. 2014-04-21 20:35:29,454 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving compacted store files.
  313. 2014-04-21 20:35:29,455 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Starting to archive files:[class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/a1c301429c434082a26940aa2aace319, class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/f6c28df82b4c45df857fdf75c62db568, class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/46ab9b9a2a714065811d53b711f944b6]
  314. 2014-04-21 20:35:29,455 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: moving files to the archive directory: hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf
  315. 2014-04-21 20:35:29,456 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving:class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/a1c301429c434082a26940aa2aace319
  316. 2014-04-21 20:35:29,458 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: No existing file in archive for:hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/a1c301429c434082a26940aa2aace319, free to archive original file.
  317. 2014-04-21 20:35:29,468 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Finished archiving file from: class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/a1c301429c434082a26940aa2aace319, to: hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/a1c301429c434082a26940aa2aace319
  318. 2014-04-21 20:35:29,468 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving:class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/f6c28df82b4c45df857fdf75c62db568
  319. 2014-04-21 20:35:29,469 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: No existing file in archive for:hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/f6c28df82b4c45df857fdf75c62db568, free to archive original file.
  320. 2014-04-21 20:35:29,483 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Finished archiving file from: class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/f6c28df82b4c45df857fdf75c62db568, to: hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/f6c28df82b4c45df857fdf75c62db568
  321. 2014-04-21 20:35:29,483 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Archiving:class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/46ab9b9a2a714065811d53b711f944b6
  322. 2014-04-21 20:35:29,484 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: No existing file in archive for:hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/46ab9b9a2a714065811d53b711f944b6, free to archive original file.
  323. 2014-04-21 20:35:29,500 DEBUG org.apache.hadoop.hbase.backup.HFileArchiver: Finished archiving file from: class org.apache.hadoop.hbase.backup.HFileArchiver$FileableStoreFile, file:hdfs://192.168.10.48:8020/hbase/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/46ab9b9a2a714065811d53b711f944b6, to: hdfs://192.168.10.48:8020/hbase/.archive/vc2.url_db/591173c70c8d236b40c9325aaedd0060/cf/46ab9b9a2a714065811d53b711f944b6
  324. 2014-04-21 20:35:29,500 INFO org.apache.hadoop.hbase.regionserver.Store: Completed compaction of 3 file(s) in cf of vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060. into 27512b6235064a749061fc225b36cad0, size=105.8m; total size for store is 625.7m
  325. 2014-04-21 20:35:29,500 INFO org.apache.hadoop.hbase.regionserver.compactions.CompactionRequest: completed compaction: regionName=vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060., storeName=cf, fileCount=3, fileSize=105.9m, priority=3, time=6071792268986091; duration=4sec
  326. 2014-04-21 20:35:29,500 DEBUG org.apache.hadoop.hbase.regionserver.compactions.CompactionRequest: CompactSplitThread status: compaction_queue=(0:0), split_queue=0
  327. 2014-04-21 20:35:34,013 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.25 MB of total=2.31 GB
  328. 2014-04-21 20:35:34,052 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.29 MB, total=2.04 GB, single=453.62 MB, multi=1.84 GB, memory=0.75 KB
  329. 2014-04-21 20:35:34,431 DEBUG org.apache.hadoop.hbase.regionserver.LogRoller: HLog roll requested
  330. 2014-04-21 20:35:34,431 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  331. 2014-04-21 20:35:34,431 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  332. 2014-04-21 20:35:34,433 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: using new createWriter -- HADOOP-6840
  333. 2014-04-21 20:35:34,433 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: Path=hdfs://192.168.10.48:8020/hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083734431, syncFs=true, hflush=false, compression=false
  334. 2014-04-21 20:35:34,433 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLog: cleanupCurrentWriter waiting for transactions to get synced total 30035574 synced till here 30035523
  335. 2014-04-21 20:35:34,448 INFO org.apache.hadoop.hbase.regionserver.wal.HLog: Roll /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083689853, entries=230870, filesize=63760944. for /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083734431
  336. 2014-04-21 20:35:34,453 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  337. 2014-04-21 20:35:34,453 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_7606500647682703423_202193
  338. 2014-04-21 20:35:34,454 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  339. 2014-04-21 20:35:34,740 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010, add to deadNodes and continuejava.net.ConnectException: Connection refused
  340. 2014-04-21 20:35:35,214 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /192.168.10.45:50010, add to deadNodes and continuejava.net.ConnectException: Connection refused
  341. 2014-04-21 20:35:40,634 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.23 MB of total=2.31 GB
  342. 2014-04-21 20:35:40,660 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=449.57 MB, multi=1.85 GB, memory=0.75 KB
  343. 2014-04-21 20:35:42,199 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Scanner -2380914875480788455 lease expired on region vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8.
  344. 2014-04-21 20:35:47,731 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.24 MB of total=2.31 GB
  345. 2014-04-21 20:35:47,768 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=457.23 MB, multi=1.84 GB, memory=0.75 KB
  346. 2014-04-21 20:35:56,074 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.25 MB of total=2.31 GB
  347. 2014-04-21 20:35:56,101 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.3 MB, total=2.04 GB, single=473.99 MB, multi=1.82 GB, memory=0.75 KB
  348. 2014-04-21 20:36:00,613 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Scanner 643485387835722911 lease expired on region vc2.url_db,?\xFF$\x18\xF5\xF5\x1FnZ@\x1Avb\x9B\xA1p,1398080125300.f78b68b39dbcf1cb0cb739e4e245abc8.
  349. 2014-04-21 20:36:04,301 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.32 MB of total=2.31 GB
  350. 2014-04-21 20:36:04,331 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.34 MB, total=2.04 GB, single=469.95 MB, multi=1.83 GB, memory=0.75 KB
  351. 2014-04-21 20:36:06,749 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Scanner 366257153210059587 lease expired on region vc2.url_db,\x1F\xF8Ti?\x95\xF4\xFA\x11\x7Fm\x97\xD9\xC6\xFF\x88,1398079780746.c156f751154ba579f17f9fbb20aa88e5.
  352. 2014-04-21 20:36:08,598 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Scanner 1409038748614583862 lease expired on region vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060.
  353. 2014-04-21 20:36:11,318 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: Scanner -2550130886002892068 lease expired on region vc2.url_db,_\xF1\xEC\xFD4\xF9a\xDEI&9\xC8\xFA\xDC\xCAL,1398080125300.591173c70c8d236b40c9325aaedd0060.
  354. 2014-04-21 20:36:12,502 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.25 MB of total=2.31 GB
  355. 2014-04-21 20:36:12,533 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.29 MB, total=2.04 GB, single=474.53 MB, multi=1.82 GB, memory=0.75 KB
  356. 2014-04-21 20:36:17,726 DEBUG org.apache.hadoop.hbase.regionserver.LogRoller: HLog roll requested
  357. 2014-04-21 20:36:17,726 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  358. 2014-04-21 20:36:17,727 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  359. 2014-04-21 20:36:17,733 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: using new createWriter -- HADOOP-6840
  360. 2014-04-21 20:36:17,733 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: Path=hdfs://192.168.10.48:8020/hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083777726, syncFs=true, hflush=false, compression=false
  361. 2014-04-21 20:36:17,733 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLog: cleanupCurrentWriter waiting for transactions to get synced total 30267969 synced till here 30267857
  362. 2014-04-21 20:36:17,754 INFO org.apache.hadoop.hbase.regionserver.wal.HLog: Roll /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083734431, entries=232395, filesize=63764614. for /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083777726
  363. 2014-04-21 20:36:17,765 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  364. 2014-04-21 20:36:17,765 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_3758778249012031287_202203
  365. 2014-04-21 20:36:17,766 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  366. 2014-04-21 20:36:22,230 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.32 MB of total=2.31 GB
  367. 2014-04-21 20:36:22,277 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.35 MB, total=2.04 GB, single=471.39 MB, multi=1.83 GB, memory=0.75 KB
  368. 2014-04-21 20:36:29,905 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.24 MB of total=2.31 GB
  369. 2014-04-21 20:36:29,939 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.25 MB, total=2.04 GB, single=467.61 MB, multi=1.83 GB, memory=0.75 KB
  370. 2014-04-21 20:36:37,034 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.25 MB of total=2.31 GB
  371. 2014-04-21 20:36:37,164 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.29 MB, total=2.04 GB, single=464.77 MB, multi=1.83 GB, memory=0.75 KB
  372. 2014-04-21 20:36:45,230 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.25 MB of total=2.31 GB
  373. 2014-04-21 20:36:45,254 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=469.06 MB, multi=1.83 GB, memory=0.75 KB
  374. 2014-04-21 20:36:53,865 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.31 MB of total=2.31 GB
  375. 2014-04-21 20:36:53,908 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.35 MB, total=2.04 GB, single=461.11 MB, multi=1.84 GB, memory=0.75 KB
  376. 2014-04-21 20:37:01,020 DEBUG org.apache.hadoop.hbase.regionserver.LogRoller: HLog roll requested
  377. 2014-04-21 20:37:01,020 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  378. 2014-04-21 20:37:01,020 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  379. 2014-04-21 20:37:01,023 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: using new createWriter -- HADOOP-6840
  380. 2014-04-21 20:37:01,023 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: Path=hdfs://192.168.10.48:8020/hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083821020, syncFs=true, hflush=false, compression=false
  381. 2014-04-21 20:37:01,023 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLog: cleanupCurrentWriter waiting for transactions to get synced total 30503980 synced till here 30503959
  382. 2014-04-21 20:37:01,031 INFO org.apache.hadoop.hbase.regionserver.wal.HLog: Roll /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083777726, entries=236011, filesize=63759104. for /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083821020
  383. 2014-04-21 20:37:01,036 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  384. 2014-04-21 20:37:01,036 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_8194617312737193745_202206
  385. 2014-04-21 20:37:01,036 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  386. 2014-04-21 20:37:02,059 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.25 MB of total=2.31 GB
  387. 2014-04-21 20:37:02,103 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.32 MB, total=2.04 GB, single=464.57 MB, multi=1.83 GB, memory=0.75 KB
  388. 2014-04-21 20:37:08,891 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.24 MB of total=2.31 GB
  389. 2014-04-21 20:37:08,916 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.28 MB, total=2.04 GB, single=476.77 MB, multi=1.82 GB, memory=0.75 KB
  390. 2014-04-21 20:37:16,986 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.26 MB of total=2.31 GB
  391. 2014-04-21 20:37:17,061 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.32 MB, total=2.04 GB, single=471.96 MB, multi=1.83 GB, memory=0.75 KB
  392. 2014-04-21 20:37:23,499 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Flush requested on vc2.url_db,\x1F\xF8Ti?\x95\xF4\xFA\x11\x7Fm\x97\xD9\xC6\xFF\x88,1398079780746.c156f751154ba579f17f9fbb20aa88e5.
  393. 2014-04-21 20:37:23,500 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Started memstore flush for vc2.url_db,\x1F\xF8Ti?\x95\xF4\xFA\x11\x7Fm\x97\xD9\xC6\xFF\x88,1398079780746.c156f751154ba579f17f9fbb20aa88e5., current region memstore size 128.0m
  394. 2014-04-21 20:37:23,518 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Finished snapshotting vc2.url_db,\x1F\xF8Ti?\x95\xF4\xFA\x11\x7Fm\x97\xD9\xC6\xFF\x88,1398079780746.c156f751154ba579f17f9fbb20aa88e5., commencing wait for mvcc, flushsize=134230424
  395. 2014-04-21 20:37:23,518 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Finished snapshotting, commencing flushing stores
  396. 2014-04-21 20:37:23,791 DEBUG org.apache.hadoop.hbase.util.FSUtils: Creating file=hdfs://192.168.10.48:8020/hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/.tmp/e5ebae2c840946d9a545f1c933e72296 with permission=rwxrwxrwx
  397. 2014-04-21 20:37:23,791 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  398. 2014-04-21 20:37:23,791 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  399. 2014-04-21 20:37:23,794 DEBUG org.apache.hadoop.hbase.io.hfile.HFileWriterV2: Initialized with CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false]
  400. 2014-04-21 20:37:23,794 INFO org.apache.hadoop.hbase.regionserver.StoreFile: Delete Family Bloom filter type for hdfs://192.168.10.48:8020/hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/.tmp/e5ebae2c840946d9a545f1c933e72296: CompoundBloomFilterWriter
  401. 2014-04-21 20:37:23,802 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  402. 2014-04-21 20:37:23,802 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_2272639918379176517_202209
  403. 2014-04-21 20:37:23,804 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  404. 2014-04-21 20:37:24,782 INFO org.apache.hadoop.hbase.regionserver.StoreFile: NO General Bloom and NO DeleteFamily was added to HFile (hdfs://192.168.10.48:8020/hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/.tmp/e5ebae2c840946d9a545f1c933e72296)
  405. 2014-04-21 20:37:24,782 INFO org.apache.hadoop.hbase.regionserver.Store: Flushed , sequenceid=32219384, memsize=128.0m, into tmp file hdfs://192.168.10.48:8020/hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/.tmp/e5ebae2c840946d9a545f1c933e72296
  406. 2014-04-21 20:37:24,796 DEBUG org.apache.hadoop.hbase.regionserver.Store: Renaming flushed file at hdfs://192.168.10.48:8020/hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/.tmp/e5ebae2c840946d9a545f1c933e72296 to hdfs://192.168.10.48:8020/hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/cf/e5ebae2c840946d9a545f1c933e72296
  407. 2014-04-21 20:37:24,810 INFO org.apache.hadoop.hbase.regionserver.Store: Added hdfs://192.168.10.48:8020/hbase/vc2.url_db/c156f751154ba579f17f9fbb20aa88e5/cf/e5ebae2c840946d9a545f1c933e72296, entries=703920, sequenceid=32219384, filesize=35.3m
  408. 2014-04-21 20:37:24,831 INFO org.apache.hadoop.hbase.regionserver.HRegion: Finished memstore flush of ~128.0m/134230424, currentsize=601.6k/616024 for region vc2.url_db,\x1F\xF8Ti?\x95\xF4\xFA\x11\x7Fm\x97\xD9\xC6\xFF\x88,1398079780746.c156f751154ba579f17f9fbb20aa88e5. in 1331ms, sequenceid=32219384, compaction requested=false
  409. 2014-04-21 20:37:25,219 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.29 MB of total=2.31 GB
  410. 2014-04-21 20:37:25,261 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.35 MB, total=2.04 GB, single=492.3 MB, multi=1.81 GB, memory=0.75 KB
  411. 2014-04-21 20:37:32,490 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.27 MB of total=2.31 GB
  412. 2014-04-21 20:37:32,559 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.28 MB, total=2.04 GB, single=493.04 MB, multi=1.81 GB, memory=0.75 KB
  413. 2014-04-21 20:37:37,565 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Stats: total=2.23 GB, free=500.78 MB, max=2.72 GB, blocks=35807, accesses=105495461, hits=103343867, hitRatio=97.96%, , cachingAccesses=103416737, cachingHits=102848387, cachingHitsRatio=99.45%, , evictions=75, evicted=529542, evictedPerRun=7060.56005859375
  414. 2014-04-21 20:37:40,232 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.21 MB of total=2.31 GB
  415. 2014-04-21 20:37:40,299 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=499.91 MB, multi=1.8 GB, memory=0.75 KB
  416. 2014-04-21 20:37:44,380 DEBUG org.apache.hadoop.hbase.regionserver.LogRoller: HLog roll requested
  417. 2014-04-21 20:37:44,380 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  418. 2014-04-21 20:37:44,380 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  419. 2014-04-21 20:37:44,388 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: using new createWriter -- HADOOP-6840
  420. 2014-04-21 20:37:44,388 DEBUG org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter: Path=hdfs://192.168.10.48:8020/hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083864380, syncFs=true, hflush=false, compression=false
  421. 2014-04-21 20:37:44,388 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLog: cleanupCurrentWriter waiting for transactions to get synced total 30737720 synced till here 30737679
  422. 2014-04-21 20:37:44,404 INFO org.apache.hadoop.hbase.regionserver.wal.HLog: Roll /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083821020, entries=233740, filesize=63770249. for /hbase/.logs/app-hbase-1,60020,1392084194869/app-hbase-1%2C60020%2C1392084194869.1398083864380
  423. 2014-04-21 20:37:44,409 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  424. 2014-04-21 20:37:44,409 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_-7898058268572292438_202211
  425. 2014-04-21 20:37:44,411 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  426. 2014-04-21 20:37:47,348 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.21 MB of total=2.31 GB
  427. 2014-04-21 20:37:47,397 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=506.76 MB, multi=1.79 GB, memory=0.75 KB
  428. 2014-04-21 20:37:55,076 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.23 MB of total=2.31 GB
  429. 2014-04-21 20:37:55,108 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.27 MB, total=2.04 GB, single=505.7 MB, multi=1.79 GB, memory=0.75 KB
  430. 2014-04-21 20:38:00,646 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Flush requested on vc2.out_link,\x1FQ\x0D\xF5_\xA9\x1F\x92S\x8F\x94\xAD\xA4\xCF\x04\xFB\xDC\x8E\xCCSo\x81\xF8#U:\xF3\xA8T\xC2\x9C\xCC,1398080784656.cbebd7c0e62b58879caf08915a66a8be.
  431. 2014-04-21 20:38:00,646 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Started memstore flush for vc2.out_link,\x1FQ\x0D\xF5_\xA9\x1F\x92S\x8F\x94\xAD\xA4\xCF\x04\xFB\xDC\x8E\xCCSo\x81\xF8#U:\xF3\xA8T\xC2\x9C\xCC,1398080784656.cbebd7c0e62b58879caf08915a66a8be., current region memstore size 128.0m
  432. 2014-04-21 20:38:00,655 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Finished snapshotting vc2.out_link,\x1FQ\x0D\xF5_\xA9\x1F\x92S\x8F\x94\xAD\xA4\xCF\x04\xFB\xDC\x8E\xCCSo\x81\xF8#U:\xF3\xA8T\xC2\x9C\xCC,1398080784656.cbebd7c0e62b58879caf08915a66a8be., commencing wait for mvcc, flushsize=134224056
  433. 2014-04-21 20:38:00,655 DEBUG org.apache.hadoop.hbase.regionserver.HRegion: Finished snapshotting, commencing flushing stores
  434. 2014-04-21 20:38:00,790 DEBUG org.apache.hadoop.hbase.util.FSUtils: Creating file=hdfs://192.168.10.48:8020/hbase/vc2.out_link/cbebd7c0e62b58879caf08915a66a8be/.tmp/27e034a2929a41369517ab67d1b5e466 with permission=rwxrwxrwx
  435. 2014-04-21 20:38:00,791 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultReplication
  436. 2014-04-21 20:38:00,791 INFO org.apache.hadoop.hbase.util.FSUtils: FileSystem doesn't support getDefaultBlockSize
  437. 2014-04-21 20:38:00,794 DEBUG org.apache.hadoop.hbase.io.hfile.HFileWriterV2: Initialized with CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false]
  438. 2014-04-21 20:38:00,794 INFO org.apache.hadoop.hbase.regionserver.StoreFile: Delete Family Bloom filter type for hdfs://192.168.10.48:8020/hbase/vc2.out_link/cbebd7c0e62b58879caf08915a66a8be/.tmp/27e034a2929a41369517ab67d1b5e466: CompoundBloomFilterWriter
  439. 2014-04-21 20:38:00,799 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream 192.168.10.45:50010 java.net.ConnectException: Connection refused
  440. 2014-04-21 20:38:00,799 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_8196924327904956852_202212
  441. 2014-04-21 20:38:00,800 INFO org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.10.45:50010
  442. 2014-04-21 20:38:01,213 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction started; Attempting to free 278.23 MB of total=2.31 GB
  443. 2014-04-21 20:38:01,255 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Block cache LRU eviction completed; freed=278.24 MB, total=2.04 GB, single=505.58 MB, multi=1.79 GB, memory=0.75 KB
  444. 2014-04-21 20:38:01,710 INFO org.apache.hadoop.hbase.regionserver.StoreFile: NO General Bloom and NO DeleteFamily was added to HFile (hdfs://192.168.10.48:8020/hbase/vc2.out_link/cbebd7c0e62b58879caf08915a66a8be/.tmp/27e034a2929a41369517ab67d1b5e466)