ExportSnapshot Stack Trace

By: a guest on Apr 29th, 2013  |  size: 13.92 KB
2013-04-29 16:30:43,382 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2013-04-29 16:30:43,752 WARN org.apache.hadoop.conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
2013-04-29 16:30:43,753 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=MAP, sessionId=
2013-04-29 16:30:43,792 WARN org.apache.hadoop.conf.Configuration: slave.host.name is deprecated. Instead, use dfs.datanode.hostname
2013-04-29 16:30:44,092 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
2013-04-29 16:30:44,096 INFO org.apache.hadoop.mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@10987197
2013-04-29 16:30:44,311 INFO org.apache.hadoop.mapred.MapTask: Processing split: hdfs://namenode-primary:8020/tmp/hbase-root/exportSnapshot-1367253018375/export-files.1367253018377/export-503.seq:0+883
2013-04-29 16:30:44,348 INFO org.apache.hadoop.io.compress.zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
2013-04-29 16:30:44,348 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.deflate]
2013-04-29 16:30:44,365 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy file input=d/queries=991625ef6c2a3db259dc984c990e823d-29384f58e6964b1a9044590988a390d3 output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/991625ef6c2a3db259dc984c990e823d/d/29384f58e6964b1a9044590988a390d3
2013-04-29 16:30:44,374 WARN org.apache.hadoop.hbase.snapshot.ExportSnapshot: Unable to get the status for file=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/991625ef6c2a3db259dc984c990e823d/d/29384f58e6964b1a9044590988a390d3
2013-04-29 16:40:38,059 ERROR org.apache.hadoop.hbase.snapshot.ExportSnapshot: Unable to set the owner/group for file=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/991625ef6c2a3db259dc984c990e823d/d/29384f58e6964b1a9044590988a390d3
org.apache.hadoop.security.AccessControlException: Non-super user cannot change owner.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setOwnerInt(FSNamesystem.java:1180)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setOwner(FSNamesystem.java:1155)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setOwner(NameNodeRpcServer.java:461)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setOwner(ClientNamenodeProtocolServerSideTranslatorPB.java:267)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44076)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1695)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1691)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1689)

    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:90)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
    at org.apache.hadoop.hdfs.DFSClient.setOwner(DFSClient.java:1905)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setOwner(DistributedFileSystem.java:831)
    at org.apache.hadoop.hbase.snapshot.ExportSnapshot$ExportMapper.preserveAttributes(ExportSnapshot.java:238)
    at org.apache.hadoop.hbase.snapshot.ExportSnapshot$ExportMapper.copyFile(ExportSnapshot.java:205)
    at org.apache.hadoop.hbase.snapshot.ExportSnapshot$ExportMapper.map(ExportSnapshot.java:145)
    at org.apache.hadoop.hbase.snapshot.ExportSnapshot$ExportMapper.map(ExportSnapshot.java:94)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Non-super user cannot change owner.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setOwnerInt(FSNamesystem.java:1180)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setOwner(FSNamesystem.java:1155)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setOwner(NameNodeRpcServer.java:461)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setOwner(ClientNamenodeProtocolServerSideTranslatorPB.java:267)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44076)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1695)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1691)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1689)

    at org.apache.hadoop.ipc.Client.call(Client.java:1225)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
    at $Proxy10.setOwner(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
    at $Proxy10.setOwner(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setOwner(ClientNamenodeProtocolTranslatorPB.java:258)
    at org.apache.hadoop.hdfs.DFSClient.setOwner(DFSClient.java:1903)
    ... 13 more
2013-04-29 16:40:38,062 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy file input=d/queries=5b3e8544cff2f045c6e89f422500cd6e-bedf8536eb484360992dfc47369b70a7 output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/5b3e8544cff2f045c6e89f422500cd6e/d/bedf8536eb484360992dfc47369b70a7
2013-04-29 16:40:38,068 WARN org.apache.hadoop.hbase.snapshot.ExportSnapshot: Unable to get the status for file=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/5b3e8544cff2f045c6e89f422500cd6e/d/bedf8536eb484360992dfc47369b70a7
2013-04-29 16:40:40,729 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy completed for input=d/queries=5b3e8544cff2f045c6e89f422500cd6e-bedf8536eb484360992dfc47369b70a7 output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/5b3e8544cff2f045c6e89f422500cd6e/d/bedf8536eb484360992dfc47369b70a7
2013-04-29 16:40:40,730 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy file input=d/queries=4689deb6f26e82091aa127587df4777a-f9bc7e65a180459da53712205b991905 output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/4689deb6f26e82091aa127587df4777a/d/f9bc7e65a180459da53712205b991905
2013-04-29 16:40:40,734 WARN org.apache.hadoop.hbase.snapshot.ExportSnapshot: Unable to get the status for file=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/4689deb6f26e82091aa127587df4777a/d/f9bc7e65a180459da53712205b991905
2013-04-29 16:40:41,766 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy completed for input=d/queries=4689deb6f26e82091aa127587df4777a-f9bc7e65a180459da53712205b991905 output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/4689deb6f26e82091aa127587df4777a/d/f9bc7e65a180459da53712205b991905
2013-04-29 16:40:41,766 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy file input=d/queries=03f6a36fef8c48f956272c75b1e45666-cb14e43f2f5a46ee96b2cbf389987d80 output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/03f6a36fef8c48f956272c75b1e45666/d/cb14e43f2f5a46ee96b2cbf389987d80
2013-04-29 16:40:41,769 WARN org.apache.hadoop.hbase.snapshot.ExportSnapshot: Unable to get the status for file=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/03f6a36fef8c48f956272c75b1e45666/d/cb14e43f2f5a46ee96b2cbf389987d80
2013-04-29 16:40:43,627 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy completed for input=d/queries=03f6a36fef8c48f956272c75b1e45666-cb14e43f2f5a46ee96b2cbf389987d80 output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/03f6a36fef8c48f956272c75b1e45666/d/cb14e43f2f5a46ee96b2cbf389987d80
2013-04-29 16:40:43,628 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy file input=d/queries=11d8de1728f7211a7cecb3f18531d317-367c5196865d4930aeb9f738ee5ea2c2 output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/11d8de1728f7211a7cecb3f18531d317/d/367c5196865d4930aeb9f738ee5ea2c2
2013-04-29 16:40:43,636 WARN org.apache.hadoop.hbase.snapshot.ExportSnapshot: Unable to get the status for file=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/11d8de1728f7211a7cecb3f18531d317/d/367c5196865d4930aeb9f738ee5ea2c2
2013-04-29 16:40:44,819 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy completed for input=d/queries=11d8de1728f7211a7cecb3f18531d317-367c5196865d4930aeb9f738ee5ea2c2 output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/11d8de1728f7211a7cecb3f18531d317/d/367c5196865d4930aeb9f738ee5ea2c2
2013-04-29 16:40:44,819 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy file input=d/queries=e54db7771e12e7103c45189aa9f6d621-a433ab2f363545c5909ed21d579e302c output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/e54db7771e12e7103c45189aa9f6d621/d/a433ab2f363545c5909ed21d579e302c
2013-04-29 16:40:44,826 WARN org.apache.hadoop.hbase.snapshot.ExportSnapshot: Unable to get the status for file=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/e54db7771e12e7103c45189aa9f6d621/d/a433ab2f363545c5909ed21d579e302c
2013-04-29 16:40:45,967 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy completed for input=d/queries=e54db7771e12e7103c45189aa9f6d621-a433ab2f363545c5909ed21d579e302c output=hdfs://namenode-backup:8020/users/sean/hbase_test/.archive/queries/e54db7771e12e7103c45189aa9f6d621/d/a433ab2f363545c5909ed21d579e302c
2013-04-29 16:40:45,967 INFO org.apache.hadoop.hbase.snapshot.ExportSnapshot: copy file input=d/queries=ae58ff4e7149b14efe6e5a0cc1d2f338-4b713d79670c4fd9aa309320adcb0047.e6dd75e2b917adcd657f5294093f0edd output=hdfs://namenode-backup:8020/users/sean/hbase_test/.oldlogs/queries=ae58ff4e7149b14efe6e5a0cc1d2f338-4b713d79670c4fd9aa309320adcb0047.e6dd75e2b917adcd657f5294093f0edd
2013-04-29 16:40:45,973 ERROR org.apache.hadoop.hbase.snapshot.ExportSnapshot: Unable to open source file=d/queries=ae58ff4e7149b14efe6e5a0cc1d2f338-4b713d79670c4fd9aa309320adcb0047.e6dd75e2b917adcd657f5294093f0edd
java.io.FileNotFoundException: Unable to open link: org.apache.hadoop.hbase.io.HLogLink locations=[hdfs://namenode-primary:8020/hbase/.logs/d/queries=ae58ff4e7149b14efe6e5a0cc1d2f338-4b713d79670c4fd9aa309320adcb0047.e6dd75e2b917adcd657f5294093f0edd, hdfs://namenode-primary:8020/hbase/.oldlogs/queries=ae58ff4e7149b14efe6e5a0cc1d2f338-4b713d79670c4fd9aa309320adcb0047.e6dd75e2b917adcd657f5294093f0edd]
    at org.apache.hadoop.hbase.io.FileLink$FileLinkInputStream.tryOpen(FileLink.java:304)
    at org.apache.hadoop.hbase.io.FileLink$FileLinkInputStream.<init>(FileLink.java:119)
    at org.apache.hadoop.hbase.io.FileLink$FileLinkInputStream.<init>(FileLink.java:110)
    at org.apache.hadoop.hbase.io.FileLink.open(FileLink.java:389)
    at org.apache.hadoop.hbase.snapshot.ExportSnapshot$ExportMapper.openSourceFile(ExportSnapshot.java:303)
    at org.apache.hadoop.hbase.snapshot.ExportSnapshot$ExportMapper.copyFile(ExportSnapshot.java:174)
    at org.apache.hadoop.hbase.snapshot.ExportSnapshot$ExportMapper.map(ExportSnapshot.java:145)
    at org.apache.hadoop.hbase.snapshot.ExportSnapshot$ExportMapper.map(ExportSnapshot.java:94)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
2013-04-29 16:40:45,975 INFO org.apache.hadoop.mapred.Task: Task:attempt_201304260021_3855_m_000007_0 is done. And is in the process of commiting
2013-04-29 16:40:45,999 INFO org.apache.hadoop.mapred.Task: Task 'attempt_201304260021_3855_m_000007_0' done.