SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.3.0-37/spark/lib/spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.3.0-37/phoenix/phoenix-4.7.0.2.5.3.0-37-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/12/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.3.0-37/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/07/13 20:53:05 INFO ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
17/07/13 20:53:05 DEBUG Shell: setsid exited with exit code 0
17/07/13 20:53:06 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
17/07/13 20:53:06 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
17/07/13 20:53:06 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[GetGroups])
17/07/13 20:53:06 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
17/07/13 20:53:06 DEBUG SecurityUtil: Setting hadoop.security.token.service.use_ip to true
17/07/13 20:53:06 DEBUG Groups: Creating new Groups object
17/07/13 20:53:06 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/07/13 20:53:06 DEBUG NativeCodeLoader: Loaded the native-hadoop library
17/07/13 20:53:06 DEBUG JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
17/07/13 20:53:06 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
17/07/13 20:53:06 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
17/07/13 20:53:06 DEBUG YarnSparkHadoopUtil: running as user: hermesadmin
17/07/13 20:53:06 DEBUG UserGroupInformation: hadoop login
17/07/13 20:53:06 DEBUG UserGroupInformation: hadoop login commit
17/07/13 20:53:06 DEBUG UserGroupInformation: using kerberos user:hermesadmin@FREMONT.LAMRC.NET
17/07/13 20:53:06 DEBUG UserGroupInformation: Using user: "hermesadmin@FREMONT.LAMRC.NET" with name hermesadmin@FREMONT.LAMRC.NET
17/07/13 20:53:06 DEBUG UserGroupInformation: User entry: "hermesadmin@FREMONT.LAMRC.NET"
17/07/13 20:53:06 DEBUG UserGroupInformation: Reading credentials from location set in HADOOP_TOKEN_FILE_LOCATION: /hadoop/yarn/local/usercache/hermesadmin/appcache/application_1499358861755_0103/container_e24_1499358861755_0103_01_000001/container_tokens
17/07/13 20:53:06 DEBUG UserGroupInformation: Loaded 4 tokens
17/07/13 20:53:06 DEBUG UserGroupInformation: UGI loginUser:hermesadmin@FREMONT.LAMRC.NET (auth:KERBEROS)
17/07/13 20:53:06 DEBUG UserGroupInformation: PrivilegedAction as:hermesadmin (auth:SIMPLE) from:org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
17/07/13 20:53:06 DEBUG UserGroupInformation: Found tgt Ticket (hex) =
0000: 61 82 01 55 30 82 01 51 A0 03 02 01 05 A1 13 1B a..U0..Q........
0010: 11 46 52 45 4D 4F 4E 54 2E 4C 41 4D 52 43 2E 4E .FREMONT.LAMRC.N
0020: 45 54 A2 26 30 24 A0 03 02 01 02 A1 1D 30 1B 1B ET.&0$.......0..
0030: 06 6B 72 62 74 67 74 1B 11 46 52 45 4D 4F 4E 54 .krbtgt..FREMONT
0040: 2E 4C 41 4D 52 43 2E 4E 45 54 A3 82 01 0B 30 82 .LAMRC.NET....0.
0050: 01 07 A0 03 02 01 12 A1 03 02 01 01 A2 81 FA 04 ................
0060: 81 F7 E2 EA 0C 63 1D 11 4B A0 05 11 5E 1B 7D 36 .....c..K...^..6
0070: 9E 97 75 05 55 29 AE F3 DC 96 C2 35 7C A4 E0 11 ..u.U).....5....
0080: 3A 0B DC 1A 61 27 EE A1 8A 7A 5A 30 CB DC 15 4C :...a'...zZ0...L
0090: 82 7D F6 D4 E9 FC BD B8 6D 0B 18 72 5E 21 36 7C ........m..r^!6.
00A0: A9 C0 14 66 BA 4B 98 66 8C EA 93 DE 8F FC CC 9C ...f.K.f........
00B0: AE DB F4 FB B6 15 DA D8 A4 97 4A F3 89 DB 71 25 ..........J...q%
00C0: 31 85 3B F1 6D 30 B9 4A 02 61 C1 35 AE BE CD F8 1.;.m0.J.a.5....
00D0: D0 9C B5 43 62 2B 32 B3 8E F5 60 6C E8 E4 51 76 ...Cb+2...`l..Qv
00E0: 4D 34 FC EB 53 DD 89 FD 3E 08 87 EF 64 93 7B E7 M4..S...>...d...
00F0: 3B 35 E0 33 BC 74 ED 4F 0E DB 0A ED FE 34 B2 FF ;5.3.t.O.....4..
0100: 6B 9F 06 B2 68 60 D0 78 51 D6 A2 2B C6 17 85 21 k...h`.xQ..+...!
0110: 1F 89 67 2A B8 1D B0 0A F1 87 BE 6A F8 70 62 84 ..g*.......j.pb.
0120: 0E 25 75 F5 AC 1A E9 E2 C9 12 76 41 B9 58 9F 4B .%u.......vA.X.K
0130: 2B BF BC 39 7A D4 5C DD 6A 2A 41 AB 7C 4F 2C EC +..9z.\.j*A..O,.
0140: 57 56 43 96 6A 0D 81 51 1A B8 20 EB F5 C7 15 07 WVC.j..Q.. .....
0150: F2 19 E1 B8 33 B0 26 F8 10 ....3.&..
Client Principal = hermesadmin@FREMONT.LAMRC.NET
Server Principal = krbtgt/FREMONT.LAMRC.NET@FREMONT.LAMRC.NET
Session Key = EncryptionKey: keyType=18 keyBytes (hex dump)=
0000: 98 D5 95 25 FE E0 E1 A6 EC 64 E7 1D 7D 20 0E 53 ...%.....d... .S
0010: CC 52 44 12 8E 4A 5A 58 DA 91 58 02 82 CB 7C 11 .RD..JZX..X.....
Forwardable Ticket true
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket false
Initial Ticket false
Auth Time = Thu Jul 13 18:45:56 PDT 2017
Start Time = Thu Jul 13 18:45:56 PDT 2017
End Time = Fri Jul 14 18:45:56 PDT 2017
Renew Till = null
Client Addresses Null
17/07/13 20:53:06 DEBUG UserGroupInformation: Current time is 1500004386884
17/07/13 20:53:06 DEBUG UserGroupInformation: Next refresh is 1500065876000
17/07/13 20:53:07 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1499358861755_0103_000001
17/07/13 20:53:07 DEBUG : address: ldtchsc06.fremont.lamrc.net/167.191.118.196 isLoopbackAddress: false, with host 167.191.118.196 ldtchsc06.fremont.lamrc.net
17/07/13 20:53:07 DEBUG InternalLoggerFactory: Using SLF4J as the default logging framework
17/07/13 20:53:07 DEBUG PlatformDependent0: java.nio.Buffer.address: available
17/07/13 20:53:07 DEBUG PlatformDependent0: sun.misc.Unsafe.theUnsafe: available
17/07/13 20:53:07 DEBUG PlatformDependent0: sun.misc.Unsafe.copyMemory: available
17/07/13 20:53:07 DEBUG PlatformDependent0: java.nio.Bits.unaligned: true
17/07/13 20:53:07 DEBUG PlatformDependent: Java version: 8
17/07/13 20:53:07 DEBUG PlatformDependent: -Dio.netty.noUnsafe: false
17/07/13 20:53:07 DEBUG PlatformDependent: sun.misc.Unsafe: available
17/07/13 20:53:07 DEBUG PlatformDependent: -Dio.netty.noJavassist: false
17/07/13 20:53:07 DEBUG PlatformDependent: Javassist: available
17/07/13 20:53:07 DEBUG PlatformDependent: -Dio.netty.tmpdir: /hadoop/yarn/local/usercache/hermesadmin/appcache/application_1499358861755_0103/container_e24_1499358861755_0103_01_000001/tmp (java.io.tmpdir)
17/07/13 20:53:07 DEBUG PlatformDependent: -Dio.netty.bitMode: 64 (sun.arch.data.model)
17/07/13 20:53:07 DEBUG PlatformDependent: -Dio.netty.noPreferDirect: false
17/07/13 20:53:07 DEBUG NativeLibraryLoader: -Dio.netty.tmpdir: /hadoop/yarn/local/usercache/hermesadmin/appcache/application_1499358861755_0103/container_e24_1499358861755_0103_01_000001/tmp (java.io.tmpdir)
17/07/13 20:53:07 DEBUG NativeLibraryLoader: -Dio.netty.netty.workdir: /hadoop/yarn/local/usercache/hermesadmin/appcache/application_1499358861755_0103/container_e24_1499358861755_0103_01_000001/tmp (io.netty.tmpdir)
17/07/13 20:53:07 DEBUG NetUtil: Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
17/07/13 20:53:07 DEBUG NetUtil: /proc/sys/net/core/somaxconn: 128
17/07/13 20:53:07 DEBUG BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
17/07/13 20:53:07 DEBUG BlockReaderLocal: dfs.client.read.shortcircuit = true
17/07/13 20:53:07 DEBUG BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
17/07/13 20:53:07 DEBUG BlockReaderLocal: dfs.domain.socket.path = /var/lib/hadoop-hdfs/dn_socket
17/07/13 20:53:07 DEBUG RetryUtils: multipleLinearRandomRetry = null
17/07/13 20:53:08 DEBUG Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@6475472c
17/07/13 20:53:08 DEBUG Client: getting client out of cache: org.apache.hadoop.ipc.Client@43aaf813
17/07/13 20:53:08 DEBUG DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@3f0dd7d3: starting with interruptCheckPeriodMs = 60000
17/07/13 20:53:08 DEBUG DomainSocketFactory: The short-circuit local reads feature is enabled.
17/07/13 20:53:08 DEBUG DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
17/07/13 20:53:08 INFO SecurityManager: Changing view acls to: hermesadmin
17/07/13 20:53:08 INFO SecurityManager: Changing modify acls to: hermesadmin
17/07/13 20:53:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hermesadmin); users with modify permissions: Set(hermesadmin)
17/07/13 20:53:08 DEBUG SSLOptions: No SSL protocol specified
17/07/13 20:53:08 DEBUG SSLOptions: No SSL protocol specified
17/07/13 20:53:08 DEBUG SSLOptions: No SSL protocol specified
17/07/13 20:53:08 DEBUG SecurityManager: SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
17/07/13 20:53:08 DEBUG SecurityManager: SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
17/07/13 20:53:08 INFO ApplicationMaster: Starting the user application in a separate Thread
17/07/13 20:53:08 INFO ApplicationMaster: Waiting for spark context initialization
17/07/13 20:53:08 INFO ApplicationMaster: Waiting for spark context initialization ...
17/07/13 20:53:09 INFO HBaseConfigUtil: HBase configuration - zookeeper ip ldtchsc06.fremont.lamrc.net port num is 2181
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Thu Jul 13 20:54:01 PDT 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=69694: row 'PROCESSDATA,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=ldtchsc06.fremont.lamrc.net,16020,1499810738597, seqNum=0
	at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:271)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:210)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:327)
	at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:302)
	at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:167)
	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:162)
	at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
	at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:193)
	at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:89)
	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.isTableAvailable(ConnectionManager.java:992)
	at org.apache.hadoop.hbase.client.HBaseAdmin.isTableAvailable(HBaseAdmin.java:1524)
	at com.lam.hbaseservice.CreateTable.CreateHBaseTable(CreateTable.java:44)
	at com.lam.hbaseservice.HBaseService.createHBaseTable(HBaseService.java:60)
	at com.lam.app.StdDataConsumerRunnable.run(StdDataConsumerRunnable.java:70)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=69694: row 'PROCESSDATA,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=ldtchsc06.fremont.lamrc.net,16020,1499810738597, seqNum=0
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159)
	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	... 1 more
Caused by: java.io.IOException: Could not set up IO Streams to ldtchsc06.fremont.lamrc.net/167.191.118.196:16020
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:779)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:887)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:856)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1199)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32741)
	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:379)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:201)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:63)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:364)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:338)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
	... 4 more
Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:679)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:637)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:745)
	... 17 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
	at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:734)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:734)
	... 17 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
	at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
	at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
	at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
	at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
	... 26 more
17/07/13 20:54:01 INFO HBaseConfigUtil: Closing HBase DB connection
17/07/13 20:54:08 INFO StdDataConsumerRunnable: Stopping Hbase service app
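The failure chain above is the common Kerberos-on-YARN pitfall: the ApplicationMaster logs in fine (it has a TGT and four delegation tokens for HDFS/YARN), but the HBase RPC thread has no HBase delegation token and no TGT of its own, so the SASL/GSSAPI handshake to the region server fails with "Failed to find any Kerberos tgt". One common remedy is to let Spark obtain an HBase token at submit time by supplying a principal and keytab and making `hbase-site.xml` plus the HBase client jars visible on the classpath. The sketch below is illustrative only: the keytab path, main class, and jar names are hypothetical placeholders, not values from this log.

```shell
# Hedged sketch of a spark-submit that lets Spark 1.6-on-YARN fetch an HBase
# delegation token at submission. All paths and the --class value below are
# placeholders -- substitute your cluster's real ones.
spark-submit \
  --master yarn-cluster \
  --principal hermesadmin@FREMONT.LAMRC.NET \
  --keytab /etc/security/keytabs/hermesadmin.keytab \
  --files /etc/hbase/conf/hbase-site.xml \
  --conf spark.driver.extraClassPath=/usr/hdp/current/hbase-client/lib/* \
  --conf spark.executor.extraClassPath=/usr/hdp/current/hbase-client/lib/* \
  --class com.lam.app.Main \
  app.jar
```

An alternative, since the failing call runs on a long-lived consumer thread (`StdDataConsumerRunnable`), is to log in explicitly inside that thread with `UserGroupInformation.loginUserFromKeytab(...)` and wrap the HBase calls in `ugi.doAs(...)`, so the thread holds its own renewable TGT instead of relying on tokens inherited from the container.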