Untitled — pasted by a guest, Jul 13th, 2017
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.3.0-37/spark/lib/spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.3.0-37/phoenix/phoenix-4.7.0.2.5.3.0-37-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/12/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.3.0-37/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/07/13 20:53:05 INFO ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
17/07/13 20:53:05 DEBUG Shell: setsid exited with exit code 0
17/07/13 20:53:06 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
17/07/13 20:53:06 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
17/07/13 20:53:06 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[GetGroups])
17/07/13 20:53:06 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
17/07/13 20:53:06 DEBUG SecurityUtil: Setting hadoop.security.token.service.use_ip to true
17/07/13 20:53:06 DEBUG Groups: Creating new Groups object
17/07/13 20:53:06 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/07/13 20:53:06 DEBUG NativeCodeLoader: Loaded the native-hadoop library
17/07/13 20:53:06 DEBUG JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
17/07/13 20:53:06 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
17/07/13 20:53:06 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
17/07/13 20:53:06 DEBUG YarnSparkHadoopUtil: running as user: hermesadmin
17/07/13 20:53:06 DEBUG UserGroupInformation: hadoop login
17/07/13 20:53:06 DEBUG UserGroupInformation: hadoop login commit
17/07/13 20:53:06 DEBUG UserGroupInformation: using kerberos user:hermesadmin@FREMONT.LAMRC.NET
17/07/13 20:53:06 DEBUG UserGroupInformation: Using user: "hermesadmin@FREMONT.LAMRC.NET" with name hermesadmin@FREMONT.LAMRC.NET
17/07/13 20:53:06 DEBUG UserGroupInformation: User entry: "hermesadmin@FREMONT.LAMRC.NET"
17/07/13 20:53:06 DEBUG UserGroupInformation: Reading credentials from location set in HADOOP_TOKEN_FILE_LOCATION: /hadoop/yarn/local/usercache/hermesadmin/appcache/application_1499358861755_0103/container_e24_1499358861755_0103_01_000001/container_tokens
17/07/13 20:53:06 DEBUG UserGroupInformation: Loaded 4 tokens
17/07/13 20:53:06 DEBUG UserGroupInformation: UGI loginUser:hermesadmin@FREMONT.LAMRC.NET (auth:KERBEROS)
17/07/13 20:53:06 DEBUG UserGroupInformation: PrivilegedAction as:hermesadmin (auth:SIMPLE) from:org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
17/07/13 20:53:06 DEBUG UserGroupInformation: Found tgt Ticket (hex) =
0000: 61 82 01 55 30 82 01 51 A0 03 02 01 05 A1 13 1B a..U0..Q........
0010: 11 46 52 45 4D 4F 4E 54 2E 4C 41 4D 52 43 2E 4E .FREMONT.LAMRC.N
0020: 45 54 A2 26 30 24 A0 03 02 01 02 A1 1D 30 1B 1B ET.&0$.......0..
0030: 06 6B 72 62 74 67 74 1B 11 46 52 45 4D 4F 4E 54 .krbtgt..FREMONT
0040: 2E 4C 41 4D 52 43 2E 4E 45 54 A3 82 01 0B 30 82 .LAMRC.NET....0.
0050: 01 07 A0 03 02 01 12 A1 03 02 01 01 A2 81 FA 04 ................
0060: 81 F7 E2 EA 0C 63 1D 11 4B A0 05 11 5E 1B 7D 36 .....c..K...^..6
0070: 9E 97 75 05 55 29 AE F3 DC 96 C2 35 7C A4 E0 11 ..u.U).....5....
0080: 3A 0B DC 1A 61 27 EE A1 8A 7A 5A 30 CB DC 15 4C :...a'...zZ0...L
0090: 82 7D F6 D4 E9 FC BD B8 6D 0B 18 72 5E 21 36 7C ........m..r^!6.
00A0: A9 C0 14 66 BA 4B 98 66 8C EA 93 DE 8F FC CC 9C ...f.K.f........
00B0: AE DB F4 FB B6 15 DA D8 A4 97 4A F3 89 DB 71 25 ..........J...q%
00C0: 31 85 3B F1 6D 30 B9 4A 02 61 C1 35 AE BE CD F8 1.;.m0.J.a.5....
00D0: D0 9C B5 43 62 2B 32 B3 8E F5 60 6C E8 E4 51 76 ...Cb+2...`l..Qv
00E0: 4D 34 FC EB 53 DD 89 FD 3E 08 87 EF 64 93 7B E7 M4..S...>...d...
00F0: 3B 35 E0 33 BC 74 ED 4F 0E DB 0A ED FE 34 B2 FF ;5.3.t.O.....4..
0100: 6B 9F 06 B2 68 60 D0 78 51 D6 A2 2B C6 17 85 21 k...h`.xQ..+...!
0110: 1F 89 67 2A B8 1D B0 0A F1 87 BE 6A F8 70 62 84 ..g*.......j.pb.
0120: 0E 25 75 F5 AC 1A E9 E2 C9 12 76 41 B9 58 9F 4B .%u.......vA.X.K
0130: 2B BF BC 39 7A D4 5C DD 6A 2A 41 AB 7C 4F 2C EC +..9z.\.j*A..O,.
0140: 57 56 43 96 6A 0D 81 51 1A B8 20 EB F5 C7 15 07 WVC.j..Q.. .....
0150: F2 19 E1 B8 33 B0 26 F8 10 ....3.&..

Client Principal = hermesadmin@FREMONT.LAMRC.NET
Server Principal = krbtgt/FREMONT.LAMRC.NET@FREMONT.LAMRC.NET
Session Key = EncryptionKey: keyType=18 keyBytes (hex dump)=
0000: 98 D5 95 25 FE E0 E1 A6 EC 64 E7 1D 7D 20 0E 53 ...%.....d... .S
0010: CC 52 44 12 8E 4A 5A 58 DA 91 58 02 82 CB 7C 11 .RD..JZX..X.....


Forwardable Ticket true
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket false
Initial Ticket false
Auth Time = Thu Jul 13 18:45:56 PDT 2017
Start Time = Thu Jul 13 18:45:56 PDT 2017
End Time = Fri Jul 14 18:45:56 PDT 2017
Renew Till = null
Client Addresses Null
17/07/13 20:53:06 DEBUG UserGroupInformation: Current time is 1500004386884
17/07/13 20:53:06 DEBUG UserGroupInformation: Next refresh is 1500065876000
17/07/13 20:53:07 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1499358861755_0103_000001
17/07/13 20:53:07 DEBUG : address: ldtchsc06.fremont.lamrc.net/167.191.118.196 isLoopbackAddress: false, with host 167.191.118.196 ldtchsc06.fremont.lamrc.net
17/07/13 20:53:07 DEBUG InternalLoggerFactory: Using SLF4J as the default logging framework
17/07/13 20:53:07 DEBUG PlatformDependent0: java.nio.Buffer.address: available
17/07/13 20:53:07 DEBUG PlatformDependent0: sun.misc.Unsafe.theUnsafe: available
17/07/13 20:53:07 DEBUG PlatformDependent0: sun.misc.Unsafe.copyMemory: available
17/07/13 20:53:07 DEBUG PlatformDependent0: java.nio.Bits.unaligned: true
17/07/13 20:53:07 DEBUG PlatformDependent: Java version: 8
17/07/13 20:53:07 DEBUG PlatformDependent: -Dio.netty.noUnsafe: false
17/07/13 20:53:07 DEBUG PlatformDependent: sun.misc.Unsafe: available
17/07/13 20:53:07 DEBUG PlatformDependent: -Dio.netty.noJavassist: false
17/07/13 20:53:07 DEBUG PlatformDependent: Javassist: available
17/07/13 20:53:07 DEBUG PlatformDependent: -Dio.netty.tmpdir: /hadoop/yarn/local/usercache/hermesadmin/appcache/application_1499358861755_0103/container_e24_1499358861755_0103_01_000001/tmp (java.io.tmpdir)
17/07/13 20:53:07 DEBUG PlatformDependent: -Dio.netty.bitMode: 64 (sun.arch.data.model)
17/07/13 20:53:07 DEBUG PlatformDependent: -Dio.netty.noPreferDirect: false
17/07/13 20:53:07 DEBUG NativeLibraryLoader: -Dio.netty.tmpdir: /hadoop/yarn/local/usercache/hermesadmin/appcache/application_1499358861755_0103/container_e24_1499358861755_0103_01_000001/tmp (java.io.tmpdir)
17/07/13 20:53:07 DEBUG NativeLibraryLoader: -Dio.netty.netty.workdir: /hadoop/yarn/local/usercache/hermesadmin/appcache/application_1499358861755_0103/container_e24_1499358861755_0103_01_000001/tmp (io.netty.tmpdir)
17/07/13 20:53:07 DEBUG NetUtil: Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
17/07/13 20:53:07 DEBUG NetUtil: /proc/sys/net/core/somaxconn: 128
17/07/13 20:53:07 DEBUG BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
17/07/13 20:53:07 DEBUG BlockReaderLocal: dfs.client.read.shortcircuit = true
17/07/13 20:53:07 DEBUG BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
17/07/13 20:53:07 DEBUG BlockReaderLocal: dfs.domain.socket.path = /var/lib/hadoop-hdfs/dn_socket
17/07/13 20:53:07 DEBUG RetryUtils: multipleLinearRandomRetry = null
17/07/13 20:53:08 DEBUG Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@6475472c
17/07/13 20:53:08 DEBUG Client: getting client out of cache: org.apache.hadoop.ipc.Client@43aaf813
17/07/13 20:53:08 DEBUG DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@3f0dd7d3: starting with interruptCheckPeriodMs = 60000
17/07/13 20:53:08 DEBUG DomainSocketFactory: The short-circuit local reads feature is enabled.
17/07/13 20:53:08 DEBUG DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
17/07/13 20:53:08 INFO SecurityManager: Changing view acls to: hermesadmin
17/07/13 20:53:08 INFO SecurityManager: Changing modify acls to: hermesadmin
17/07/13 20:53:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hermesadmin); users with modify permissions: Set(hermesadmin)
17/07/13 20:53:08 DEBUG SSLOptions: No SSL protocol specified
17/07/13 20:53:08 DEBUG SSLOptions: No SSL protocol specified
17/07/13 20:53:08 DEBUG SSLOptions: No SSL protocol specified
17/07/13 20:53:08 DEBUG SecurityManager: SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
17/07/13 20:53:08 DEBUG SecurityManager: SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
17/07/13 20:53:08 INFO ApplicationMaster: Starting the user application in a separate Thread
17/07/13 20:53:08 INFO ApplicationMaster: Waiting for spark context initialization
17/07/13 20:53:08 INFO ApplicationMaster: Waiting for spark context initialization ...
17/07/13 20:53:09 INFO HBaseConfigUtil: HBase configuration - zookeeper ip ldtchsc06.fremont.lamrc.net port num is 2181
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Thu Jul 13 20:54:01 PDT 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=69694: row 'PROCESSDATA,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=ldtchsc06.fremont.lamrc.net,16020,1499810738597, seqNum=0

at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:271)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:210)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:327)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:302)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:167)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:162)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:193)
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:89)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.isTableAvailable(ConnectionManager.java:992)
at org.apache.hadoop.hbase.client.HBaseAdmin.isTableAvailable(HBaseAdmin.java:1524)
at com.lam.hbaseservice.CreateTable.CreateHBaseTable(CreateTable.java:44)
at com.lam.hbaseservice.HBaseService.createHBaseTable(HBaseService.java:60)
at com.lam.app.StdDataConsumerRunnable.run(StdDataConsumerRunnable.java:70)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=69694: row 'PROCESSDATA,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=ldtchsc06.fremont.lamrc.net,16020,1499810738597, seqNum=0
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159)
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
Caused by: java.io.IOException: Could not set up IO Streams to ldtchsc06.fremont.lamrc.net/167.191.118.196:16020
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:779)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:887)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:856)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1199)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32741)
at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:379)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:201)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:63)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:364)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:338)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
... 4 more
Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:679)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:637)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:745)
... 17 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:734)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:734)
... 17 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
... 26 more
17/07/13 20:54:01 INFO HBaseConfigUtil: Closing HBase DB connection
17/07/13 20:54:08 INFO StdDataConsumerRunnable: Stopping Hbase service app
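
Note on the failure above: the AM logs in with Kerberos (UGI loginUser is hermesadmin@FREMONT.LAMRC.NET) and loads 4 delegation tokens, yet the HBase RPC thread later fails with "GSSException: No valid credentials provided (Failed to find any Kerberos tgt)". This pattern typically means the job was submitted without credentials that the cluster-side containers can use to authenticate to HBase. A common remediation sketch is shown below; the keytab path, hbase-site.xml location, and application jar name are placeholders (assumptions, not taken from this log), while the principal is the one the log shows:

```shell
# Sketch only: paths and jar name below are hypothetical placeholders.
# Supplying --principal/--keytab lets Spark-on-YARN log in and renew
# credentials on the cluster side; shipping hbase-site.xml makes the
# secure HBase configuration visible to the driver/executors.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal hermesadmin@FREMONT.LAMRC.NET \
  --keytab /path/to/hermesadmin.keytab \
  --files /etc/hbase/conf/hbase-site.xml \
  your-app.jar
```

If the keytab approach is not available, another avenue (again hedged, depending on the Spark/HDP versions in use) is to ensure HBase delegation tokens are obtained at submit time by having the HBase client jars and hbase-site.xml on the submitter's classpath, so the "Consider 'kinit'" failure does not recur inside the container.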