Untitled, pasted by a guest on Jul 27th, 2017
17/07/27 09:07:51 INFO executor.CoarseGrainedExecutorBackend: Started daemon with process name: 49439@REDACTED_HOST
17/07/27 09:07:51 INFO util.SignalUtils: Registered signal handler for TERM
17/07/27 09:07:51 INFO util.SignalUtils: Registered signal handler for HUP
17/07/27 09:07:51 INFO util.SignalUtils: Registered signal handler for INT
17/07/27 09:07:52 INFO spark.SecurityManager: Changing view acls to: yarn,REDACTED_USERNAME
17/07/27 09:07:52 INFO spark.SecurityManager: Changing modify acls to: yarn,REDACTED_USERNAME
17/07/27 09:07:52 INFO spark.SecurityManager: Changing view acls groups to:
17/07/27 09:07:52 INFO spark.SecurityManager: Changing modify acls groups to:
17/07/27 09:07:52 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, REDACTED_USERNAME); groups with view permissions: Set(); users with modify permissions: Set(yarn, REDACTED_USERNAME); groups with modify permissions: Set()
17/07/27 09:07:52 INFO client.TransportClientFactory: Successfully created connection to /REDACTED_IP:30702 after 92 ms (0 ms spent in bootstraps)
17/07/27 09:07:52 INFO spark.SecurityManager: Changing view acls to: yarn,REDACTED_USERNAME
17/07/27 09:07:52 INFO spark.SecurityManager: Changing modify acls to: yarn,REDACTED_USERNAME
17/07/27 09:07:52 INFO spark.SecurityManager: Changing view acls groups to:
17/07/27 09:07:52 INFO spark.SecurityManager: Changing modify acls groups to:
17/07/27 09:07:52 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, REDACTED_USERNAME); groups with view permissions: Set(); users with modify permissions: Set(yarn, REDACTED_USERNAME); groups with modify permissions: Set()
17/07/27 09:07:52 INFO client.TransportClientFactory: Successfully created connection to /REDACTED_IP:30702 after 2 ms (0 ms spent in bootstraps)
17/07/27 09:07:52 INFO storage.DiskBlockManager: Created local directory at /data1/yarn/nm/usercache/REDACTED_USERNAME/appcache/application_1500467023260_0062/blockmgr-ebbfd6f0-f722-41f8-972c-492b025ed96c
17/07/27 09:07:52 INFO storage.DiskBlockManager: Created local directory at /data2/yarn/nm/usercache/REDACTED_USERNAME/appcache/application_1500467023260_0062/blockmgr-a0bc5ea0-76d3-4f8c-807b-783d91a09b40
17/07/27 09:07:52 INFO storage.DiskBlockManager: Created local directory at /data3/yarn/nm/usercache/REDACTED_USERNAME/appcache/application_1500467023260_0062/blockmgr-3b4ea5b2-53fc-4992-9219-b317c1d6078d
17/07/27 09:07:52 INFO storage.DiskBlockManager: Created local directory at /data4/yarn/nm/usercache/REDACTED_USERNAME/appcache/application_1500467023260_0062/blockmgr-a0844b71-7c96-42ae-8bca-cdaaabaf6c5d
17/07/27 09:07:52 INFO storage.DiskBlockManager: Created local directory at /data5/yarn/nm/usercache/REDACTED_USERNAME/appcache/application_1500467023260_0062/blockmgr-fbda8422-ab26-4c85-b7d7-2befe6e9ef99
17/07/27 09:07:52 INFO storage.DiskBlockManager: Created local directory at /data6/yarn/nm/usercache/REDACTED_USERNAME/appcache/application_1500467023260_0062/blockmgr-176837d5-9716-4a99-89dd-b7f5807fd892
17/07/27 09:07:52 INFO memory.MemoryStore: MemoryStore started with capacity 3.0 GB
17/07/27 09:07:53 INFO executor.CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@REDACTED_IP:30702
17/07/27 09:07:53 INFO executor.CoarseGrainedExecutorBackend: Successfully registered with driver
17/07/27 09:07:53 INFO executor.Executor: Starting executor ID 1 on host REDACTED_HOST
17/07/27 09:07:53 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 13424.
17/07/27 09:07:53 INFO netty.NettyBlockTransferService: Server created on REDACTED_HOST:13424
17/07/27 09:07:53 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/07/27 09:07:53 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(1, REDACTED_HOST, 13424, None)
17/07/27 09:07:53 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(1, REDACTED_HOST, 13424, None)
17/07/27 09:07:53 INFO storage.BlockManager: external shuffle service port = 7337
17/07/27 09:07:53 INFO storage.BlockManager: Registering executor with local external shuffle service.
17/07/27 09:07:53 INFO client.TransportClientFactory: Successfully created connection to REDACTED_HOST/REDACTED_IP:7337 after 2 ms (0 ms spent in bootstraps)
17/07/27 09:07:53 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(1, REDACTED_HOST, 13424, None)
17/07/27 09:07:53 INFO executor.CoarseGrainedExecutorBackend: Got assigned task 0
17/07/27 09:07:53 INFO executor.CoarseGrainedExecutorBackend: Got assigned task 1
17/07/27 09:07:53 INFO executor.CoarseGrainedExecutorBackend: Got assigned task 2
17/07/27 09:07:53 INFO executor.CoarseGrainedExecutorBackend: Got assigned task 3
17/07/27 09:07:53 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
17/07/27 09:07:53 INFO executor.Executor: Running task 1.0 in stage 0.0 (TID 1)
17/07/27 09:07:53 INFO executor.Executor: Running task 2.0 in stage 0.0 (TID 2)
17/07/27 09:07:53 INFO executor.Executor: Running task 3.0 in stage 0.0 (TID 3)
17/07/27 09:07:53 INFO executor.Executor: Fetching spark://REDACTED_IP:30702/jars/REDACTED_JAR with timestamp 1501142854454
17/07/27 09:07:53 INFO client.TransportClientFactory: Successfully created connection to /REDACTED_IP:30702 after 2 ms (0 ms spent in bootstraps)
17/07/27 09:07:53 INFO util.Utils: Fetching spark://REDACTED_IP:30702/jars/REDACTED_JAR to /data1/yarn/nm/usercache/REDACTED_USERNAME/appcache/application_1500467023260_0062/spark-b018e92f-3800-4882-9dd8-b4315f119932/fetchFileTemp3129117840396051325.tmp
17/07/27 09:07:53 INFO util.Utils: Copying /data1/yarn/nm/usercache/REDACTED_USERNAME/appcache/application_1500467023260_0062/spark-b018e92f-3800-4882-9dd8-b4315f119932/-3463856231501142854454_cache to /data2/yarn/nm/usercache/REDACTED_USERNAME/appcache/application_1500467023260_0062/container_1500467023260_0062_01_000002/./REDACTED_JAR
17/07/27 09:07:54 INFO executor.Executor: Adding file:/data2/yarn/nm/usercache/REDACTED_USERNAME/appcache/application_1500467023260_0062/container_1500467023260_0062_01_000002/./REDACTED_JAR to class loader
17/07/27 09:07:54 INFO broadcast.TorrentBroadcast: Started reading broadcast variable 0
17/07/27 09:07:54 INFO client.TransportClientFactory: Successfully created connection to /REDACTED_IP:21339 after 1 ms (0 ms spent in bootstraps)
17/07/27 09:07:54 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 16.7 KB, free 3.0 GB)
17/07/27 09:07:54 INFO broadcast.TorrentBroadcast: Reading broadcast variable 0 took 148 ms
17/07/27 09:07:54 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 49.3 KB, free 3.0 GB)
17/07/27 09:07:54 INFO consumer.ConsumerConfig: ConsumerConfig values:
	interceptor.classes = null
	request.timeout.ms = 40000
	check.crcs = true
	ssl.truststore.password = null
	retry.backoff.ms = 100
	ssl.keymanager.algorithm = SunX509
	receive.buffer.bytes = 65536
	ssl.key.password = null
	ssl.cipher.suites = null
	ssl.secure.random.implementation = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.service.name = null
	ssl.provider = null
	session.timeout.ms = 30000
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	max.poll.records = 2147483647
	bootstrap.servers = [REDACTED_HOST:9092, gbslixaacspa05u.metis.prd:9092]
	client.id =
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	auto.offset.reset = none
	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	max.partition.fetch.bytes = 1048576
	partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
	ssl.endpoint.identification.algorithm = null
	ssl.keystore.location = null
	ssl.truststore.location = null
	exclude.internal.topics = true
	ssl.keystore.password = null
	metrics.sample.window.ms = 30000
	security.protocol = PLAINTEXT
	metadata.max.age.ms = 300000
	auto.commit.interval.ms = 5000
	ssl.protocol = TLS
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	ssl.trustmanager.algorithm = PKIX
	group.id = spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor
	enable.auto.commit = false
	metric.reporters = []
	ssl.truststore.type = JKS
	send.buffer.bytes = 131072
	reconnect.backoff.ms = 50
	metrics.num.samples = 2
	ssl.keystore.type = JKS
	heartbeat.interval.ms = 3000

17/07/27 09:07:54 INFO consumer.ConsumerConfig: ConsumerConfig values:
	interceptor.classes = null
	request.timeout.ms = 40000
	check.crcs = true
	ssl.truststore.password = null
	retry.backoff.ms = 100
	ssl.keymanager.algorithm = SunX509
	receive.buffer.bytes = 65536
	ssl.key.password = null
	ssl.cipher.suites = null
	ssl.secure.random.implementation = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.service.name = null
	ssl.provider = null
	session.timeout.ms = 30000
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	max.poll.records = 2147483647
	bootstrap.servers = [REDACTED_HOST:9092, gbslixaacspa05u.metis.prd:9092]
	client.id = consumer-1
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	auto.offset.reset = none
	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	max.partition.fetch.bytes = 1048576
	partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
	ssl.endpoint.identification.algorithm = null
	ssl.keystore.location = null
	ssl.truststore.location = null
	exclude.internal.topics = true
	ssl.keystore.password = null
	metrics.sample.window.ms = 30000
	security.protocol = PLAINTEXT
	metadata.max.age.ms = 300000
	auto.commit.interval.ms = 5000
	ssl.protocol = TLS
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	ssl.trustmanager.algorithm = PKIX
	group.id = spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor
	enable.auto.commit = false
	metric.reporters = []
	ssl.truststore.type = JKS
	send.buffer.bytes = 131072
	reconnect.backoff.ms = 50
	metrics.num.samples = 2
	ssl.keystore.type = JKS
	heartbeat.interval.ms = 3000

17/07/27 09:07:54 INFO utils.AppInfoParser: Kafka version : 0.10.0-kafka-2.1.0
17/07/27 09:07:54 INFO utils.AppInfoParser: Kafka commitId : unknown
17/07/27 09:07:54 INFO consumer.ConsumerConfig: ConsumerConfig values:
	interceptor.classes = null
	request.timeout.ms = 40000
	check.crcs = true
	ssl.truststore.password = null
	retry.backoff.ms = 100
	ssl.keymanager.algorithm = SunX509
	receive.buffer.bytes = 65536
	ssl.key.password = null
	ssl.cipher.suites = null
	ssl.secure.random.implementation = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.service.name = null
	ssl.provider = null
	session.timeout.ms = 30000
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	max.poll.records = 2147483647
	bootstrap.servers = [REDACTED_HOST:9092, gbslixaacspa05u.metis.prd:9092]
	client.id =
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	auto.offset.reset = none
	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	max.partition.fetch.bytes = 1048576
	partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
	ssl.endpoint.identification.algorithm = null
	ssl.keystore.location = null
	ssl.truststore.location = null
	exclude.internal.topics = true
	ssl.keystore.password = null
	metrics.sample.window.ms = 30000
	security.protocol = PLAINTEXT
	metadata.max.age.ms = 300000
	auto.commit.interval.ms = 5000
	ssl.protocol = TLS
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	ssl.trustmanager.algorithm = PKIX
	group.id = spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor
	enable.auto.commit = false
	metric.reporters = []
	ssl.truststore.type = JKS
	send.buffer.bytes = 131072
	reconnect.backoff.ms = 50
	metrics.num.samples = 2
	ssl.keystore.type = JKS
	heartbeat.interval.ms = 3000

17/07/27 09:07:54 INFO consumer.ConsumerConfig: ConsumerConfig values:
	interceptor.classes = null
	request.timeout.ms = 40000
	check.crcs = true
	ssl.truststore.password = null
	retry.backoff.ms = 100
	ssl.keymanager.algorithm = SunX509
	receive.buffer.bytes = 65536
	ssl.key.password = null
	ssl.cipher.suites = null
	ssl.secure.random.implementation = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.service.name = null
	ssl.provider = null
	session.timeout.ms = 30000
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	max.poll.records = 2147483647
	bootstrap.servers = [REDACTED_HOST:9092, gbslixaacspa05u.metis.prd:9092]
	client.id = consumer-2
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	auto.offset.reset = none
	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	max.partition.fetch.bytes = 1048576
	partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
	ssl.endpoint.identification.algorithm = null
	ssl.keystore.location = null
	ssl.truststore.location = null
	exclude.internal.topics = true
	ssl.keystore.password = null
	metrics.sample.window.ms = 30000
	security.protocol = PLAINTEXT
	metadata.max.age.ms = 300000
	auto.commit.interval.ms = 5000
	ssl.protocol = TLS
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	ssl.trustmanager.algorithm = PKIX
	group.id = spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor
	enable.auto.commit = false
	metric.reporters = []
	ssl.truststore.type = JKS
	send.buffer.bytes = 131072
	reconnect.backoff.ms = 50
	metrics.num.samples = 2
	ssl.keystore.type = JKS
	heartbeat.interval.ms = 3000

17/07/27 09:07:55 INFO utils.AppInfoParser: Kafka version : 0.10.0-kafka-2.1.0
17/07/27 09:07:55 INFO utils.AppInfoParser: Kafka commitId : unknown
17/07/27 09:07:55 INFO consumer.ConsumerConfig: ConsumerConfig values:
	interceptor.classes = null
	request.timeout.ms = 40000
	check.crcs = true
	ssl.truststore.password = null
	retry.backoff.ms = 100
	ssl.keymanager.algorithm = SunX509
	receive.buffer.bytes = 65536
	ssl.key.password = null
	ssl.cipher.suites = null
	ssl.secure.random.implementation = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.service.name = null
	ssl.provider = null
	session.timeout.ms = 30000
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	max.poll.records = 2147483647
	bootstrap.servers = [REDACTED_HOST:9092, gbslixaacspa05u.metis.prd:9092]
	client.id =
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	auto.offset.reset = none
	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	max.partition.fetch.bytes = 1048576
	partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
	ssl.endpoint.identification.algorithm = null
	ssl.keystore.location = null
	ssl.truststore.location = null
	exclude.internal.topics = true
	ssl.keystore.password = null
	metrics.sample.window.ms = 30000
	security.protocol = PLAINTEXT
	metadata.max.age.ms = 300000
	auto.commit.interval.ms = 5000
	ssl.protocol = TLS
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	ssl.trustmanager.algorithm = PKIX
	group.id = spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor
	enable.auto.commit = false
	metric.reporters = []
	ssl.truststore.type = JKS
	send.buffer.bytes = 131072
	reconnect.backoff.ms = 50
	metrics.num.samples = 2
	ssl.keystore.type = JKS
	heartbeat.interval.ms = 3000

17/07/27 09:07:55 INFO consumer.ConsumerConfig: ConsumerConfig values:
	interceptor.classes = null
	request.timeout.ms = 40000
	check.crcs = true
	ssl.truststore.password = null
	retry.backoff.ms = 100
	ssl.keymanager.algorithm = SunX509
	receive.buffer.bytes = 65536
	ssl.key.password = null
	ssl.cipher.suites = null
	ssl.secure.random.implementation = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.service.name = null
	ssl.provider = null
	session.timeout.ms = 30000
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	max.poll.records = 2147483647
	bootstrap.servers = [REDACTED_HOST:9092, gbslixaacspa05u.metis.prd:9092]
	client.id = consumer-3
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	auto.offset.reset = none
	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	max.partition.fetch.bytes = 1048576
	partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
	ssl.endpoint.identification.algorithm = null
	ssl.keystore.location = null
	ssl.truststore.location = null
	exclude.internal.topics = true
	ssl.keystore.password = null
	metrics.sample.window.ms = 30000
	security.protocol = PLAINTEXT
	metadata.max.age.ms = 300000
	auto.commit.interval.ms = 5000
	ssl.protocol = TLS
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	ssl.trustmanager.algorithm = PKIX
	group.id = spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor
	enable.auto.commit = false
	metric.reporters = []
	ssl.truststore.type = JKS
	send.buffer.bytes = 131072
	reconnect.backoff.ms = 50
	metrics.num.samples = 2
	ssl.keystore.type = JKS
	heartbeat.interval.ms = 3000

17/07/27 09:07:55 INFO utils.AppInfoParser: Kafka version : 0.10.0-kafka-2.1.0
17/07/27 09:07:55 INFO utils.AppInfoParser: Kafka commitId : unknown
17/07/27 09:07:55 INFO consumer.ConsumerConfig: ConsumerConfig values:
	interceptor.classes = null
	request.timeout.ms = 40000
	check.crcs = true
	ssl.truststore.password = null
	retry.backoff.ms = 100
	ssl.keymanager.algorithm = SunX509
	receive.buffer.bytes = 65536
	ssl.key.password = null
	ssl.cipher.suites = null
	ssl.secure.random.implementation = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.service.name = null
	ssl.provider = null
	session.timeout.ms = 30000
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	max.poll.records = 2147483647
	bootstrap.servers = [REDACTED_HOST:9092, gbslixaacspa05u.metis.prd:9092]
	client.id =
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	auto.offset.reset = none
	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	max.partition.fetch.bytes = 1048576
	partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
	ssl.endpoint.identification.algorithm = null
	ssl.keystore.location = null
	ssl.truststore.location = null
	exclude.internal.topics = true
	ssl.keystore.password = null
	metrics.sample.window.ms = 30000
	security.protocol = PLAINTEXT
	metadata.max.age.ms = 300000
	auto.commit.interval.ms = 5000
	ssl.protocol = TLS
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	ssl.trustmanager.algorithm = PKIX
	group.id = spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor
	enable.auto.commit = false
	metric.reporters = []
	ssl.truststore.type = JKS
	send.buffer.bytes = 131072
	reconnect.backoff.ms = 50
	metrics.num.samples = 2
	ssl.keystore.type = JKS
	heartbeat.interval.ms = 3000

17/07/27 09:07:55 INFO consumer.ConsumerConfig: ConsumerConfig values:
	interceptor.classes = null
	request.timeout.ms = 40000
	check.crcs = true
	ssl.truststore.password = null
	retry.backoff.ms = 100
	ssl.keymanager.algorithm = SunX509
	receive.buffer.bytes = 65536
	ssl.key.password = null
	ssl.cipher.suites = null
	ssl.secure.random.implementation = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.service.name = null
	ssl.provider = null
	session.timeout.ms = 30000
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	max.poll.records = 2147483647
	bootstrap.servers = [REDACTED_HOST:9092, gbslixaacspa05u.metis.prd:9092]
	client.id = consumer-4
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	auto.offset.reset = none
	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	max.partition.fetch.bytes = 1048576
	partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
	ssl.endpoint.identification.algorithm = null
	ssl.keystore.location = null
	ssl.truststore.location = null
	exclude.internal.topics = true
	ssl.keystore.password = null
	metrics.sample.window.ms = 30000
	security.protocol = PLAINTEXT
	metadata.max.age.ms = 300000
	auto.commit.interval.ms = 5000
	ssl.protocol = TLS
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	ssl.trustmanager.algorithm = PKIX
	group.id = spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor
	enable.auto.commit = false
	metric.reporters = []
	ssl.truststore.type = JKS
	send.buffer.bytes = 131072
	reconnect.backoff.ms = 50
	metrics.num.samples = 2
	ssl.keystore.type = JKS
	heartbeat.interval.ms = 3000

17/07/27 09:07:55 INFO utils.AppInfoParser: Kafka version : 0.10.0-kafka-2.1.0
17/07/27 09:07:55 INFO utils.AppInfoParser: Kafka commitId : unknown
17/07/27 09:07:55 INFO codegen.CodeGenerator: Code generated in 296.12482 ms
17/07/27 09:07:55 INFO codegen.CodeGenerator: Code generated in 63.861574 ms
17/07/27 09:07:55 INFO codegen.CodeGenerator: Code generated in 20.656328 ms
17/07/27 09:07:55 INFO codegen.CodeGenerator: Code generated in 101.843724 ms
17/07/27 09:07:55 INFO codegen.CodeGenerator: Code generated in 21.447822 ms
17/07/27 09:07:55 INFO internals.AbstractCoordinator: Discovered coordinator gbslixaacspa05u.metis.prd:9092 (id: 2147483384 rack: null) for group spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor.
17/07/27 09:07:55 INFO internals.AbstractCoordinator: Discovered coordinator gbslixaacspa05u.metis.prd:9092 (id: 2147483384 rack: null) for group spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor.
17/07/27 09:07:55 INFO internals.AbstractCoordinator: Discovered coordinator gbslixaacspa05u.metis.prd:9092 (id: 2147483384 rack: null) for group spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor.
17/07/27 09:07:55 INFO internals.AbstractCoordinator: Discovered coordinator gbslixaacspa05u.metis.prd:9092 (id: 2147483384 rack: null) for group spark-kafka-source-7209feda-3255-4ad4-b6f8-f2f65220d3c3--1396947302-executor.
17/07/27 09:07:55 INFO parser.CatalystSqlParser: Parsing command: string
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: string
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: timestamp
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: string
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: string
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: string
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: string
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: string
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: timestamp
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: timestamp
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: double
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: double
17/07/27 09:07:56 INFO parser.CatalystSqlParser: Parsing command: string
17/07/27 09:07:56 INFO codegen.CodeGenerator: Code generated in 21.216233 ms
17/07/27 09:07:56 INFO producer.ProducerConfig: ProducerConfig values:
	interceptor.classes = null
	request.timeout.ms = 30000
	buffer.memory = 33554432
	ssl.keymanager.algorithm = SunX509
	ssl.cipher.suites = null
	ssl.key.password = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	ssl.provider = null
	sasl.kerberos.service.name = null
	max.in.flight.requests.per.connection = 5
	bootstrap.servers = [gbslixaacspa04u:9092, gbslixaacspa05u:9092]
	client.id = etlpipeline
	max.request.size = 1048576
	linger.ms = 0
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	ssl.endpoint.identification.algorithm = null
	value.serializer = class org.apache.kafka.common.serialization.StringSerializer
	ssl.keystore.password = null
	key.serializer = class org.apache.kafka.common.serialization.StringSerializer
	ssl.protocol = TLS
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	ssl.trustmanager.algorithm = PKIX
	max.block.ms = 60000
	send.buffer.bytes = 131072
	partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
	reconnect.backoff.ms = 50
	metrics.num.samples = 2
	retry.backoff.ms = 100
	ssl.truststore.password = null
	batch.size = 16384
	receive.buffer.bytes = 32768
	ssl.secure.random.implementation = null
	sasl.mechanism = GSSAPI
	sasl.kerberos.ticket.renew.window.factor = 0.8
	acks = 1
	metadata.fetch.timeout.ms = 60000
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	ssl.keystore.location = null
	ssl.truststore.location = null
	block.on.buffer.full = false
	metrics.sample.window.ms = 30000
	metadata.max.age.ms = 300000
	security.protocol = PLAINTEXT
	timeout.ms = 30000
	metric.reporters = []
	compression.type = none
	ssl.truststore.type = JKS
	retries = 0
	ssl.keystore.type = JKS

17/07/27 09:07:56 INFO producer.ProducerConfig: ProducerConfig values:
	interceptor.classes = null
	request.timeout.ms = 30000
	buffer.memory = 33554432
	ssl.keymanager.algorithm = SunX509
	ssl.cipher.suites = null
	ssl.key.password = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	ssl.provider = null
	sasl.kerberos.service.name = null
	max.in.flight.requests.per.connection = 5
	bootstrap.servers = [gbslixaacspa04u:9092, gbslixaacspa05u:9092]
	client.id = etlpipeline
	max.request.size = 1048576
	linger.ms = 0
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	ssl.endpoint.identification.algorithm = null
	value.serializer = class org.apache.kafka.common.serialization.StringSerializer
	ssl.keystore.password = null
	key.serializer = class org.apache.kafka.common.serialization.StringSerializer
	ssl.protocol = TLS
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	ssl.trustmanager.algorithm = PKIX
	max.block.ms = 60000
	send.buffer.bytes = 131072
	partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
	reconnect.backoff.ms = 50
	metrics.num.samples = 2
	retry.backoff.ms = 100
	ssl.truststore.password = null
	batch.size = 16384
	receive.buffer.bytes = 32768
	ssl.secure.random.implementation = null
	sasl.mechanism = GSSAPI
	sasl.kerberos.ticket.renew.window.factor = 0.8
	acks = 1
	metadata.fetch.timeout.ms = 60000
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	ssl.keystore.location = null
	ssl.truststore.location = null
	block.on.buffer.full = false
	metrics.sample.window.ms = 30000
	metadata.max.age.ms = 300000
	security.protocol = PLAINTEXT
	timeout.ms = 30000
	metric.reporters = []
	compression.type = none
	ssl.truststore.type = JKS
	retries = 0
	ssl.keystore.type = JKS

17/07/27 09:07:56 INFO utils.AppInfoParser: Kafka version : 0.10.0-kafka-2.1.0
17/07/27 09:07:56 INFO utils.AppInfoParser: Kafka commitId : unknown
17/07/27 09:08:29 INFO executor.Executor: Finished task 1.0 in stage 0.0 (TID 1). 1563 bytes result sent to driver
17/07/27 09:08:30 INFO executor.Executor: Finished task 2.0 in stage 0.0 (TID 2). 1476 bytes result sent to driver
17/07/27 09:08:32 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 2275 bytes result sent to driver
17/07/27 09:08:33 INFO executor.Executor: Finished task 3.0 in stage 0.0 (TID 3). 1476 bytes result sent to driver