spark-worker.log
a guest
Aug 1st, 2016
16/08/01 19:31:17 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
16/08/01 19:31:18 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
16/08/01 19:31:18 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
16/08/01 19:31:18 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
16/08/01 19:31:18 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
16/08/01 19:31:18 DEBUG KerberosName: Kerberos krb5 configuration not found, setting default realm to empty
16/08/01 19:31:18 DEBUG Groups: Creating new Groups object
16/08/01 19:31:18 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
16/08/01 19:31:18 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
16/08/01 19:31:18 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
16/08/01 19:31:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/08/01 19:31:18 DEBUG JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
16/08/01 19:31:18 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
16/08/01 19:31:18 DEBUG Shell: Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
    at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:265)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:290)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
    at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:283)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:53)
    at org.apache.spark.deploy.SparkHadoopUtil$.hadoop$lzycompute(SparkHadoopUtil.scala:393)
    at org.apache.spark.deploy.SparkHadoopUtil$.hadoop(SparkHadoopUtil.scala:393)
    at org.apache.spark.deploy.SparkHadoopUtil$.get(SparkHadoopUtil.scala:413)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:151)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:253)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
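[note] The IOException above is non-fatal here (Hadoop falls back to shell-based group mapping), but it is logged because the worker was started without HADOOP_HOME or hadoop.home.dir set. A minimal sketch of the usual fix, assuming a local Hadoop install at /opt/hadoop (the path is an assumption, not taken from this log):

```shell
# Point the worker's environment at a local Hadoop install so
# org.apache.hadoop.util.Shell can detect a valid hadoop home directory.
# /opt/hadoop is an assumed location - substitute your own.
export HADOOP_HOME=/opt/hadoop

# Exposing the native libraries also addresses the earlier
# "Unable to load native-hadoop library" warning.
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native:${LD_LIBRARY_PATH:-}"
```

Persisting these lines in the worker's conf/spark-env.sh applies them to every executor the worker launches.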
16/08/01 19:31:18 DEBUG Shell: setsid exited with exit code 0
16/08/01 19:31:18 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
16/08/01 19:31:18 DEBUG SparkHadoopUtil: running as user: tavo
16/08/01 19:31:18 DEBUG UserGroupInformation: hadoop login
16/08/01 19:31:18 DEBUG UserGroupInformation: hadoop login commit
16/08/01 19:31:18 DEBUG UserGroupInformation: using local user:UnixPrincipal: tavo
16/08/01 19:31:18 DEBUG UserGroupInformation: UGI loginUser:tavo (auth:SIMPLE)
16/08/01 19:31:18 DEBUG UserGroupInformation: PrivilegedAction as:tavo (auth:SIMPLE) from:org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
16/08/01 19:31:18 INFO SecurityManager: Changing view acls to: tavo
16/08/01 19:31:18 INFO SecurityManager: Changing modify acls to: tavo
16/08/01 19:31:18 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tavo); users with modify permissions: Set(tavo)
16/08/01 19:31:18 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SecurityManager: SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/08/01 19:31:19 DEBUG SecurityManager: SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/08/01 19:31:19 DEBUG InternalLoggerFactory: Using SLF4J as the default logging framework
16/08/01 19:31:19 DEBUG PlatformDependent0: java.nio.Buffer.address: available
16/08/01 19:31:19 DEBUG PlatformDependent0: sun.misc.Unsafe.theUnsafe: available
16/08/01 19:31:19 DEBUG PlatformDependent0: sun.misc.Unsafe.copyMemory: available
16/08/01 19:31:19 DEBUG PlatformDependent0: java.nio.Bits.unaligned: true
16/08/01 19:31:19 DEBUG PlatformDependent: Java version: 8
16/08/01 19:31:19 DEBUG PlatformDependent: -Dio.netty.noUnsafe: false
16/08/01 19:31:19 DEBUG PlatformDependent: sun.misc.Unsafe: available
16/08/01 19:31:19 DEBUG PlatformDependent: -Dio.netty.noJavassist: false
16/08/01 19:31:19 DEBUG PlatformDependent: Javassist: unavailable
16/08/01 19:31:19 DEBUG PlatformDependent: You don't have Javassist in your class path or you don't have enough permission to load dynamically generated classes. Please check the configuration for better performance.
16/08/01 19:31:19 DEBUG PlatformDependent: -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
16/08/01 19:31:19 DEBUG PlatformDependent: -Dio.netty.bitMode: 64 (sun.arch.data.model)
16/08/01 19:31:19 DEBUG PlatformDependent: -Dio.netty.noPreferDirect: false
16/08/01 19:31:19 DEBUG MultithreadEventLoopGroup: -Dio.netty.eventLoopThreads: 8
16/08/01 19:31:19 DEBUG NioEventLoop: -Dio.netty.noKeySetOptimization: false
16/08/01 19:31:19 DEBUG NioEventLoop: -Dio.netty.selectorAutoRebuildThreshold: 512
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@cb0755b
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@33065d67
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@712625fd
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@7bba5817
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.numHeapArenas: 8
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.numDirectArenas: 8
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.pageSize: 8192
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.maxOrder: 11
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.chunkSize: 16777216
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.tinyCacheSize: 512
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.smallCacheSize: 256
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.normalCacheSize: 64
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.maxCachedBufferCapacity: 32768
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.cacheTrimInterval: 8192
16/08/01 19:31:19 DEBUG TransportClientFactory: Creating new connection to /192.168.0.120:35679
16/08/01 19:31:19 DEBUG ThreadLocalRandom: -Dio.netty.initialSeedUniquifier: 0x3a64bc670ccf97e6 (took 0 ms)
16/08/01 19:31:19 DEBUG ByteBufUtil: -Dio.netty.allocator.type: unpooled
16/08/01 19:31:19 DEBUG ByteBufUtil: -Dio.netty.threadLocalDirectBufferSize: 65536
16/08/01 19:31:19 DEBUG ResourceLeakDetector: -Dio.netty.leakDetectionLevel: simple
16/08/01 19:31:19 DEBUG TransportClientFactory: Connection to /192.168.0.120:35679 successful, running bootstraps...
16/08/01 19:31:19 DEBUG TransportClientFactory: Successfully created connection to /192.168.0.120:35679 after 72 ms (0 ms spent in bootstraps)
16/08/01 19:31:19 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:19 DEBUG Recycler: -Dio.netty.recycler.maxCapacity.default: 262144
16/08/01 19:31:19 TRACE TransportClient: Sending request 7409909816768870984 to /192.168.0.120:35679 took 30 ms
16/08/01 19:31:19 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=7409909816768870984, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 1024)}}
16/08/01 19:31:19 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:19 TRACE TransportClient: Sending request 5398020340206074116 to /192.168.0.120:35679 took 2 ms
16/08/01 19:31:19 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5398020340206074116, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 950, cap: 1024)}}
16/08/01 19:31:19 INFO SecurityManager: Changing view acls to: tavo
16/08/01 19:31:19 INFO SecurityManager: Changing modify acls to: tavo
16/08/01 19:31:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tavo); users with modify permissions: Set(tavo)
16/08/01 19:31:19 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SecurityManager: SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/08/01 19:31:19 DEBUG SecurityManager: SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@69ee81fc
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@6e2aa843
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@6f36c2f0
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@f58853c
16/08/01 19:31:19 DEBUG AkkaUtils: In createActorSystem, requireCookie is: off
16/08/01 19:31:19 INFO Slf4jLogger: Slf4jLogger started
16/08/01 19:31:19 INFO Remoting: Starting remoting
16/08/01 19:31:20 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutorActorSystem@192.168.0.120:36262]
16/08/01 19:31:20 INFO Utils: Successfully started service 'sparkExecutorActorSystem' on port 36262.
16/08/01 19:31:20 DEBUG SparkEnv: Using serializer: class org.apache.spark.serializer.JavaSerializer
16/08/01 19:31:20 DEBUG TransportClientFactory: Creating new connection to /192.168.0.120:35679
16/08/01 19:31:20 DEBUG TransportClientFactory: Connection to /192.168.0.120:35679 successful, running bootstraps...
16/08/01 19:31:20 DEBUG TransportClientFactory: Successfully created connection to /192.168.0.120:35679 after 2 ms (0 ms spent in bootstraps)
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 5051931935896315573 to /192.168.0.120:35679 took 5 ms
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5051931935896315573, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 1024)}}
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 5504939081173971714 to /192.168.0.120:35679 took 4 ms
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5504939081173971714, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 1024)}}
16/08/01 19:31:20 INFO DiskBlockManager: Created local directory at /tmp/spark-06afb3c9-d442-4943-bd32-9b0605fe2a38/executor-e197bc3c-a45a-42c6-a6d8-acb75c4ce978/blockmgr-9d039577-3706-4672-b94d-88cd97bfb7ad
16/08/01 19:31:20 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 8015204921409192029 to /192.168.0.120:35679 took 5 ms
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=8015204921409192029, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 512)}}
16/08/01 19:31:20 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 4615803285815396477 to /192.168.0.120:35679 took 2 ms
16/08/01 19:31:20 INFO WorkerWatcher: Connecting to worker spark://Worker@192.168.0.120:40635
16/08/01 19:31:20 DEBUG TransportClientFactory: Creating new connection to /192.168.0.120:40635
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=4615803285815396477, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 512)}}
16/08/01 19:31:20 DEBUG TransportClientFactory: Connection to /192.168.0.120:40635 successful, running bootstraps...
16/08/01 19:31:20 DEBUG TransportClientFactory: Successfully created connection to /192.168.0.120:40635 after 3 ms (0 ms spent in bootstraps)
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:40635
16/08/01 19:31:20 TRACE TransportClient: Sending request 9073159147227726331 to /192.168.0.120:40635 took 5 ms
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 5051327000037567123 to /192.168.0.120:35679 took 3 ms
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5051327000037567123, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 167, cap: 496)}}
16/08/01 19:31:20 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
16/08/01 19:31:20 INFO Executor: Starting executor ID 0 on host tip.home
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@4f5a920
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@62fbaa88
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@2d96cb29
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@3df89e4d
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=9073159147227726331, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 1024)}}
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@312cabeb
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@74c86731
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@88f4873
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@1208441d
16/08/01 19:31:20 DEBUG NetUtil: Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
16/08/01 19:31:20 DEBUG NetUtil: /proc/sys/net/core/somaxconn: 128
16/08/01 19:31:20 DEBUG TransportServer: Shuffle server started on port :35972
16/08/01 19:31:20 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35972.
16/08/01 19:31:20 INFO NettyBlockTransferService: Server created on 35972
16/08/01 19:31:20 INFO BlockManagerMaster: Trying to register BlockManager
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 5545270822933119327 to /192.168.0.120:35679 took 1 ms
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5545270822933119327, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 496)}}
16/08/01 19:31:20 INFO BlockManagerMaster: Registered BlockManager
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=8252725586650066397, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 480)}}
16/08/01 19:31:20 TRACE TransportClient: Sending request 8252725586650066397 to /192.168.0.120:35679 took 7 ms
16/08/01 19:31:37 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:37 TRACE TransportClient: Sending request 7329788795494669758 to /192.168.0.120:35679 took 1 ms
16/08/01 19:31:37 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=7329788795494669758, body=NettyManagedBuffer{buf=CompositeByteBuf(ridx: 13, widx: 94, cap: 94, components=2)}}
16/08/01 19:31:44 TRACE MessageDecoder: Received message OneWayMessage: OneWayMessage{body=NettyManagedBuffer{buf=CompositeByteBuf(ridx: 5, widx: 3720, cap: 3720, components=9)}}
16/08/01 19:31:44 INFO CoarseGrainedExecutorBackend: Got assigned task 0
16/08/01 19:31:44 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/08/01 19:31:44 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2449)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1385)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
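[note] "unread block data" during JavaSerializerInstance.deserialize means the task bytes the driver serialized could not be read back on the executor; the common cause is that driver and executor are running different Spark builds (or different Scala/Java versions, or conflicting Hadoop jars on the classpath). A hedged diagnostic sketch; the helper function and version strings below are illustrative, not taken from this log:

```shell
# Hypothetical helper: compare the version string reported on the driver
# host against the one reported on the worker host. In practice each string
# would come from running "$SPARK_HOME/bin/spark-submit --version" and
# "java -version" on the respective machine.
same_build() {
    # $1 = driver's version string, $2 = executor's version string;
    # succeed only when both are non-empty and identical.
    [ -n "$1" ] && [ "$1" = "$2" ]
}

# Example values only - substitute the real output from each host.
if same_build "1.6.2" "1.6.2"; then echo "builds match"; else echo "MISMATCH"; fi
```

If the strings differ, installing the same Spark distribution on every node (and restarting the workers) is the usual remedy.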
16/08/01 19:31:44 TRACE MessageDecoder: Received message OneWayMessage: OneWayMessage{body=NettyManagedBuffer{buf=CompositeByteBuf(ridx: 5, widx: 3720, cap: 3720, components=4)}}
16/08/01 19:31:44 INFO CoarseGrainedExecutorBackend: Got assigned task 1
16/08/01 19:31:44 INFO Executor: Running task 0.1 in stage 0.0 (TID 1)
16/08/01 19:31:44 ERROR Executor: Exception in task 0.1 in stage 0.0 (TID 1)
java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2449)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1385)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
16/08/01 19:31:44 TRACE MessageDecoder: Received message OneWayMessage: OneWayMessage{body=NettyManagedBuffer{buf=CompositeByteBuf(ridx: 5, widx: 3720, cap: 3720, components=2)}}
16/08/01 19:31:44 INFO CoarseGrainedExecutorBackend: Got assigned task 2
16/08/01 19:31:44 INFO Executor: Running task 0.2 in stage 0.0 (TID 2)
16/08/01 19:31:44 ERROR Executor: Exception in task 0.2 in stage 0.0 (TID 2)
java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2449)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1385)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
16/08/01 19:31:44 TRACE MessageDecoder: Received message OneWayMessage: OneWayMessage{body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 13, widx: 3728, cap: 8192)}}
16/08/01 19:31:44 INFO CoarseGrainedExecutorBackend: Got assigned task 3
16/08/01 19:31:44 INFO Executor: Running task 0.3 in stage 0.0 (TID 3)
16/08/01 19:31:44 ERROR Executor: Exception in task 0.3 in stage 0.0 (TID 3)
java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2449)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1385)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
16/08/01 19:31:47 TRACE TransportClient: Sending RPC to tip.home/192.168.0.120:35679
16/08/01 19:31:47 TRACE TransportClient: Sending request 7710936371238132193 to tip.home/192.168.0.120:35679 took 1 ms
16/08/01 19:31:47 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=7710936371238132193, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 102, cap: 8192)}}
16/08/01 19:31:57 TRACE TransportClient: Sending RPC to tip.home/192.168.0.120:35679
16/08/01 19:31:57 TRACE TransportClient: Sending request 6973944740578439248 to tip.home/192.168.0.120:35679 took 1 ms
16/08/01 19:31:57 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=6973944740578439248, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 102, cap: 8192)}}
16/08/01 19:32:07 TRACE TransportClient: Sending RPC to tip.home/192.168.0.120:35679
16/08/01 19:32:07 TRACE TransportClient: Sending request 8343885097453980911 to tip.home/192.168.0.120:35679 took 1 ms
16/08/01 19:32:07 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=8343885097453980911, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 102, cap: 4096)}}
16/08/01 19:32:17 TRACE TransportClient: Sending RPC to tip.home/192.168.0.120:35679
16/08/01 19:32:17 TRACE TransportClient: Sending request 5641746415653792370 to tip.home/192.168.0.120:35679 took 1 ms
16/08/01 19:32:17 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5641746415653792370, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 102, cap: 4096)}}
16/08/01 19:32:26 TRACE MessageDecoder: Received message OneWayMessage: OneWayMessage{body=NettyManagedBuffer{buf=CompositeByteBuf(ridx: 5, widx: 1099, cap: 1099, components=2)}}
16/08/01 19:32:26 INFO CoarseGrainedExecutorBackend: Driver commanded a shutdown
16/08/01 19:32:26 INFO MemoryStore: MemoryStore cleared
16/08/01 19:32:26 INFO BlockManager: BlockManager stopped
16/08/01 19:32:26 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/08/01 19:32:26 WARN CoarseGrainedExecutorBackend: An unknown (tip.home:35679) driver disconnected.
16/08/01 19:32:26 ERROR CoarseGrainedExecutorBackend: Driver 192.168.0.120:35679 disassociated! Shutting down.
16/08/01 19:32:26 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.