16/08/01 19:31:17 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
16/08/01 19:31:18 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
16/08/01 19:31:18 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
16/08/01 19:31:18 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
16/08/01 19:31:18 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
16/08/01 19:31:18 DEBUG KerberosName: Kerberos krb5 configuration not found, setting default realm to empty
16/08/01 19:31:18 DEBUG Groups: Creating new Groups object
16/08/01 19:31:18 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
16/08/01 19:31:18 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
16/08/01 19:31:18 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
16/08/01 19:31:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/08/01 19:31:18 DEBUG JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
16/08/01 19:31:18 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
16/08/01 19:31:18 DEBUG Shell: Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:265)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:290)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:283)
at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:53)
at org.apache.spark.deploy.SparkHadoopUtil$.hadoop$lzycompute(SparkHadoopUtil.scala:393)
at org.apache.spark.deploy.SparkHadoopUtil$.hadoop(SparkHadoopUtil.scala:393)
at org.apache.spark.deploy.SparkHadoopUtil$.get(SparkHadoopUtil.scala:413)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:151)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:253)
at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
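The DEBUG-level IOException above is benign (the executor falls back to built-in Java classes), but it can be silenced by telling Hadoop where its home directory is. A minimal sketch, assuming a hypothetical install path of `/opt/hadoop` (adjust to the actual location on the worker hosts):

```shell
# Hypothetical path; point this at the real Hadoop installation.
export HADOOP_HOME=/opt/hadoop

# Alternatively, set it as a JVM system property for Spark executors:
#   --conf "spark.executor.extraJavaOptions=-Dhadoop.home.dir=/opt/hadoop"

echo "HADOOP_HOME is set to: $HADOOP_HOME"
```

Either the environment variable or the `hadoop.home.dir` system property satisfies the check in `Shell.checkHadoopHome`; setting one of them is enough.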
16/08/01 19:31:18 DEBUG Shell: setsid exited with exit code 0
16/08/01 19:31:18 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
16/08/01 19:31:18 DEBUG SparkHadoopUtil: running as user: tavo
16/08/01 19:31:18 DEBUG UserGroupInformation: hadoop login
16/08/01 19:31:18 DEBUG UserGroupInformation: hadoop login commit
16/08/01 19:31:18 DEBUG UserGroupInformation: using local user:UnixPrincipal: tavo
16/08/01 19:31:18 DEBUG UserGroupInformation: UGI loginUser:tavo (auth:SIMPLE)
16/08/01 19:31:18 DEBUG UserGroupInformation: PrivilegedAction as:tavo (auth:SIMPLE) from:org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
16/08/01 19:31:18 INFO SecurityManager: Changing view acls to: tavo
16/08/01 19:31:18 INFO SecurityManager: Changing modify acls to: tavo
16/08/01 19:31:18 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tavo); users with modify permissions: Set(tavo)
16/08/01 19:31:18 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SecurityManager: SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/08/01 19:31:19 DEBUG SecurityManager: SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/08/01 19:31:19 DEBUG InternalLoggerFactory: Using SLF4J as the default logging framework
16/08/01 19:31:19 DEBUG PlatformDependent0: java.nio.Buffer.address: available
16/08/01 19:31:19 DEBUG PlatformDependent0: sun.misc.Unsafe.theUnsafe: available
16/08/01 19:31:19 DEBUG PlatformDependent0: sun.misc.Unsafe.copyMemory: available
16/08/01 19:31:19 DEBUG PlatformDependent0: java.nio.Bits.unaligned: true
16/08/01 19:31:19 DEBUG PlatformDependent: Java version: 8
16/08/01 19:31:19 DEBUG PlatformDependent: -Dio.netty.noUnsafe: false
16/08/01 19:31:19 DEBUG PlatformDependent: sun.misc.Unsafe: available
16/08/01 19:31:19 DEBUG PlatformDependent: -Dio.netty.noJavassist: false
16/08/01 19:31:19 DEBUG PlatformDependent: Javassist: unavailable
16/08/01 19:31:19 DEBUG PlatformDependent: You don't have Javassist in your class path or you don't have enough permission to load dynamically generated classes. Please check the configuration for better performance.
16/08/01 19:31:19 DEBUG PlatformDependent: -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
16/08/01 19:31:19 DEBUG PlatformDependent: -Dio.netty.bitMode: 64 (sun.arch.data.model)
16/08/01 19:31:19 DEBUG PlatformDependent: -Dio.netty.noPreferDirect: false
16/08/01 19:31:19 DEBUG MultithreadEventLoopGroup: -Dio.netty.eventLoopThreads: 8
16/08/01 19:31:19 DEBUG NioEventLoop: -Dio.netty.noKeySetOptimization: false
16/08/01 19:31:19 DEBUG NioEventLoop: -Dio.netty.selectorAutoRebuildThreshold: 512
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@cb0755b
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@33065d67
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@712625fd
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@7bba5817
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.numHeapArenas: 8
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.numDirectArenas: 8
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.pageSize: 8192
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.maxOrder: 11
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.chunkSize: 16777216
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.tinyCacheSize: 512
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.smallCacheSize: 256
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.normalCacheSize: 64
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.maxCachedBufferCapacity: 32768
16/08/01 19:31:19 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.cacheTrimInterval: 8192
16/08/01 19:31:19 DEBUG TransportClientFactory: Creating new connection to /192.168.0.120:35679
16/08/01 19:31:19 DEBUG ThreadLocalRandom: -Dio.netty.initialSeedUniquifier: 0x3a64bc670ccf97e6 (took 0 ms)
16/08/01 19:31:19 DEBUG ByteBufUtil: -Dio.netty.allocator.type: unpooled
16/08/01 19:31:19 DEBUG ByteBufUtil: -Dio.netty.threadLocalDirectBufferSize: 65536
16/08/01 19:31:19 DEBUG ResourceLeakDetector: -Dio.netty.leakDetectionLevel: simple
16/08/01 19:31:19 DEBUG TransportClientFactory: Connection to /192.168.0.120:35679 successful, running bootstraps...
16/08/01 19:31:19 DEBUG TransportClientFactory: Successfully created connection to /192.168.0.120:35679 after 72 ms (0 ms spent in bootstraps)
16/08/01 19:31:19 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:19 DEBUG Recycler: -Dio.netty.recycler.maxCapacity.default: 262144
16/08/01 19:31:19 TRACE TransportClient: Sending request 7409909816768870984 to /192.168.0.120:35679 took 30 ms
16/08/01 19:31:19 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=7409909816768870984, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 1024)}}
16/08/01 19:31:19 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:19 TRACE TransportClient: Sending request 5398020340206074116 to /192.168.0.120:35679 took 2 ms
16/08/01 19:31:19 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5398020340206074116, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 950, cap: 1024)}}
16/08/01 19:31:19 INFO SecurityManager: Changing view acls to: tavo
16/08/01 19:31:19 INFO SecurityManager: Changing modify acls to: tavo
16/08/01 19:31:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tavo); users with modify permissions: Set(tavo)
16/08/01 19:31:19 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SSLOptions: No SSL protocol specified
16/08/01 19:31:19 DEBUG SecurityManager: SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/08/01 19:31:19 DEBUG SecurityManager: SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@69ee81fc
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@6e2aa843
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@6f36c2f0
16/08/01 19:31:19 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@f58853c
16/08/01 19:31:19 DEBUG AkkaUtils: In createActorSystem, requireCookie is: off
16/08/01 19:31:19 INFO Slf4jLogger: Slf4jLogger started
16/08/01 19:31:19 INFO Remoting: Starting remoting
16/08/01 19:31:20 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutorActorSystem@192.168.0.120:36262]
16/08/01 19:31:20 INFO Utils: Successfully started service 'sparkExecutorActorSystem' on port 36262.
16/08/01 19:31:20 DEBUG SparkEnv: Using serializer: class org.apache.spark.serializer.JavaSerializer
16/08/01 19:31:20 DEBUG TransportClientFactory: Creating new connection to /192.168.0.120:35679
16/08/01 19:31:20 DEBUG TransportClientFactory: Connection to /192.168.0.120:35679 successful, running bootstraps...
16/08/01 19:31:20 DEBUG TransportClientFactory: Successfully created connection to /192.168.0.120:35679 after 2 ms (0 ms spent in bootstraps)
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 5051931935896315573 to /192.168.0.120:35679 took 5 ms
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5051931935896315573, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 1024)}}
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 5504939081173971714 to /192.168.0.120:35679 took 4 ms
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5504939081173971714, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 1024)}}
16/08/01 19:31:20 INFO DiskBlockManager: Created local directory at /tmp/spark-06afb3c9-d442-4943-bd32-9b0605fe2a38/executor-e197bc3c-a45a-42c6-a6d8-acb75c4ce978/blockmgr-9d039577-3706-4672-b94d-88cd97bfb7ad
16/08/01 19:31:20 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 8015204921409192029 to /192.168.0.120:35679 took 5 ms
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=8015204921409192029, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 512)}}
16/08/01 19:31:20 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 4615803285815396477 to /192.168.0.120:35679 took 2 ms
16/08/01 19:31:20 INFO WorkerWatcher: Connecting to worker spark://Worker@192.168.0.120:40635
16/08/01 19:31:20 DEBUG TransportClientFactory: Creating new connection to /192.168.0.120:40635
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=4615803285815396477, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 512)}}
16/08/01 19:31:20 DEBUG TransportClientFactory: Connection to /192.168.0.120:40635 successful, running bootstraps...
16/08/01 19:31:20 DEBUG TransportClientFactory: Successfully created connection to /192.168.0.120:40635 after 3 ms (0 ms spent in bootstraps)
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:40635
16/08/01 19:31:20 TRACE TransportClient: Sending request 9073159147227726331 to /192.168.0.120:40635 took 5 ms
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 5051327000037567123 to /192.168.0.120:35679 took 3 ms
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5051327000037567123, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 167, cap: 496)}}
16/08/01 19:31:20 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
16/08/01 19:31:20 INFO Executor: Starting executor ID 0 on host tip.home
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@4f5a920
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@62fbaa88
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@2d96cb29
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@3df89e4d
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=9073159147227726331, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 1024)}}
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@312cabeb
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@74c86731
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@88f4873
16/08/01 19:31:20 TRACE NioEventLoop: Instrumented an optimized java.util.Set into: sun.nio.ch.EPollSelectorImpl@1208441d
16/08/01 19:31:20 DEBUG NetUtil: Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
16/08/01 19:31:20 DEBUG NetUtil: /proc/sys/net/core/somaxconn: 128
16/08/01 19:31:20 DEBUG TransportServer: Shuffle server started on port :35972
16/08/01 19:31:20 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35972.
16/08/01 19:31:20 INFO NettyBlockTransferService: Server created on 35972
16/08/01 19:31:20 INFO BlockManagerMaster: Trying to register BlockManager
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE TransportClient: Sending request 5545270822933119327 to /192.168.0.120:35679 took 1 ms
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5545270822933119327, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 496)}}
16/08/01 19:31:20 INFO BlockManagerMaster: Registered BlockManager
16/08/01 19:31:20 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:20 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=8252725586650066397, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 68, cap: 480)}}
16/08/01 19:31:20 TRACE TransportClient: Sending request 8252725586650066397 to /192.168.0.120:35679 took 7 ms
16/08/01 19:31:37 TRACE TransportClient: Sending RPC to /192.168.0.120:35679
16/08/01 19:31:37 TRACE TransportClient: Sending request 7329788795494669758 to /192.168.0.120:35679 took 1 ms
16/08/01 19:31:37 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=7329788795494669758, body=NettyManagedBuffer{buf=CompositeByteBuf(ridx: 13, widx: 94, cap: 94, components=2)}}
16/08/01 19:31:44 TRACE MessageDecoder: Received message OneWayMessage: OneWayMessage{body=NettyManagedBuffer{buf=CompositeByteBuf(ridx: 5, widx: 3720, cap: 3720, components=9)}}
16/08/01 19:31:44 INFO CoarseGrainedExecutorBackend: Got assigned task 0
16/08/01 19:31:44 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/08/01 19:31:44 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.IllegalStateException: unread block data
at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2449)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1385)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
16/08/01 19:31:44 TRACE MessageDecoder: Received message OneWayMessage: OneWayMessage{body=NettyManagedBuffer{buf=CompositeByteBuf(ridx: 5, widx: 3720, cap: 3720, components=4)}}
16/08/01 19:31:44 INFO CoarseGrainedExecutorBackend: Got assigned task 1
16/08/01 19:31:44 INFO Executor: Running task 0.1 in stage 0.0 (TID 1)
16/08/01 19:31:44 ERROR Executor: Exception in task 0.1 in stage 0.0 (TID 1)
java.lang.IllegalStateException: unread block data
at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2449)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1385)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
16/08/01 19:31:44 TRACE MessageDecoder: Received message OneWayMessage: OneWayMessage{body=NettyManagedBuffer{buf=CompositeByteBuf(ridx: 5, widx: 3720, cap: 3720, components=2)}}
16/08/01 19:31:44 INFO CoarseGrainedExecutorBackend: Got assigned task 2
16/08/01 19:31:44 INFO Executor: Running task 0.2 in stage 0.0 (TID 2)
16/08/01 19:31:44 ERROR Executor: Exception in task 0.2 in stage 0.0 (TID 2)
java.lang.IllegalStateException: unread block data
at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2449)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1385)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
16/08/01 19:31:44 TRACE MessageDecoder: Received message OneWayMessage: OneWayMessage{body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 13, widx: 3728, cap: 8192)}}
16/08/01 19:31:44 INFO CoarseGrainedExecutorBackend: Got assigned task 3
16/08/01 19:31:44 INFO Executor: Running task 0.3 in stage 0.0 (TID 3)
16/08/01 19:31:44 ERROR Executor: Exception in task 0.3 in stage 0.0 (TID 3)
java.lang.IllegalStateException: unread block data
at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2449)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1385)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
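The repeated `java.lang.IllegalStateException: unread block data` thrown while `JavaSerializerInstance.deserialize` unpacks the task is a classic symptom of a classpath mismatch between driver and executor: commonly different Spark (or Scala) builds on the two hosts, or the application jar missing from the executor's classpath. A hedged sketch of the first check to run, with placeholder version strings standing in for what `spark-submit --version` would report on each host:

```shell
# Placeholder values; in practice, capture these by running
# `spark-submit --version` on the driver host and on each worker host.
driver_version="1.6.2"
executor_version="1.6.2"

# Driver and executors must run byte-identical Spark builds
# (same Spark version AND same Scala version).
if [ "$driver_version" != "$executor_version" ]; then
    echo "version mismatch: driver=$driver_version executor=$executor_version" >&2
    exit 1
fi
echo "versions match: $driver_version"
```

If the versions do match, the next suspect is the application jar itself: it must reach the executors, e.g. via `spark-submit --jars` or by placing it on every worker and pointing `spark.executor.extraClassPath` at it.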
16/08/01 19:31:47 TRACE TransportClient: Sending RPC to tip.home/192.168.0.120:35679
16/08/01 19:31:47 TRACE TransportClient: Sending request 7710936371238132193 to tip.home/192.168.0.120:35679 took 1 ms
16/08/01 19:31:47 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=7710936371238132193, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 102, cap: 8192)}}
16/08/01 19:31:57 TRACE TransportClient: Sending RPC to tip.home/192.168.0.120:35679
16/08/01 19:31:57 TRACE TransportClient: Sending request 6973944740578439248 to tip.home/192.168.0.120:35679 took 1 ms
16/08/01 19:31:57 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=6973944740578439248, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 102, cap: 8192)}}
16/08/01 19:32:07 TRACE TransportClient: Sending RPC to tip.home/192.168.0.120:35679
16/08/01 19:32:07 TRACE TransportClient: Sending request 8343885097453980911 to tip.home/192.168.0.120:35679 took 1 ms
16/08/01 19:32:07 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=8343885097453980911, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 102, cap: 4096)}}
16/08/01 19:32:17 TRACE TransportClient: Sending RPC to tip.home/192.168.0.120:35679
16/08/01 19:32:17 TRACE TransportClient: Sending request 5641746415653792370 to tip.home/192.168.0.120:35679 took 1 ms
16/08/01 19:32:17 TRACE MessageDecoder: Received message RpcResponse: RpcResponse{requestId=5641746415653792370, body=NettyManagedBuffer{buf=PooledUnsafeDirectByteBuf(ridx: 21, widx: 102, cap: 4096)}}
16/08/01 19:32:26 TRACE MessageDecoder: Received message OneWayMessage: OneWayMessage{body=NettyManagedBuffer{buf=CompositeByteBuf(ridx: 5, widx: 1099, cap: 1099, components=2)}}
16/08/01 19:32:26 INFO CoarseGrainedExecutorBackend: Driver commanded a shutdown
16/08/01 19:32:26 INFO MemoryStore: MemoryStore cleared
16/08/01 19:32:26 INFO BlockManager: BlockManager stopped
16/08/01 19:32:26 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/08/01 19:32:26 WARN CoarseGrainedExecutorBackend: An unknown (tip.home:35679) driver disconnected.
16/08/01 19:32:26 ERROR CoarseGrainedExecutorBackend: Driver 192.168.0.120:35679 disassociated! Shutting down.
16/08/01 19:32:26 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.