- =~=~=~=~=~=~=~=~=~=~=~= PuTTY log 2019.11.05 08:18:19 =~=~=~=~=~=~=~=~=~=~=~=
- SLF4J: Class path contains multiple SLF4J bindings.
- SLF4J: Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
- SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
- SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
- SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]
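The SLF4J warning above is emitted because two binding jars (slf4j-nop and slf4j-log4j12) are both on the classpath, and the NOP binding wins, silencing logging. The usual remedy is to remove all but one binding jar. A minimal sketch (not part of the tool) for locating the competing jars, using the directories shown in the log as examples:

```python
# Sketch: find SLF4J binding jars so the extra ones can be removed.
# The two directories below mirror this log; adjust for your install.
from pathlib import Path

def find_slf4j_binding_jars(dirs):
    """Return jar paths whose names look like SLF4J bindings.

    slf4j-api-*.jar is the facade and is required; any other
    slf4j-*.jar (nop, log4j12, simple, ...) is a binding, and only
    one binding should remain on the classpath.
    """
    bindings = []
    for d in dirs:
        p = Path(d)
        if not p.is_dir():
            continue  # skip paths that do not exist on this host
        for jar in p.glob("slf4j-*.jar"):
            if not jar.name.startswith("slf4j-api"):
                bindings.append(str(jar))
    return sorted(bindings)

if __name__ == "__main__":
    for jar in find_slf4j_binding_jars([
        "/opt/Spark/extraLibraries",
        "/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars",
    ]):
        print(jar)  # delete or move aside all but one of these
```

In this log the NOPLoggerFactory was chosen, so deleting `slf4j-nop-1.7.25.jar` from `extraLibraries` would restore the log4j binding that ships with Spark.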
- ---- CREATING SPARK Session:
- warehouseLocation:/data2/spark-warehouse
- +------------+--------+--------------------+----+------------+
- | fwSerial|panosver| csvpath|size|afterProcess|
- +------------+--------+--------------------+----+------------+
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4895| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5028| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2366| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2038| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4854| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2796| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5264| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4414| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4721| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2376| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2622| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5039| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2499| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4742| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2468| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4660| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2335| Delete|
- +------------+--------+--------------------+----+------------+
- Memory: 1160m
- LogCollector&Compacter called with the following parameters:
- Parameters for execution
- Master[processes]:............ local[1]
- Available RAM (MB):........... 1187840
- User:......................... admin
- debug:........................ false
- Parameters for Job Connections
- Task ID:...................... 2106
- My IP:........................ 10.4.23.43
- Expedition IP:................ 10.4.23.43:3306
- Time Zone:.................... Europe/Helsinki
- dbUser (dbPassword):.......... root (************)
- projectName:.................. demo
- Parameters for Data Sources
- App Categories (source):........ (Expedition)
- CSV Files Path:................./tmp/1572897807_traffic_files.csv
- Parquet output path:.......... file:///myLogs/connections.parquet
- Temporary folder:............. /data2
- ---- AppID DB LOAD:
- Application Categories loading...
- Application Categories loaded
- +------------+--------+--------------------+----+------------+--------+---+---------------+
- | fwSerial|panosver| csvpath|size|afterProcess| grouped|row|accumulatedSize|
- +------------+--------+--------------------+----+------------+--------+---+---------------+
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4895| Delete|grouping| 1| 4895.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|grouping| 2| 9596.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|grouping| 3| 14061.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|grouping| 4| 18762.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5028| Delete|grouping| 5| 23790.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2366| Delete|grouping| 6| 26156.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2038| Delete|grouping| 7| 28194.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4854| Delete|grouping| 8| 33048.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2796| Delete|grouping| 9| 35844.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|grouping| 10| 40309.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5264| Delete|grouping| 11| 45573.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4414| Delete|grouping| 12| 49987.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4721| Delete|grouping| 13| 54708.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|grouping| 14| 59654.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2376| Delete|grouping| 15| 62030.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|grouping| 16| 66915.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2622| Delete|grouping| 17| 69537.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5039| Delete|grouping| 18| 74576.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|grouping| 19| 79461.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2499| Delete|grouping| 20| 81960.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4742| Delete|grouping| 21| 86702.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2468| Delete|grouping| 22| 89170.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4660| Delete|grouping| 23| 93830.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|grouping| 24| 98776.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2335| Delete|grouping| 25| 101111.0|
- +------------+--------+--------------------+----+------------+--------+---+---------------+
- Selection criteria: 0 < accumulatedSize and accumulatedSize <= 1187840
- Processing from lowLimit:0 to highLimit:1187840 with StepLine:1187840
- Few logs can fit in this batch:25
- 8.1.0:/myLogs/PCC-CORP-PA1_traffic_2019_10_25_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_04_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_31_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_27_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_22_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_30_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_29_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_15_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_16_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_02_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_21_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_18_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_20_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_12_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_03_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_01_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_17_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_14_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_26_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_13_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_11_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_24_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_28_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_19_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_23_last_calendar_day.csv
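The table and "Selection criteria" lines above imply the batching logic: files are taken in order, a running total of their sizes is kept (the accumulatedSize column), and a batch admits every file whose running total stays within the limit (here the available RAM, 1187840). A minimal sketch of that selection, using the sizes printed in the table — this is a reconstruction from the log output, not the tool's actual code:

```python
# Sketch of the batch selection implied by the log: keep a running total of
# file sizes and take every file whose accumulated size fits in the window
# (lowLimit, highLimit], where highLimit is the available RAM.
from itertools import accumulate

# The 25 "size" values from the DataFrame above, in row order.
sizes = [4895, 4701, 4465, 4701, 5028, 2366, 2038, 4854, 2796, 4465,
         5264, 4414, 4721, 4946, 2376, 4885, 2622, 5039, 4885, 2499,
         4742, 2468, 4660, 4946, 2335]

def select_batch(sizes, low_limit, high_limit):
    """Indices of files whose running size total falls in (low_limit, high_limit]."""
    return [i for i, acc in enumerate(accumulate(sizes))
            if low_limit < acc <= high_limit]

batch = select_batch(sizes, 0, 1187840)
print(len(batch), sum(sizes[i] for i in batch))  # prints: 25 101111
```

With a 1187840 limit against a 101111 total, all 25 logs fit in a single batch, which matches "Few logs can fit in this batch:25" above.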
- Logs of format 7.1.x NOT found
- Logs of format 8.0.2 NOT found
- Logs of format 8.1.0-beta17 NOT found
- Logs of format 8.1.0 found
- Logs of format 9.0.0 NOT found
- Logs of format 9.1.0-beta NOT found
- Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 595 in stage 188.0 failed 1 times, most recent failure: Lost task 595.0 in stage 188.0 (TID 6943, localhost, executor driver): com.esotericsoftware.kryo.KryoException: java.io.IOException: No space left on device
- Serialization trace:
- buffers (org.apache.spark.sql.execution.columnar.CachedBatch)
- at com.esotericsoftware.kryo.io.Output.flush(Output.java:188)
- at com.esotericsoftware.kryo.io.Output.require(Output.java:164)
- at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:251)
- at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:237)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:49)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:38)
- at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:629)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:332)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:302)
- at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:629)
- at com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:86)
- at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:508)
- at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:651)
- at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:241)
- at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:140)
- at org.apache.spark.serializer.SerializerManager.dataSerializeStream(SerializerManager.scala:174)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1$$anonfun$apply$7.apply(BlockManager.scala:1174)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1$$anonfun$apply$7.apply(BlockManager.scala:1172)
- at org.apache.spark.storage.DiskStore.put(DiskStore.scala:69)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1172)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
- at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
- at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
- at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
- at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
- at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
- at org.apache.spark.scheduler.Task.run(Task.scala:121)
- at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
- at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
- at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
- at java.lang.Thread.run(Thread.java:748)
- Caused by: java.io.IOException: No space left on device
- at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
- at sun.nio.ch.FileDispatcherImpl.write(FileDispatcherImpl.java:60)
- at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
- at sun.nio.ch.IOUtil.write(IOUtil.java:65)
- at sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:211)
- at org.apache.spark.storage.CountingWritableChannel.write(DiskStore.scala:332)
- at java.nio.channels.Channels.writeFullyImpl(Channels.java:78)
- at java.nio.channels.Channels.writeFully(Channels.java:101)
- at java.nio.channels.Channels.access$000(Channels.java:61)
- at java.nio.channels.Channels$1.write(Channels.java:174)
- at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
- at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
- at com.esotericsoftware.kryo.io.Output.flush(Output.java:186)
- ... 46 more
- Driver stacktrace:
- at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
- at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
- at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
- at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
- at scala.Option.foreach(Option.scala:257)
- at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
- at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
- at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
- at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
- at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
- at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
- at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
- at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:945)
- at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
- at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
- at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
- at org.apache.spark.rdd.RDD.collect(RDD.scala:944)
- at com.paloaltonetworks.tbd.LogCollectorCompacter$.processSubFiles(LogCollectorCompacter.scala:1355)
- at com.paloaltonetworks.tbd.LogCollectorCompacter$.main(LogCollectorCompacter.scala:471)
- at com.paloaltonetworks.tbd.LogCollectorCompacter.main(LogCollectorCompacter.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
- at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
- at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
- at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
- at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
- at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: com.esotericsoftware.kryo.KryoException: java.io.IOException: No space left on device
- Serialization trace:
- buffers (org.apache.spark.sql.execution.columnar.CachedBatch)
- at com.esotericsoftware.kryo.io.Output.flush(Output.java:188)
- at com.esotericsoftware.kryo.io.Output.require(Output.java:164)
- at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:251)
- at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:237)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:49)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:38)
- at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:629)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:332)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:302)
- at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:629)
- at com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:86)
- at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:508)
- at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:651)
- at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:241)
- at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:140)
- at org.apache.spark.serializer.SerializerManager.dataSerializeStream(SerializerManager.scala:174)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1$$anonfun$apply$7.apply(BlockManager.scala:1174)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1$$anonfun$apply$7.apply(BlockManager.scala:1172)
- at org.apache.spark.storage.DiskStore.put(DiskStore.scala:69)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1172)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
- at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
- at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
- at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
- at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
- at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
- at org.apache.spark.scheduler.Task.run(Task.scala:121)
- at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
- at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
- at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
- at java.lang.Thread.run(Thread.java:748)
- Caused by: java.io.IOException: No space left on device
- at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
- at sun.nio.ch.FileDispatcherImpl.write(FileDispatcherImpl.java:60)
- at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
- at sun.nio.ch.IOUtil.write(IOUtil.java:65)
- at sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:211)
- at org.apache.spark.storage.CountingWritableChannel.write(DiskStore.scala:332)
- at java.nio.channels.Channels.writeFullyImpl(Channels.java:78)
- at java.nio.channels.Channels.writeFully(Channels.java:101)
- at java.nio.channels.Channels.access$000(Channels.java:61)
- at java.nio.channels.Channels$1.write(Channels.java:174)
- at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
- at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
- at com.esotericsoftware.kryo.io.Output.flush(Output.java:186)
- ... 46 more
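The root cause above is `java.io.IOException: No space left on device`, raised while Spark's DiskStore spills cached blocks — the writes land on the temporary folder (`/data2` in the parameters). Before retrying, it is worth confirming the scratch volume actually has headroom for the batch; a stdlib-only sketch (the path and margin are assumptions, not part of the tool):

```python
# Sketch: check that the Spark scratch volume has headroom before re-running.
# "/data2" is the "Temporary folder" from the log; swap in your own path.
import os
import shutil

def free_mb(path):
    """Free space on the filesystem holding `path`, in megabytes."""
    return shutil.disk_usage(path).free // (1024 * 1024)

def has_headroom(path, needed_mb, margin=2.0):
    # Spark's serialized cached blocks and shuffle files can exceed the raw
    # CSV bytes, so demand a safety margin over the batch size (assumed 2x).
    return free_mb(path) >= needed_mb * margin

if __name__ == "__main__":
    scratch = "/data2"      # Temporary folder from the log
    batch_mb = 101111       # accumulatedSize of the failed batch
    if os.path.isdir(scratch):
        print(has_headroom(scratch, batch_mb))
```

If the volume is simply too small, freeing space on it or pointing the temporary folder (and hence, presumably, Spark's `spark.local.dir`) at a larger disk are the two obvious fixes; the second run below fails identically because neither was done.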
- PHP Notice: Trying to get property of non-object in /var/www/html/API/convertCSVtoParquet_afterProcess.php on line 22
- SLF4J: Class path contains multiple SLF4J bindings.
- SLF4J: Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
- SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
- SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
- SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]
- ---- CREATING SPARK Session:
- warehouseLocation:/data2/spark-warehouse
- +------------+--------+--------------------+----+------------+
- | fwSerial|panosver| csvpath|size|afterProcess|
- +------------+--------+--------------------+----+------------+
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4895| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5028| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2366| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2038| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4854| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2796| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5264| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4414| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4721| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2376| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2622| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5039| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2499| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4742| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2468| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4660| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2335| Delete|
- +------------+--------+--------------------+----+------------+
- Memory: 1160m
- LogCollector&Compacter called with the following parameters:
- Parameters for execution
- Master[processes]:............ local[1]
- Available RAM (MB):........... 1187840
- User:......................... admin
- debug:........................ false
- Parameters for Job Connections
- Task ID:...................... 2107
- My IP:........................ 10.4.23.43
- Expedition IP:................ 10.4.23.43:3306
- Time Zone:.................... Europe/Helsinki
- dbUser (dbPassword):.......... root (************)
- projectName:.................. demo
- Parameters for Data Sources
- App Categories (source):........ (Expedition)
- CSV Files Path:................./tmp/1572904912_traffic_files.csv
- Parquet output path:.......... file:///myLogs/connections.parquet
- Temporary folder:............. /data2
- ---- AppID DB LOAD:
- Application Categories loading...
- Application Categories loaded
- +------------+--------+--------------------+----+------------+--------+---+---------------+
- | fwSerial|panosver| csvpath|size|afterProcess| grouped|row|accumulatedSize|
- +------------+--------+--------------------+----+------------+--------+---+---------------+
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4895| Delete|grouping| 1| 4895.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|grouping| 2| 9596.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|grouping| 3| 14061.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|grouping| 4| 18762.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5028| Delete|grouping| 5| 23790.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2366| Delete|grouping| 6| 26156.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2038| Delete|grouping| 7| 28194.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4854| Delete|grouping| 8| 33048.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2796| Delete|grouping| 9| 35844.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|grouping| 10| 40309.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5264| Delete|grouping| 11| 45573.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4414| Delete|grouping| 12| 49987.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4721| Delete|grouping| 13| 54708.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|grouping| 14| 59654.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2376| Delete|grouping| 15| 62030.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|grouping| 16| 66915.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2622| Delete|grouping| 17| 69537.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5039| Delete|grouping| 18| 74576.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|grouping| 19| 79461.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2499| Delete|grouping| 20| 81960.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4742| Delete|grouping| 21| 86702.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2468| Delete|grouping| 22| 89170.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4660| Delete|grouping| 23| 93830.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|grouping| 24| 98776.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2335| Delete|grouping| 25| 101111.0|
- +------------+--------+--------------------+----+------------+--------+---+---------------+
- Selection criteria: 0 < accumulatedSize and accumulatedSize <= 1187840
- Processing from lowLimit:0 to highLimit:1187840 with StepLine:1187840
- Few logs can fit in this batch:25
- 8.1.0:/myLogs/PCC-CORP-PA1_traffic_2019_10_25_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_04_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_31_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_27_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_22_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_30_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_29_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_15_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_16_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_02_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_21_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_18_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_20_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_12_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_03_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_01_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_17_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_14_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_26_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_13_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_11_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_24_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_28_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_19_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_23_last_calendar_day.csv
- Logs of format 7.1.x NOT found
- Logs of format 8.0.2 NOT found
- Logs of format 8.1.0-beta17 NOT found
- Logs of format 8.1.0 found
- Logs of format 9.0.0 NOT found
- Logs of format 9.1.0-beta NOT found
- Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 595 in stage 188.0 failed 1 times, most recent failure: Lost task 595.0 in stage 188.0 (TID 6943, localhost, executor driver): com.esotericsoftware.kryo.KryoException: java.io.IOException: No space left on device
- Serialization trace:
- buffers (org.apache.spark.sql.execution.columnar.CachedBatch)
- at com.esotericsoftware.kryo.io.Output.flush(Output.java:188)
- at com.esotericsoftware.kryo.io.Output.require(Output.java:164)
- at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:251)
- at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:237)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:49)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:38)
- at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:629)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:332)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:302)
- at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:629)
- at com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:86)
- at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:508)
- at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:651)
- at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:241)
- at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:140)
- at org.apache.spark.serializer.SerializerManager.dataSerializeStream(SerializerManager.scala:174)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1$$anonfun$apply$7.apply(BlockManager.scala:1174)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1$$anonfun$apply$7.apply(BlockManager.scala:1172)
- at org.apache.spark.storage.DiskStore.put(DiskStore.scala:69)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1172)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
- at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
- at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
- at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
- at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
- at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
- at org.apache.spark.scheduler.Task.run(Task.scala:121)
- at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
- at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
- at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
- at java.lang.Thread.run(Thread.java:748)
- Caused by: java.io.IOException: No space left on device
- at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
- at sun.nio.ch.FileDispatcherImpl.write(FileDispatcherImpl.java:60)
- at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
- at sun.nio.ch.IOUtil.write(IOUtil.java:65)
- at sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:211)
- at org.apache.spark.storage.CountingWritableChannel.write(DiskStore.scala:332)
- at java.nio.channels.Channels.writeFullyImpl(Channels.java:78)
- at java.nio.channels.Channels.writeFully(Channels.java:101)
- at java.nio.channels.Channels.access$000(Channels.java:61)
- at java.nio.channels.Channels$1.write(Channels.java:174)
- at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
- at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
- at com.esotericsoftware.kryo.io.Output.flush(Output.java:186)
- ... 46 more
- Driver stacktrace:
- at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
- at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
- at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
- at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
- at scala.Option.foreach(Option.scala:257)
- at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
- at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
- at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
- at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
- at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
- at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
- at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
- at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:945)
- at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
- at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
- at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
- at org.apache.spark.rdd.RDD.collect(RDD.scala:944)
- at com.paloaltonetworks.tbd.LogCollectorCompacter$.processSubFiles(LogCollectorCompacter.scala:1355)
- at com.paloaltonetworks.tbd.LogCollectorCompacter$.main(LogCollectorCompacter.scala:471)
- at com.paloaltonetworks.tbd.LogCollectorCompacter.main(LogCollectorCompacter.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
- at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
- at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
- at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
- at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
- at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: com.esotericsoftware.kryo.KryoException: java.io.IOException: No space left on device
- Serialization trace:
- buffers (org.apache.spark.sql.execution.columnar.CachedBatch)
- at com.esotericsoftware.kryo.io.Output.flush(Output.java:188)
- at com.esotericsoftware.kryo.io.Output.require(Output.java:164)
- at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:251)
- at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:237)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:49)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:38)
- at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:629)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:332)
- at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:302)
- at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:629)
- at com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:86)
- at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:508)
- at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:651)
- at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:241)
- at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:140)
- at org.apache.spark.serializer.SerializerManager.dataSerializeStream(SerializerManager.scala:174)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1$$anonfun$apply$7.apply(BlockManager.scala:1174)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1$$anonfun$apply$7.apply(BlockManager.scala:1172)
- at org.apache.spark.storage.DiskStore.put(DiskStore.scala:69)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1172)
- at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
- at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
- at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
- at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
- at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
- at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
- at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
- at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
- at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
- at org.apache.spark.scheduler.Task.run(Task.scala:121)
- at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
- at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
- at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
- at java.lang.Thread.run(Thread.java:748)
- Caused by: java.io.IOException: No space left on device
- at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
- at sun.nio.ch.FileDispatcherImpl.write(FileDispatcherImpl.java:60)
- at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
- at sun.nio.ch.IOUtil.write(IOUtil.java:65)
- at sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:211)
- at org.apache.spark.storage.CountingWritableChannel.write(DiskStore.scala:332)
- at java.nio.channels.Channels.writeFullyImpl(Channels.java:78)
- at java.nio.channels.Channels.writeFully(Channels.java:101)
- at java.nio.channels.Channels.access$000(Channels.java:61)
- at java.nio.channels.Channels$1.write(Channels.java:174)
- at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
- at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
- at com.esotericsoftware.kryo.io.Output.flush(Output.java:186)
- ... 46 more
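The failure above is not a Spark logic error: `DiskStore.put` hit `java.io.IOException: No space left on device` while spilling cached `CachedBatch` blocks, i.e. the volume backing Spark's scratch directory filled up (this run reports `Temporary folder: /data2`). Before retrying, it helps to verify free space on that volume rather than on `/`. A minimal sketch, assuming nothing beyond the Python standard library (the path and the 2 GiB threshold are illustrative, not taken from the log):

```python
import shutil

def enough_scratch_space(path: str, needed_bytes: int) -> bool:
    """Return True if the filesystem holding `path` has at least
    `needed_bytes` free. Spark spills cached blocks and shuffle files
    to spark.local.dir, so that volume is the one to check."""
    free = shutil.disk_usage(path).free
    return free >= needed_bytes

# Example: require 2 GiB free on the scratch volume before resubmitting.
if not enough_scratch_space("/tmp", 2 * 1024**3):
    print("Insufficient scratch space: clean up the volume or point "
          "spark.local.dir at a larger one before retrying the job.")
```

If the volume genuinely is too small, the usual remedies are deleting leftover `blockmgr-*`/`spark-*` directories from aborted runs or pointing `spark.local.dir` at a larger filesystem via `--conf spark.local.dir=...` on `spark-submit`.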
- PHP Notice: Trying to get property of non-object in /var/www/html/API/convertCSVtoParquet_afterProcess.php on line 22
-   (the PHP Notice above was emitted 25 times in total)
- SLF4J: Class path contains multiple SLF4J bindings.
- SLF4J: Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
- SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
- SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
- SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]
- ---- CREATING SPARK Session:
- warehouseLocation:/data2/spark-warehouse
- +------------+--------+--------------------+----+------------+
- | fwSerial|panosver| csvpath|size|afterProcess|
- +------------+--------+--------------------+----+------------+
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4895| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5028| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2366| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2038| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4854| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2796| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5264| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4414| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4721| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2376| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2622| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5039| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2499| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4875| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4742| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2468| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4660| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2335| Delete|
- +------------+--------+--------------------+----+------------+
- Memory: 1160m
- LogCollector&Compacter called with the following parameters:
- Parameters for execution
- Master[processes]:............ local[1]
- Available RAM (MB):........... 1187840
- User:......................... admin
- debug:........................ false
- Parameters for Job Connections
- Task ID:...................... 2110
- My IP:........................ 10.4.23.43
- Expedition IP:................ 10.4.23.43:3306
- Time Zone:.................... Europe/Helsinki
- dbUser (dbPassword):.......... root (************)
- projectName:.................. demo
- Parameters for Data Sources
- App Categories (source):........ (Expedition)
- CSV Files Path:................./tmp/1572963212_traffic_files.csv
- Parquet output path:.......... file:///myLogs/connections.parquet
- Temporary folder:............. /data2
- ---- AppID DB LOAD:
- Application Categories loading...
- Application Categories loaded
- +------------+--------+--------------------+----+------------+--------+---+---------------+
- | fwSerial|panosver| csvpath|size|afterProcess| grouped|row|accumulatedSize|
- +------------+--------+--------------------+----+------------+--------+---+---------------+
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4895| Delete|grouping| 1| 4895.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|grouping| 2| 9596.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|grouping| 3| 14061.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4701| Delete|grouping| 4| 18762.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5028| Delete|grouping| 5| 23790.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2366| Delete|grouping| 6| 26156.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2038| Delete|grouping| 7| 28194.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4854| Delete|grouping| 8| 33048.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2796| Delete|grouping| 9| 35844.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4465| Delete|grouping| 10| 40309.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5264| Delete|grouping| 11| 45573.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4414| Delete|grouping| 12| 49987.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4721| Delete|grouping| 13| 54708.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|grouping| 14| 59654.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2376| Delete|grouping| 15| 62030.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|grouping| 16| 66915.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2622| Delete|grouping| 17| 69537.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|5039| Delete|grouping| 18| 74576.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4885| Delete|grouping| 19| 79461.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2499| Delete|grouping| 20| 81960.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4875| Delete|grouping| 21| 86835.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4742| Delete|grouping| 22| 91577.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2468| Delete|grouping| 23| 94045.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4660| Delete|grouping| 24| 98705.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|4946| Delete|grouping| 25| 103651.0|
- |001801019445| 8.1.0|/myLogs/PCC-CORP-...|2335| Delete|grouping| 26| 105986.0|
- +------------+--------+--------------------+----+------------+--------+---+---------------+
- Selection criteria: 0 < accumulatedSize and accumulatedSize <= 1187840
- Processing from lowLimit:0 to highLimit:1187840 with StepLine:1187840
- Few logs can fit in this batch:26
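The `grouped`/`accumulatedSize` table and the selection line above suggest the batching pass is a running cumulative sum over file sizes: a file joins the batch while the running total stays within the high limit (here the available RAM figure, 1187840). A sketch under that reading, using the sizes printed in the table (the function name and structure are mine, inferred from the log, not the tool's actual code):

```python
def select_batch(sizes, low_limit, high_limit):
    """Accumulate file sizes in order; keep every file whose running
    total falls in (low_limit, high_limit], mirroring the logged
    criterion `low < accumulatedSize and accumulatedSize <= high`."""
    batch, total = [], 0.0
    for row, size in enumerate(sizes, start=1):
        total += size
        if low_limit < total <= high_limit:
            batch.append(row)
    return batch, total

# Sizes exactly as printed in the table above (units as logged).
sizes = [4895, 4701, 4465, 4701, 5028, 2366, 2038, 4854, 2796, 4465,
         5264, 4414, 4721, 4946, 2376, 4885, 2622, 5039, 4885, 2499,
         4875, 4742, 2468, 4660, 4946, 2335]
batch, total = select_batch(sizes, 0, 1187840)
# All 26 files fit in one batch: the final accumulatedSize, 105986.0,
# is well under the 1187840 limit -- matching "fit in this batch:26".
```

This also explains why the run still died on disk: the batch is sized against available RAM, while the actual failure was free space on the temporary volume, which the limit does not account for.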
- 8.1.0:/myLogs/PCC-CORP-PA1_traffic_2019_10_25_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_04_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_31_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_27_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_22_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_30_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_29_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_15_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_16_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_02_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_21_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_18_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_20_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_12_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_03_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_01_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_17_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_14_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_11_05_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_26_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_13_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_11_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_24_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_28_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_19_last_calendar_day.csv,/myLogs/PCC-CORP-PA1_traffic_2019_10_23_last_calendar_day.csv
- Logs of format 7.1.x NOT found
- Logs of format 8.0.2 NOT found
- Logs of format 8.1.0-beta17 NOT found
- Logs of format 8.1.0 found
- Logs of format 9.0.0 NOT found
- Logs of format 9.1.0-beta NOT found
- expedition@Expedition:~$