- Log Type: directory.info
- Log Upload Time: Thu Jun 27 13:45:17 +0300 2019
- Log Length: 36026
- Showing 4096 bytes of 36026 total (truncated).
- 2018 ./__spark_libs__/xbean-asm5-shaded-4.4.jar
- 37438188 384 -r-xr-xr-x 1 yarn hadoop 390733 Dec 6 2018 ./__spark_libs__/parquet-format-2.3.1.jar
- 37438189 788 -r-xr-xr-x 1 yarn hadoop 805046 Dec 6 2018 ./__spark_libs__/zookeeper-3.4.6.3.1.0.0-78.jar
- 37438190 248 -r-xr-xr-x 1 yarn hadoop 250989 Dec 6 2018 ./__spark_libs__/parquet-hadoop-1.8.3.jar
- 37438191 120 -r-xr-xr-x 1 yarn hadoop 120512 Dec 6 2018 ./__spark_libs__/spark-repl_2.11-2.3.2.3.1.0.0-78.jar
- 37438192 2732 -r-xr-xr-x 1 yarn hadoop 2796935 Dec 6 2018 ./__spark_libs__/parquet-hadoop-bundle-1.6.0.jar
- 37438193 2280 -r-xr-xr-x 1 yarn hadoop 2333186 Dec 6 2018 ./__spark_libs__/zstd-jni-1.3.2-2.jar
- 37438194 1024 -r-xr-xr-x 1 yarn hadoop 1048115 Dec 6 2018 ./__spark_libs__/parquet-jackson-1.8.3.jar
- 37438195 524 -r-xr-xr-x 1 yarn hadoop 533455 Dec 6 2018 ./__spark_libs__/protobuf-java-2.5.0.jar
- 37438196 120 -r-xr-xr-x 1 yarn hadoop 122774 Dec 6 2018 ./__spark_libs__/py4j-0.10.7.jar
- 37438197 96 -r-xr-xr-x 1 yarn hadoop 94796 Dec 6 2018 ./__spark_libs__/pyrolite-4.13.jar
- 37438198 15128 -r-xr-xr-x 1 yarn hadoop 15487351 Dec 6 2018 ./__spark_libs__/scala-compiler-2.11.8.jar
- 37438199 5612 -r-xr-xr-x 1 yarn hadoop 5744974 Dec 6 2018 ./__spark_libs__/scala-library-2.11.8.jar
- 37438200 1776 -r-xr-xr-x 1 yarn hadoop 1818085 Dec 6 2018 ./__spark_libs__/spark-hive-thriftserver_2.11-2.3.2.3.1.0.0-78.jar
- 37438201 416 -r-xr-xr-x 1 yarn hadoop 423753 Dec 6 2018 ./__spark_libs__/scala-parser-combinators_2.11-1.0.4.jar
- 37438202 4468 -r-xr-xr-x 1 yarn hadoop 4573750 Dec 6 2018 ./__spark_libs__/scala-reflect-2.11.8.jar
- 37438203 656 -r-xr-xr-x 1 yarn hadoop 671138 Dec 6 2018 ./__spark_libs__/scala-xml_2.11-1.0.5.jar
- 37438204 788 -r-xr-xr-x 1 yarn hadoop 802818 Dec 6 2018 ./__spark_libs__/scalap-2.11.8.jar
- 37438205 3444 -r-xr-xr-x 1 yarn hadoop 3522616 Dec 6 2018 ./__spark_libs__/shapeless_2.11-2.3.2.jar
- 37438206 40 -r-xr-xr-x 1 yarn hadoop 40509 Dec 6 2018 ./__spark_libs__/slf4j-api-1.7.16.jar
- 37438207 12 -r-xr-xr-x 1 yarn hadoop 9939 Dec 6 2018 ./__spark_libs__/slf4j-log4j12-1.7.16.jar
- 37447136 48 -r-xr-xr-x 1 yarn hadoop 48720 Dec 6 2018 ./__spark_libs__/snappy-0.2.jar
- 37447137 1032 -r-xr-xr-x 1 yarn hadoop 1056168 Dec 6 2018 ./__spark_libs__/snappy-java-1.1.2.6.jar
- 37447138 1308 -r-xr-xr-x 1 yarn hadoop 1335820 Dec 6 2018 ./__spark_libs__/spark-hive_2.11-2.3.2.3.1.0.0-78.jar
- 37447139 8812 -r-xr-xr-x 1 yarn hadoop 9021968 Dec 6 2018 ./__spark_libs__/spark-catalyst_2.11-2.3.2.3.1.0.0-78.jar
- 37447140 52 -r-xr-xr-x 1 yarn hadoop 53042 Dec 6 2018 ./__spark_libs__/spark-kvstore_2.11-2.3.2.3.1.0.0-78.jar
- 37447141 12828 -r-xr-xr-x 1 yarn hadoop 13134090 Dec 6 2018 ./__spark_libs__/spark-core_2.11-2.3.2.3.1.0.0-78.jar
- 37447142 80 -r-xr-xr-x 1 yarn hadoop 80174 Dec 6 2018 ./__spark_libs__/spark-launcher_2.11-2.3.2.3.1.0.0-78.jar
- 37447143 696 -r-xr-xr-x 1 yarn hadoop 708861 Dec 6 2018 ./__spark_libs__/spark-graphx_2.11-2.3.2.3.1.0.0-78.jar
- 37447144 504 -r-xr-xr-x 1 yarn hadoop 515306 Dec 6 2018 ./__spark_libs__/spark-hadoop-cloud_2.11-2.3.2.3.1.0.0-78.jar
- 37447145 2328 -r-xr-xr-x 1 yarn hadoop 2381972 Dec 6 2018 ./__spark_libs__/spark-network-common_2.11-2.3.2.3.1.0.0-78.jar
- 37447146 68 -r-xr-xr-x 1 yarn hadoop 67640 Dec 6 2018 ./__spark_libs__/spark-network-shuffle_2.11-2.3.2.3.1.0.0-78.jar
- 37447147 32 -r-xr-xr-x 1 yarn hadoop 30094 Dec 6 2018 ./__spark_libs__/spark-sketch_2.11-2.3.2.3.1.0.0-78.jar
- 37447148 2124 -r-xr-xr-x 1 yarn hadoop 2171189 Dec 6 2018 ./__spark_libs__/spark-streaming_2.11-2.3.2.3.1.0.0-78.jar
- 37447149 48 -r-xr-xr-x 1 yarn hadoop 48971 Dec 6 2018 ./__spark_libs__/spark-unsafe_2.11-2.3.2.3.1.0.0-78.jar
- broken symlinks(find -L . -maxdepth 5 -type l -ls):
- Log Type: launch_container.sh
- Log Upload Time: Thu Jun 27 13:45:17 +0300 2019
- Log Length: 5949
- Showing 4096 bytes of 5949 total (truncated).
- rt NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="
- export NM_AUX_SERVICE_spark2_shuffle=""
- export SPARK_YARN_STAGING_DIR="hdfs://testcluster/user/ngorodnov/.sparkStaging/application_1560759961486_0455"
- export APP_SUBMIT_TIME_ENV="1561631756494"
- export TIMELINE_FLOW_NAME_TAG="ru.croc.rosbank.cri.system.Main"
- export TIMELINE_FLOW_VERSION_TAG="1"
- export PYTHONHASHSEED="0"
- export APPLICATION_WEB_PROXY_BASE="/proxy/application_1560759961486_0455"
- export CLASSPATH="$PWD:$PWD/__spark_conf__:$PWD/__spark_libs__/*:$HADOOP_CONF_DIR:/usr/hdp/3.1.0.0-78/hadoop/*:/usr/hdp/3.1.0.0-78/hadoop/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/3.1.0.0-78/hadoop/lib/hadoop-lzo-0.6.0.3.1.0.0-78.jar:/etc/hadoop/conf/secure:$PWD/__spark_conf__/__hadoop_conf__"
- export SPARK_USER="ngorodnov"
- export TIMELINE_FLOW_RUN_ID_TAG="1561631756495"
- echo "Setting up job resources"
- ln -sf "/hadoop/yarn/local/usercache/ngorodnov/filecache/172/agg_calculator-1.0.0-jar-with-dependencies.jar" "__app__.jar"
- ln -sf "/hadoop/yarn/local/usercache/ngorodnov/filecache/171/__spark_conf__.zip" "__spark_conf__"
- ln -sf "/hadoop/yarn/local/filecache/421/adj_calc.conf" "adj_calc.conf"
- ln -sf "/hadoop/yarn/local/filecache/11/spark2-hdp-hive-archive.tar.gz" "__hive_libs__"
- ln -sf "/hadoop/yarn/local/filecache/10/spark2-hdp-yarn-archive.tar.gz" "__spark_libs__"
- echo "Copying debugging information"
- # Creating copy of launch script
- cp "launch_container.sh" "/hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001/launch_container.sh"
- chmod 640 "/hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001/launch_container.sh"
- # Determining directory contents
- echo "ls -l:" 1>"/hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001/directory.info"
- ls -l 1>>"/hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001/directory.info"
- echo "find -L . -maxdepth 5 -ls:" 1>>"/hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001/directory.info"
- find -L . -maxdepth 5 -ls 1>>"/hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001/directory.info"
- echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001/directory.info"
- find -L . -maxdepth 5 -type l -ls 1>>"/hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001/directory.info"
- echo "Launching container"
- exec /bin/bash -c "LD_LIBRARY_PATH="/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH" $JAVA_HOME/bin/java -server -Xmx1024m -Djava.io.tmpdir=$PWD/tmp -Dhdp.version=3.1.0.0-78 -Dspark.yarn.app.container.log.dir=/hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001 org.apache.spark.deploy.yarn.ApplicationMaster --class 'ru.croc.rosbank.cri.system.Main' --jar file:/home/ngorodnov/MY_JAR/agg_calculator-1.0.0-jar-with-dependencies.jar --arg 'BRANCH_ID=0000' --arg 'BUSINESS_DATE=2019-02-14 21:00:00.0 ' --arg 'BATCH_ID=1' --arg 'DB_NAME=cri' --arg 'CONFIG=adj_calc.conf' --properties-file $PWD/__spark_conf__/__spark_conf__.properties 1> /hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001/stdout 2> /hadoop/yarn/log/application_1560759961486_0455/container_e41_1560759961486_0455_01_000001/stderr"
- Log Type: prelaunch.err
- Log Upload Time: Thu Jun 27 13:45:17 +0300 2019
- Log Length: 0
- Log Type: prelaunch.out
- Log Upload Time: Thu Jun 27 13:45:17 +0300 2019
- Log Length: 100
- Setting up env variables
- Setting up job resources
- Copying debugging information
- Launching container
- Log Type: stderr
- Log Upload Time: Thu Jun 27 13:45:17 +0300 2019
- Log Length: 3408511
- Showing 4096 bytes of 3408511 total (truncated).
- erated in 12.868744 ms
- 19/06/27 13:40:25 INFO CodeGenerator: Code generated in 10.571408 ms
- 19/06/27 13:40:26 INFO CodeGenerator: Code generated in 22.256257 ms
- 19/06/27 13:40:33 ERROR ApplicationMaster: User class threw exception: java.lang.NullPointerException
- java.lang.NullPointerException
- at org.apache.spark.sql.Dataset$$anonfun$33.apply(Dataset.scala:2195)
- at org.apache.spark.sql.Dataset$$anonfun$33.apply(Dataset.scala:2195)
- at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
- at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
- at scala.collection.immutable.Map$Map1.foreach(Map.scala:116)
- at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
- at scala.collection.AbstractTraversable.map(Traversable.scala:104)
- at org.apache.spark.sql.Dataset.withColumns(Dataset.scala:2195)
- at org.apache.spark.sql.Dataset.withColumn(Dataset.scala:2164)
- at ru.croc.rosbank.cri.system.Main$.main(Main.scala:37)
- at ru.croc.rosbank.cri.system.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
- 19/06/27 13:40:33 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.NullPointerException
- at org.apache.spark.sql.Dataset$$anonfun$33.apply(Dataset.scala:2195)
- at org.apache.spark.sql.Dataset$$anonfun$33.apply(Dataset.scala:2195)
- at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
- at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
- at scala.collection.immutable.Map$Map1.foreach(Map.scala:116)
- at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
- at scala.collection.AbstractTraversable.map(Traversable.scala:104)
- at org.apache.spark.sql.Dataset.withColumns(Dataset.scala:2195)
- at org.apache.spark.sql.Dataset.withColumn(Dataset.scala:2164)
- at ru.croc.rosbank.cri.system.Main$.main(Main.scala:37)
- at ru.croc.rosbank.cri.system.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
- )
- 19/06/27 13:40:33 INFO SparkContext: Invoking stop() from shutdown hook
- 19/06/27 13:40:33 INFO AbstractConnector: Stopped Spark@1ccedba9{HTTP/1.1,[http/1.1]}{0.0.0.0:0}
- 19/06/27 13:40:33 INFO SparkUI: Stopped Spark web UI at http://ds03.localdomain:43771
- 19/06/27 13:40:33 INFO YarnAllocator: Driver requested a total number of 0 executor(s).
- 19/06/27 13:40:33 INFO YarnClusterSchedulerBackend: Shutting down all executors
- 19/06/27 13:40:33 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
- 19/06/27 13:40:33 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
- (serviceOption=None,
- services=List(),
- started=false)
- 19/06/27 13:40:33 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
- 19/06/27 13:40:33 INFO MemoryStore: MemoryStore cleared
- 19/06/27 13:40:33 INFO BlockManager: BlockManager stopped
- 19/06/27 13:40:33 INFO BlockManagerMaster: BlockManagerMaster stopped
- 19/06/27 13:40:33 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
- 19/06/27 13:40:33 INFO SparkContext: Successfully stopped SparkContext
- 19/06/27 13:40:33 INFO ShutdownHookManager: Shutdown hook called
- 19/06/27 13:40:33 INFO ShutdownHookManager: Deleting directory /hadoop/yarn/local/usercache/ngorodnov/appcache/application_1560759961486_0455/spark-545f870a-ad93-49e1-8855-ef7690991893
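The stderr trace above fails inside `Dataset.withColumn` (called from `Main.scala:37`), with the NPE surfacing in the internal `withColumns` map at `Dataset.scala:2195`. One plausible cause in Spark 2.3 is passing a `null` `Column` to `withColumn` -- for example, a config-driven lookup that missed a key and returned `null`. The following is a minimal sketch of that failure mode, not the actual application code; all names (`NpeRepro`, `columnExprs`, the sample data) are hypothetical, and the config-map framing is only an assumption suggested by the `adj_calc.conf` argument in the launch command:

```scala
import org.apache.spark.sql.{Column, SparkSession}

object NpeRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("withColumn-npe-sketch")
      .getOrCreate()
    import spark.implicits._

    val df = Seq((1, "a"), (2, "b")).toDF("id", "label")

    // Hypothetical lookup that can miss and yield null -- stands in for a
    // config-driven column map with a missing key.
    val columnExprs: Map[String, Column] = Map("id_plus_one" -> ($"id" + 1))
    val missing: Column = columnExprs.getOrElse("no_such_key", null)

    // Spark 2.3's Dataset.withColumns maps over the supplied Columns; a null
    // Column dereferenced inside that map throws java.lang.NullPointerException,
    // matching the Dataset.scala:2195 frame in the trace above.
    df.withColumn("derived", missing).show()

    spark.stop()
  }
}
```

If this is the cause, guarding the lookup (e.g. `Option(...)` or `require(col != null, "missing column expression")`) before the `withColumn` call at `Main.scala:37` would turn the opaque NPE into an actionable error message.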
- Log Type: stdout
- Log Upload Time: Thu Jun 27 13:45:17 +0300 2019
- Log Length: 193
- INFO: Configuration was loaded from file: adj_calc.conf
- adj_cri.
- CRILIST_ACC IS LOADED
- Debug
- ==============================
- Debug step12FinalTransactionsTemplate
- ==============================
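Each file above is truncated to 4,096 bytes by the log viewer (stderr alone is 3.4 MB). Assuming YARN log aggregation is enabled on the cluster (it evidently is, given the upload timestamps), the complete logs can be pulled with the `yarn logs` CLI:

```shell
# Fetch all aggregated logs for the application
yarn logs -applicationId application_1560759961486_0455 > app_0455.log

# Or only the stderr of the ApplicationMaster container shown above
yarn logs -applicationId application_1560759961486_0455 \
  -containerId container_e41_1560759961486_0455_01_000001 \
  -log_files stderr
```

The `-log_files` filter is available on Hadoop 3.x (this cluster runs HDP 3.1.0.0-78); on older clusters, omit it and grep the combined output.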