Untitled
Pasted by a guest on Oct 18th, 2017
ERROR: [pid 57194] Worker Worker(salt=783345934, workers=1, host=dev-03.jiwiredev.com, username=lblokhin, pid=48037, sudo_user=root) failed LciExtraction(campaign=21275, stat_date=2016-11-06, output_base_path=maprfs://mapr5/opt/lci/lblokhin, job_name=foo52, tvread=, saveastext=, tvwrite=)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/luigi/worker.py", line 191, in run
    new_deps = self._run_get_new_deps()
  File "/usr/local/lib/python2.7/dist-packages/luigi/worker.py", line 129, in _run_get_new_deps
    task_gen = self.task.run()
  File "/code/ndlci/luigi/tasks/spark.py", line 491, in run
    spark_run_result = self.spark_run()
  File "/code/ndlci/luigi/lci/extraction.py", line 529, in spark_run
    results, expander_names = self._do_spark_run()
  File "/code/ndlci/luigi/lci/extraction.py", line 1214, in _do_spark_run
    users_count = retry_cldb(Users.count)
  File "/code/ndlci/luigi/targets/hdfs.py", line 30, in retry_cldb
    return f(*args, **kwargs)
  File "/opt/spark/current/python/pyspark/rdd.py", line 1008, in count
    return self.mapPartitions(lambda i: [sum(1 for _ in i)]).sum()
  File "/opt/spark/current/python/pyspark/rdd.py", line 999, in sum
    return self.mapPartitions(lambda x: [sum(x)]).fold(0, operator.add)
  File "/opt/spark/current/python/pyspark/rdd.py", line 873, in fold
    vals = self.mapPartitions(func).collect()
  File "/opt/spark/current/python/pyspark/rdd.py", line 776, in collect
    port = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
  File "/opt/spark/current/python/lib/py4j-0.10.3-src.zip/py4j/java_gateway.py", line 1133, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/opt/spark/current/python/pyspark/sql/utils.py", line 63, in deco
    return f(*a, **kw)
  File "/opt/spark/current/python/lib/py4j-0.10.3-src.zip/py4j/protocol.py", line 319, in get_return_value
    format(target_id, ".", name), value)
Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 23 in stage 10.0 failed 5 times, most recent failure: Lost task 23.4 in stage 10.0 (TID 6328, mapr5-209.jiwiredev.com): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/opt/spark/current/python/lib/pyspark.zip/pyspark/worker.py", line 159, in main
    func, profiler, deserializer, serializer = read_udfs(pickleSer, infile)
  File "/opt/spark/current/python/lib/pyspark.zip/pyspark/worker.py", line 97, in read_udfs
    arg_offsets, udf = read_single_udf(pickleSer, infile)
  File "/opt/spark/current/python/lib/pyspark.zip/pyspark/worker.py", line 78, in read_single_udf
    f, return_type = read_command(pickleSer, infile)
  File "/opt/spark/current/python/lib/pyspark.zip/pyspark/worker.py", line 54, in read_command
    command = serializer._read_with_length(file)
  File "/opt/spark/current/python/lib/pyspark.zip/pyspark/serializers.py", line 164, in _read_with_length
    return self.loads(obj)
  File "/opt/spark/current/python/lib/pyspark.zip/pyspark/serializers.py", line 422, in loads
    return pickle.loads(obj)
  File "build/bdist.linux-x86_64/egg/ndlci/luigi/lci/__init__.py", line 10, in <module>
  File "build/bdist.linux-x86_64/egg/ndlci/luigi/lci/pca.py", line 9, in <module>
ImportError: No module named jiwire_reports.engines.mysql_db

	at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:193)
	at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:234)
	at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:152)
	at org.apache.spark.sql.execution.python.BatchEvalPythonExec$$anonfun$doExecute$1.apply(BatchEvalPythonExec.scala:124)
	at org.apache.spark.sql.execution.python.BatchEvalPythonExec$$anonfun$doExecute$1.apply(BatchEvalPythonExec.scala:68)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.api.python.PairwiseRDD.compute(PythonRDD.scala:390)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
	at org.apache.spark.scheduler.Task.run(Task.scala:86)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
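Note on the root cause: the inner `ImportError: No module named jiwire_reports.engines.mysql_db` is raised inside `pickle.loads` on a Spark *executor* while it unpickles the shipped function, so the `jiwire_reports` package is importable on the driver but not on the worker nodes. A common fix is to bundle the package into a zip and ship it with the job via `spark-submit --py-files jiwire_reports.zip` or `sc.addPyFile(...)`. The sketch below is a minimal, hypothetical helper (the `zip_package` name and paths are illustrative, not from the job above) that builds such a zip so that `import jiwire_reports.engines.mysql_db` resolves on the executors:

```python
import os
import zipfile


def zip_package(pkg_dir, zip_path):
    """Bundle a Python package directory into a zip importable by Spark
    executors (the format expected by --py-files / sc.addPyFile)."""
    # Store entries relative to the package's parent directory so the
    # top-level package name survives, e.g. "jiwire_reports/engines/mysql_db.py".
    root = os.path.dirname(os.path.abspath(pkg_dir))
    with zipfile.ZipFile(zip_path, "w") as zf:
        for dirpath, _, filenames in os.walk(pkg_dir):
            for name in filenames:
                if name.endswith(".py"):
                    full = os.path.join(dirpath, name)
                    zf.write(full, os.path.relpath(full, root))


# Then, on the driver (illustrative usage, not runnable without a cluster):
#   zip_package("/code/jiwire_reports", "/tmp/jiwire_reports.zip")
#   sc.addPyFile("/tmp/jiwire_reports.zip")
# or pass the zip at submit time:
#   spark-submit --py-files /tmp/jiwire_reports.zip job.py
```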