Untitled

a guest
Jul 29th, 2016
Input data (one pair of ids per line):

id1, id2
id2, id3
...
idnnnn, idnnnn+1

joinData = graphTuples.join(listIds)
Job aborted due to stage failure: Task 351 in stage 2.0 failed 4 times, most recent failure: Lost task 351.3 in stage 2.0 (TID 476, c536.ant-net): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "/data/3/tmp/hadoop-hadoop/nm-local-dir/usercache/user/appcache/application_1468851295159_0020/container_1468851295159_0020_01_000016/pyspark.zip/pyspark/worker.py", line 111, in main
process()
File "/data/3/tmp/hadoop-hadoop/nm-local-dir/usercache/user/appcache/application_1468851295159_0020/container_1468851295159_0020_01_000016/pyspark.zip/pyspark/worker.py", line 106, in process
serializer.dump_stream(func(split_index, iterator), outfile)
File "/data/3/tmp/hadoop-hadoop/nm-local-dir/usercache/user/appcache/application_1468851295159_0020/container_1468851295159_0020_01_000016/pyspark.zip/pyspark/serializers.py", line 263, in dump_stream
vs = list(itertools.islice(iterator, batch))
File "/usr/local/spark/python/pyspark/rdd.py", line 1898, in <lambda>
IndexError: list index out of range

at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:166)
at org.apache.spark.api.python.PythonRunner$$anon$1.next(PythonRDD.scala:129)
at org.apache.spark.api.python.PythonRunner$$anon$1.next(PythonRDD.scala:125)
at org.apache.spark.InterruptibleIterator.next(InterruptibleIterator.scala:43)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:452)
at org.apache.spark.api.python.PythonRunner$WriterThread$$anonfun$run$3.apply(PythonRDD.scala:280)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1765)
at org.apache.spark.api.python.PythonRunner$WriterThread.run(PythonRDD.scala:239)

Driver stacktrace:
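A likely cause: RDD.join requires both sides to be (key, value) 2-tuples, and an IndexError inside a worker-side lambda usually means some element of graphTuples is not a 2-tuple, e.g. a malformed input line whose split produced fewer than two fields. Below is a minimal Spark-free sketch of the defensive pattern: parse each line into a pair, drop malformed lines instead of letting them raise later, then join. The names parse, inner_join, and the sample values are hypothetical stand-ins, not taken from the paste.

```python
def parse(line):
    # Split "id1, id2" into a (key, value) pair; return None for
    # malformed lines so they can be filtered out up front instead of
    # raising IndexError deep inside the job, as in the traceback above.
    parts = line.strip().split(",")
    if len(parts) < 2:
        return None
    return (parts[0].strip(), parts[1].strip())

def inner_join(pairs, lookup):
    # Mimic RDD.join semantics on plain Python lists of pairs:
    # keep only keys present on both sides, yielding (k, (v1, v2)).
    table = dict(lookup)
    return [(k, (v, table[k])) for k, v in pairs if k in table]

lines = ["id1, id2", "id2, id3", "malformed"]
graph_tuples = [p for p in (parse(l) for l in lines) if p is not None]
list_ids = [("id1", 100), ("id2", 200)]
print(inner_join(graph_tuples, list_ids))
# [('id1', ('id2', 100)), ('id2', ('id3', 200))]
```

In PySpark the same shape would be rdd.map(parse).filter(lambda kv: kv is not None) before calling .join, so bad records are dropped before the shuffle rather than failing a task.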