Untitled
a guest
Mar 6th, 2015

customRDD.count

15/03/06 23:02:32 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
15/03/06 23:02:32 ERROR TaskSetManager: Failed to serialize task 0, not attempting to retry it.
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.serializer.SerializationDebugger$ObjectStreamClassMethods$.getObjFieldValues$extension(SerializationDebugger.scala:240)

...

Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
    at java.io.ObjectStreamClass$FieldReflector.getObjFieldValues(ObjectStreamClass.java:2050)
    at java.io.ObjectStreamClass.getObjFieldValues(ObjectStreamClass.java:1252)
    ... 45 more

import custom.rdd.stuff
import org.apache.spark.SparkContext

// Rebuild the SparkContext with the custom configuration,
// then construct and count the custom RDD.
val conf = sc.getConf
conf.set(custom, parameters)  // placeholder key/value
sc.stop()
val sc2 = new SparkContext(conf)
val mapOfThings: Map[String, String] = ...
val myRdd = customRDD(sc2, mapOfThings)
myRdd.count

... (exception output) ...
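The ArrayIndexOutOfBoundsException above is thrown inside Spark's SerializationDebugger, which crashed while inspecting the unserializable task instead of reporting the underlying serialization failure (a known issue in Spark 1.x). A minimal sketch of one possible workaround, assuming a Spark version that honors the `spark.serializer.extraDebugInfo` setting, is to switch the debugger off so the original exception surfaces; the app name here is illustrative:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch, not a definitive fix: with extra serialization debug info
// disabled, Spark skips SerializationDebugger and reports the original
// serialization exception directly. The config key is assumed to be
// supported by the Spark version in use.
val conf = new SparkConf()
  .setAppName("customRDD-debug")                      // hypothetical name
  .set("spark.serializer.extraDebugInfo", "false")
val sc = new SparkContext(conf)
```

Setting the JVM property `-Dsun.io.serialization.extendedDebugInfo=true` on the driver is reported to have a similar effect, since SerializationDebugger steps aside when the JDK's own serialization debugging is enabled.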