customRDD.count
15/03/06 23:02:32 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
15/03/06 23:02:32 ERROR TaskSetManager: Failed to serialize task 0, not attempting to retry it.
java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.serializer.SerializationDebugger$ObjectStreamClassMethods$.getObjFieldValues$extension(SerializationDebugger.scala:240)
        ...
Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
        at java.io.ObjectStreamClass$FieldReflector.getObjFieldValues(ObjectStreamClass.java:2050)
        at java.io.ObjectStreamClass.getObjFieldValues(ObjectStreamClass.java:1252)
        ... 45 more
import custom.rdd.stuff
import org.apache.spark.SparkContext

val conf = sc.getConf
conf.set("custom", "parameters")  // placeholder key/value; SparkConf.set takes two Strings
sc.stop()
val sc2 = new SparkContext(conf)
val mapOfThings: Map[String, String] = ...
val myRdd = customRDD(sc2, mapOfThings)
myRdd.count
... (exception output) ...
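The "Failed to serialize task" error above usually means some field of the custom RDD, or a value captured in its closure, is not Java-serializable. One way to narrow that down before handing the object to Spark is to try serializing each suspect value directly; `SerializableCheck` below is a hypothetical diagnostic helper using only the JDK, not part of Spark or the `custom.rdd.stuff` package:

```scala
import java.io.{ByteArrayOutputStream, ObjectOutputStream, NotSerializableException}

object SerializableCheck {
  // Returns true if `obj` survives a round through Java serialization,
  // i.e. Spark's default closure/task serializer could ship it to executors.
  def isSerializable(obj: AnyRef): Boolean =
    try {
      val oos = new ObjectOutputStream(new ByteArrayOutputStream())
      oos.writeObject(obj)
      oos.close()
      true
    } catch {
      case _: NotSerializableException => false
    }
}
```

For example, `SerializableCheck.isSerializable(mapOfThings)` should be true for an immutable `Map[String, String]`, while a field holding something like a raw `Object` or an open connection would come back false and is a likely culprit.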