- work@sampo-think ~/workspace/wellmo-reporting $ java -jar target/scala-2.10/wellmo-reporting-assembly-1.0.jar interactive
- 14/03/06 11:13:02 WARN Utils: Your hostname, sampo-think resolves to a loopback address: 127.0.1.1; using 172.16.168.1 instead (on interface vmnet8)
- 14/03/06 11:13:02 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
- 14/03/06 11:13:03 INFO Slf4jLogger: Slf4jLogger started
- 14/03/06 11:13:03 INFO Remoting: Starting remoting
- 14/03/06 11:13:03 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@172.16.168.1:33053]
- 14/03/06 11:13:03 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@172.16.168.1:33053]
- 14/03/06 11:13:03 INFO SparkEnv: Registering BlockManagerMaster
- 14/03/06 11:13:03 DEBUG DiskBlockManager: Creating local directories at root dirs '/tmp'
- 14/03/06 11:13:03 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20140306111303-ae28
- 14/03/06 11:13:03 INFO MemoryStore: MemoryStore started with capacity 2.1 GB.
- 14/03/06 11:13:03 INFO ConnectionManager: Bound socket to port 43022 with id = ConnectionManagerId(172.16.168.1,43022)
- 14/03/06 11:13:03 INFO BlockManagerMaster: Trying to register BlockManager
- 14/03/06 11:13:03 INFO BlockManagerMasterActor$BlockManagerInfo: Registering block manager 172.16.168.1:43022 with 2.1 GB RAM
- 14/03/06 11:13:03 INFO BlockManagerMaster: Registered BlockManager
- 14/03/06 11:13:03 INFO HttpServer: Starting HTTP Server
- 14/03/06 11:13:03 INFO HttpBroadcast: Broadcast server started at http://172.16.168.1:56274
- 14/03/06 11:13:03 INFO SparkEnv: Registering MapOutputTracker
- 14/03/06 11:13:03 INFO HttpFileServer: HTTP File server directory is /tmp/spark-551a7446-963c-485c-911a-191c90ddf384
- 14/03/06 11:13:03 INFO HttpServer: Starting HTTP Server
- 14/03/06 11:13:03 INFO SparkUI: Started Spark Web UI at http://172.16.168.1:4040
- 14/03/06 11:13:03 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
- 14/03/06 11:13:03 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
- 14/03/06 11:13:03 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
- 14/03/06 11:13:03 DEBUG KerberosName: Kerberos krb5 configuration not found, setting default realm to empty
- 14/03/06 11:13:03 DEBUG Groups: Creating new Groups object
- 14/03/06 11:13:03 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
- 14/03/06 11:13:03 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
- 14/03/06 11:13:03 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
- 14/03/06 11:13:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- 14/03/06 11:13:03 DEBUG JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
- 14/03/06 11:13:03 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
- 14/03/06 11:13:03 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
- 14/03/06 11:13:04 INFO MemoryStore: ensureFreeSpace(140816) called with curMem=0, maxMem=2202324172
- 14/03/06 11:13:04 INFO MemoryStore: Block broadcast_0 stored as values to memory (estimated size 137.5 KB, free 2.1 GB)
- 14/03/06 11:13:04 DEBUG BlockManager: Put block broadcast_0 locally took 74 ms
- 14/03/06 11:13:04 DEBUG BlockManager: Put for block broadcast_0 without replication took 74 ms
- 14/03/06 11:13:04 INFO MemoryStore: ensureFreeSpace(140878) called with curMem=140816, maxMem=2202324172
- 14/03/06 11:13:04 INFO MemoryStore: Block broadcast_1 stored as values to memory (estimated size 137.6 KB, free 2.1 GB)
- 14/03/06 11:13:04 DEBUG BlockManager: Put block broadcast_1 locally took 12 ms
- 14/03/06 11:13:04 DEBUG BlockManager: Put for block broadcast_1 without replication took 12 ms
- 14/03/06 11:13:04 INFO MemoryStore: ensureFreeSpace(140878) called with curMem=281694, maxMem=2202324172
- 14/03/06 11:13:04 INFO MemoryStore: Block broadcast_2 stored as values to memory (estimated size 137.6 KB, free 2.1 GB)
- 14/03/06 11:13:04 DEBUG BlockManager: Put block broadcast_2 locally took 13 ms
- 14/03/06 11:13:04 DEBUG BlockManager: Put for block broadcast_2 without replication took 13 ms
- 14/03/06 11:13:04 DEBUG UserGroupInformation: hadoop login
- 14/03/06 11:13:04 DEBUG UserGroupInformation: hadoop login commit
- 14/03/06 11:13:04 DEBUG UserGroupInformation: using local user:UnixPrincipal: work
- 14/03/06 11:13:04 DEBUG UserGroupInformation: UGI loginUser:work (auth:SIMPLE)
- 14/03/06 11:13:04 INFO MongoInputFormat: Using com.mongodb.hadoop.splitter.StandaloneMongoSplitter@50e3fa1d to calculate splits.
- 14/03/06 11:13:04 INFO StandaloneMongoSplitter: Running splitvector to check splits against mongodb://127.0.0.1:27017/wellness.Data
- 14/03/06 11:13:04 WARN StandaloneMongoSplitter: WARNING: No Input Splits were calculated by the split code. Proceeding with a *single* split. Data may be too small, try lowering 'mongo.input.split_size' if this is undesirable.
- 14/03/06 11:13:04 INFO MongoCollectionSplitter: Created split: min=null, max= null
- 14/03/06 11:13:04 INFO MongoInputFormat: Using com.mongodb.hadoop.splitter.StandaloneMongoSplitter@4cdae43e to calculate splits.
- 14/03/06 11:13:04 INFO StandaloneMongoSplitter: Running splitvector to check splits against mongodb://127.0.0.1:27017/wellness.Profile
- 14/03/06 11:13:04 WARN StandaloneMongoSplitter: WARNING: No Input Splits were calculated by the split code. Proceeding with a *single* split. Data may be too small, try lowering 'mongo.input.split_size' if this is undesirable.
- 14/03/06 11:13:04 INFO MongoCollectionSplitter: Created split: min=null, max= null
- 14/03/06 11:13:04 INFO MongoInputFormat: Using com.mongodb.hadoop.splitter.StandaloneMongoSplitter@220bd7cb to calculate splits.
- 14/03/06 11:13:04 INFO StandaloneMongoSplitter: Running splitvector to check splits against mongodb://127.0.0.1:27017/wellness.Analytics
- 14/03/06 11:13:04 WARN StandaloneMongoSplitter: WARNING: No Input Splits were calculated by the split code. Proceeding with a *single* split. Data may be too small, try lowering 'mongo.input.split_size' if this is undesirable.
- 14/03/06 11:13:04 INFO MongoCollectionSplitter: Created split: min=null, max= null
- 14/03/06 11:13:04 DEBUG CoGroupedRDD: Adding shuffle dependency with MappedRDD[7] at map at Mongo.scala:30
- 14/03/06 11:13:04 DEBUG CoGroupedRDD: Adding shuffle dependency with MappedRDD[8] at map at Mongo.scala:32
- 14/03/06 11:13:04 DEBUG CoGroupedRDD: Adding shuffle dependency with MappedRDD[7] at map at Mongo.scala:30
- 14/03/06 11:13:04 DEBUG CoGroupedRDD: Adding shuffle dependency with MappedRDD[13] at map at Mongo.scala:33
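The StandaloneMongoSplitter warnings above come from the mongo-hadoop input configuration: with the default split size the splitter produced no input splits and fell back to a single split per collection. A minimal sketch of how such an input is typically wired up and how 'mongo.input.split_size' can be lowered (assuming the mongo-hadoop connector with Spark's newAPIHadoopRDD call as used in this era; the URI and split size values below are illustrative, not taken from this run):

    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.SparkContext
    import org.bson.BSONObject
    import com.mongodb.hadoop.MongoInputFormat

    val sc = new SparkContext("local", "wellmo-reporting")

    // Key names are the ones mentioned in the log; values are placeholders.
    val mongoConf = new Configuration()
    mongoConf.set("mongo.input.uri", "mongodb://127.0.0.1:27017/wellness.Data")
    mongoConf.set("mongo.input.split_size", "4") // split size (megabytes in this connector); lower it to get more splits

    val dataRdd = sc.newAPIHadoopRDD(
      mongoConf,
      classOf[MongoInputFormat],
      classOf[Object],       // document _id
      classOf[BSONObject])   // document body

The CoGroupedRDD lines that follow indicate these inputs are then keyed and cogrouped in Mongo.scala (lines 30-33); that code is not part of this log.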
- Starting the interactive shell
- 14/03/06 11:13:04 INFO HttpServer: Starting HTTP Server
- [running phase parser on <init>]
- [running phase namer on <init>]
- [running phase packageobjects on <init>]
- [running phase typer on <init>]
- [running phase patmat on <init>]
- [running phase repl on <init>]
- [running phase superaccessors on <init>]
- [running phase extmethods on <init>]
- [running phase pickler on <init>]
- [running phase refchecks on <init>]
- [running phase uncurry on <init>]
- [running phase tailcalls on <init>]
- [running phase specialize on <init>]
- [running phase explicitouter on <init>]
- [running phase erasure on <init>]
- [running phase posterasure on <init>]
- [running phase lazyvals on <init>]
- [running phase lambdalift on <init>]
- [running phase constructors on <init>]
- [running phase flatten on <init>]
- [running phase mixin on <init>]
- [running phase cleanup on <init>]
- [running phase icode on <init>]
- [running phase inliner on <init>]
- [running phase inlineExceptionHandlers on <init>]
- [running phase closelim on <init>]
- [running phase dce on <init>]
- [running phase jvm on icode]
- 14/03/06 11:13:06 DEBUG Repl$$anon$1: Clearing 6 thunks.
- 14/03/06 11:13:06 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- var value: org.apache.spark.repl.SparkIMain = _
- def set(x: Any) = value = x.asInstanceOf[org.apache.spark.repl.SparkIMain]
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- <mutable> <defaultinit> var value: org.apache.spark.repl.SparkIMain = _;
- def set(x: Any) = value = x.asInstanceOf[org.apache.spark.repl.SparkIMain]
- }))
- 14/03/06 11:13:06 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- <mutable> <defaultinit> var value: org.apache.spark.repl.SparkIMain = _;
- def set(x: Any) = value = x.asInstanceOf[org.apache.spark.repl.SparkIMain]
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static void $line1.$eval.set(java.lang.Object)
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: with args: org.apache.spark.repl.SparkILoop$SparkILoopInterpreter@6afd37be
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: Interpreting: val $intp = $line1.$eval.value
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val $intp = $line1.$eval.value
- ") Some(List(val $intp = $line1.$eval.value))
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
- 11: TypeTree
- 32: Select
- 26: Select
- 19: Ident
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- val $intp = $line1.$eval.value
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- val $intp = $line1.$eval.value
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- val $intp = $line1.$eval.value
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of $intp to <method> <stable> <accessor> val $intp(): org.apache.spark.repl.SparkIMain
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line2.$read.INSTANCE.$iw.$iw.`$intp`
- val $print: String = {
- $read.INSTANCE.$iw.$iw
- (""
- + "$intp: repl.this.SparkIMain = " + scala.runtime.ScalaRunTime.replStringOf($line2.$read.INSTANCE.$iw.$iw.`$intp`, 1000)
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line2.$read.INSTANCE.$iw.$iw.$intp;
- val $print: String = {
- $read.INSTANCE.$iw.$iw;
- "".$plus("$intp: repl.this.SparkIMain = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line2.$read.INSTANCE.$iw.$iw.$intp, 1000))
- }
- }))
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line2.$read.INSTANCE.$iw.$iw.$intp;
- val $print: String = {
- $read.INSTANCE.$iw.$iw;
- "".+("$intp: repl.this.SparkIMain = ").+(scala.runtime.ScalaRunTime.replStringOf($line2.$read.INSTANCE.$iw.$iw.$intp, 1000))
- }
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line2.$eval.$print()
- Welcome to
-       ____              __
-      / __/__  ___ _____/ /__
-     _\ \/ _ \/ _ `/ __/  '_/
-    /___/ .__/\_,_/_/ /_/\_\   version 0.9.0
-       /_/
- Using Scala version 2.10.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_51)
- Type in expressions to have them evaluated.
- Type :help for more information.
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- @transient val sc = org.apache.spark.repl.Main.interp.createSparkContext();
- ") Some(List(@new transient.<init>() val sc = org.apache.spark.repl.Main.interp.createSparkContext()))
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: 39: ValDef
- 25: Apply
- 25: Select
- 25: New
- 25: Ident
- 39: TypeTree
- 96: Apply
- 78: Select
- 71: Select
- 66: Select
- 61: Select
- 55: Select
- 48: Select
- 44: Ident
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- @transient val sc = org.apache.spark.repl.Main.interp.createSparkContext();
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- @new transient.<init>() val sc = org.apache.spark.repl.Main.interp.createSparkContext()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- @new transient.<init>() val sc = org.apache.spark.repl.Main.interp.createSparkContext
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of sc to <method> <stable> <accessor> val sc(): org.apache.spark.SparkContext
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line3.$read.INSTANCE.$iw.$iw.`sc`
- val $print: String = {
- $read.INSTANCE.$iw.$iw
- (""
- + "sc: spark.this.SparkContext = " + scala.runtime.ScalaRunTime.replStringOf($line3.$read.INSTANCE.$iw.$iw.`sc`, 1000)
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line3.$read.INSTANCE.$iw.$iw.sc;
- val $print: String = {
- $read.INSTANCE.$iw.$iw;
- "".$plus("sc: spark.this.SparkContext = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line3.$read.INSTANCE.$iw.$iw.sc, 1000))
- }
- }))
- 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line3.$read.INSTANCE.$iw.$iw.sc;
- val $print: String = {
- $read.INSTANCE.$iw.$iw;
- "".+("sc: spark.this.SparkContext = ").+(scala.runtime.ScalaRunTime.replStringOf($line3.$read.INSTANCE.$iw.$iw.sc, 1000))
- }
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line3.$eval.$print()
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- var value: java.lang.Throwable = _
- def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- <mutable> <defaultinit> var value: java.lang.Throwable = _;
- def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
- }))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- <mutable> <defaultinit> var value: java.lang.Throwable = _;
- def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static void $line4.$eval.set(java.lang.Object)
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: with args: java.lang.NullPointerException
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Interpreting: val lastException = $line4.$eval.value
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val lastException = $line4.$eval.value
- ") Some(List(val lastException = $line4.$eval.value))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
- 11: TypeTree
- 40: Select
- 34: Select
- 27: Ident
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- val lastException = $line4.$eval.value
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- val lastException = $line4.$eval.value
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- val lastException = $line4.$eval.value
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of lastException to <method> <stable> <accessor> val lastException(): Throwable
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line5.$read.INSTANCE.$iw.$iw.`lastException`
- val $print: String = {
- $read.INSTANCE.$iw.$iw
- (""
- + "lastException: lang.this.Throwable = " + scala.runtime.ScalaRunTime.replStringOf($line5.$read.INSTANCE.$iw.$iw.`lastException`, 1000)
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line5.$read.INSTANCE.$iw.$iw.lastException;
- val $print: String = {
- $read.INSTANCE.$iw.$iw;
- "".$plus("lastException: lang.this.Throwable = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line5.$read.INSTANCE.$iw.$iw.lastException, 1000))
- }
- }))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line5.$read.INSTANCE.$iw.$iw.lastException;
- val $print: String = {
- $read.INSTANCE.$iw.$iw;
- "".+("lastException: lang.this.Throwable = ").+(scala.runtime.ScalaRunTime.replStringOf($line5.$read.INSTANCE.$iw.$iw.lastException, 1000))
- }
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line5.$eval.$print()
- java.lang.NullPointerException
- at $iwC$$iwC.<init>(<console>:8)
- at $iwC.<init>(<console>:14)
- at <init>(<console>:16)
- at .<init>(<console>:20)
- at .<clinit>(<console>)
- at .<init>(<console>:7)
- at .<clinit>(<console>)
- at $print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:606)
- at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:772)
- at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1040)
- at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:609)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:640)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:604)
- at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:788)
- at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:833)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:745)
- at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:119)
- at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:118)
- at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:258)
- at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:118)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:53)
- at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:903)
- at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:140)
- at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:53)
- at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:102)
- at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:53)
- at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:920)
- at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:876)
- at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:876)
- at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
- at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:876)
- at com.wellmo.reporting.Repl$.run(Repl.scala:31)
- at com.wellmo.reporting.WellmoReporting$.run(WellmoReporting.scala:71)
- at com.wellmo.reporting.WellmoReporting$$anonfun$main$1.apply(WellmoReporting.scala:49)
- at com.wellmo.reporting.WellmoReporting$$anonfun$main$1.apply(WellmoReporting.scala:48)
- at scala.Option.map(Option.scala:145)
- at com.wellmo.reporting.WellmoReporting$.main(WellmoReporting.scala:48)
- at com.wellmo.reporting.WellmoReporting.main(WellmoReporting.scala)
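The trace above shows the NullPointerException is thrown while the embedded REPL's initializeSpark step interprets the generated line "@transient val sc = org.apache.spark.repl.Main.interp.createSparkContext()" (parsed a few screens earlier). When SparkILoop is embedded from application code, as Repl.scala does here, org.apache.spark.repl.Main.interp can still be null at that point, so the call NPEs even though a SparkContext is injected as sc later in this same log. A minimal sketch of the workaround sometimes used with the Spark 0.9-era repl (assuming Main.interp is assignable from the embedding code, which should be verified against this Spark version):

    import scala.tools.nsc.Settings
    import org.apache.spark.repl.SparkILoop

    val settings = new Settings
    settings.usejavacp.value = true

    val loop = new SparkILoop()
    // Assumption: registering the loop so that the generated
    // Main.interp.createSparkContext() call finds a non-null interpreter.
    org.apache.spark.repl.Main.interp = loop
    loop.process(settings)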
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" import org.apache.spark.SparkContext._
- ") Some(List(import org.apache.spark.SparkContext._))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 14: Import
- 31: Select
- 25: Select
- 18: Select
- 14: Ident
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- import org.apache.spark.SparkContext._
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- import org.apache.spark.SparkContext._
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- import org.apache.spark.SparkContext._
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- val $print: String = {
- $read.INSTANCE.$iw.$iw
- (""
- + "import org.apache.spark.SparkContext._" + "\u000A"
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val $print: String = {
- $read.INSTANCE.$iw.$iw;
- "".$plus("import org.apache.spark.SparkContext._").$plus("\n")
- }
- }))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val $print: String = {
- $read.INSTANCE.$iw.$iw;
- "".+("import org.apache.spark.SparkContext._").+("\n")
- }
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line6.$eval.$print()
- Spark context available as sc.
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- var value: org.apache.spark.SparkContext = _
- def set(x: Any) = value = x.asInstanceOf[org.apache.spark.SparkContext]
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- <mutable> <defaultinit> var value: org.apache.spark.SparkContext = _;
- def set(x: Any) = value = x.asInstanceOf[org.apache.spark.SparkContext]
- }))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- <mutable> <defaultinit> var value: org.apache.spark.SparkContext = _;
- def set(x: Any) = value = x.asInstanceOf[org.apache.spark.SparkContext]
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static void $line7.$eval.set(java.lang.Object)
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: with args: org.apache.spark.SparkContext@6345a68f
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Interpreting: val sc = $line7.$eval.value
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val sc = $line7.$eval.value
- ") Some(List(val sc = $line7.$eval.value))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
- 11: TypeTree
- 29: Select
- 23: Select
- 16: Ident
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" def $ires0 = {
- org.apache.spark.SparkContext
- }
- ") Some(List(def $ires0 = org.apache.spark.SparkContext))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 11: DefDef
- 11: TypeTree
- 46: Select
- 40: Select
- 33: Select
- 29: Ident
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" def $ires1 = {
- org.apache.spark.SparkContext
- }
- ") Some(List(def $ires1 = org.apache.spark.SparkContext))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 11: DefDef
- 11: TypeTree
- 46: Select
- 40: Select
- 33: Select
- 29: Ident
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" def $ires2 = {
- org.apache.spark.SparkContext
- }
- ") Some(List(def $ires2 = org.apache.spark.SparkContext))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 11: DefDef
- 11: TypeTree
- 46: Select
- 40: Select
- 33: Select
- 29: Ident
- 14/03/06 11:13:08 DEBUG SparkIMain$exprTyper: Terminating typeOfExpression recursion for expression: org.apache.spark.SparkContext
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- import org.apache.spark.SparkContext._
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- def $ires2 = {
- org.apache.spark.SparkContext
- }
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- def $ires2 = org.apache.spark.SparkContext
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- def $ires2 = org.apache.spark.SparkContext
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of $ires2 to <method> def $ires2(): org.apache.spark.SparkContext.type
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line11.$read.INSTANCE.$iw.$iw.$iw.$iw.`$ires2`
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw
- (""
- + "$ires2" + ": " + "<root>.this.org.apache.spark.SparkContext.type" + "\u000A"
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line11.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires2;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".$plus("$ires2").$plus(": ").$plus("<root>.this.org.apache.spark.SparkContext.type").$plus("\n")
- }
- }))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line11.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires2;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".+("$ires2").+(": ").+("<root>.this.org.apache.spark.SparkContext.type").+("\n")
- }
- }
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line11.$eval.$print()
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- import org.apache.spark.SparkContext._
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- def $ires1 = {
- org.apache.spark.SparkContext
- }
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- def $ires1 = org.apache.spark.SparkContext
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- def $ires1 = org.apache.spark.SparkContext
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of $ires1 to <method> def $ires1(): org.apache.spark.SparkContext.type
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line10.$read.INSTANCE.$iw.$iw.$iw.$iw.`$ires1`
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw
- (""
- + "$ires1" + ": " + "<root>.this.org.apache.spark.SparkContext.type" + "\u000A"
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line10.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires1;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".$plus("$ires1").$plus(": ").$plus("<root>.this.org.apache.spark.SparkContext.type").$plus("\n")
- }
- }))
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line10.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires1;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".+("$ires1").+(": ").+("<root>.this.org.apache.spark.SparkContext.type").+("\n")
- }
- }
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line10.$eval.$print()
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- import org.apache.spark.SparkContext._
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- def $ires0 = {
- org.apache.spark.SparkContext
- }
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- def $ires0 = org.apache.spark.SparkContext
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- def $ires0 = org.apache.spark.SparkContext
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of $ires0 to <method> def $ires0(): org.apache.spark.SparkContext.type
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line9.$read.INSTANCE.$iw.$iw.$iw.$iw.`$ires0`
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw
- (""
- + "$ires0" + ": " + "<root>.this.org.apache.spark.SparkContext.type" + "\u000A"
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line9.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires0;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".$plus("$ires0").$plus(": ").$plus("<root>.this.org.apache.spark.SparkContext.type").$plus("\n")
- }
- }))
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line9.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires0;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".+("$ires0").+(": ").+("<root>.this.org.apache.spark.SparkContext.type").+("\n")
- }
- }
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line9.$eval.$print()
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- import org.apache.spark.SparkContext._
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- val sc = $line7.$eval.value
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- val sc = $line7.$eval.value
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- val sc = $line7.$eval.value
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of sc to <method> <stable> <accessor> val sc(): org.apache.spark.SparkContext
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line8.$read.INSTANCE.$iw.$iw.$iw.$iw.`sc`
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw
- (""
- + "sc: spark.this.SparkContext = " + scala.runtime.ScalaRunTime.replStringOf($line8.$read.INSTANCE.$iw.$iw.$iw.$iw.`sc`, 1000)
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line8.$read.INSTANCE.$iw.$iw.$iw.$iw.sc;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".$plus("sc: spark.this.SparkContext = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line8.$read.INSTANCE.$iw.$iw.$iw.$iw.sc, 1000))
- }
- }))
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line8.$read.INSTANCE.$iw.$iw.$iw.$iw.sc;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".+("sc: spark.this.SparkContext = ").+(scala.runtime.ScalaRunTime.replStringOf($line8.$read.INSTANCE.$iw.$iw.$iw.$iw.sc, 1000))
- }
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line8.$eval.$print()
- sc: spark.this.SparkContext = org.apache.spark.SparkContext@6345a68f
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- var value: com.wellmo.reporting.WellmoData = _
- def set(x: Any) = value = x.asInstanceOf[com.wellmo.reporting.WellmoData]
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- <mutable> <defaultinit> var value: com.wellmo.reporting.WellmoData = _;
- def set(x: Any) = value = x.asInstanceOf[com.wellmo.reporting.WellmoData]
- }))
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- <mutable> <defaultinit> var value: com.wellmo.reporting.WellmoData = _;
- def set(x: Any) = value = x.asInstanceOf[com.wellmo.reporting.WellmoData]
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static void $line12.$eval.set(java.lang.Object)
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: with args: com.wellmo.reporting.WellmoData@68362b9b
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Interpreting: val data = $line12.$eval.value
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val data = $line12.$eval.value
- ") Some(List(val data = $line12.$eval.value))
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
- 11: TypeTree
- 32: Select
- 26: Select
- 18: Ident
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- import org.apache.spark.SparkContext._
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- val data = $line12.$eval.value
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- val data = $line12.$eval.value
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- val data = $line12.$eval.value
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of data to <method> <stable> <accessor> val data(): com.wellmo.reporting.WellmoData
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line13.$read.INSTANCE.$iw.$iw.$iw.$iw.`data`
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw
- (""
- + "data: reporting.this.WellmoData = " + scala.runtime.ScalaRunTime.replStringOf($line13.$read.INSTANCE.$iw.$iw.$iw.$iw.`data`, 1000)
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line13.$read.INSTANCE.$iw.$iw.$iw.$iw.data;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".$plus("data: reporting.this.WellmoData = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line13.$read.INSTANCE.$iw.$iw.$iw.$iw.data, 1000))
- }
- }))
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line13.$read.INSTANCE.$iw.$iw.$iw.$iw.data;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".+("data: reporting.this.WellmoData = ").+(scala.runtime.ScalaRunTime.replStringOf($line13.$read.INSTANCE.$iw.$iw.$iw.$iw.data, 1000))
- }
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line13.$eval.$print()
- data: reporting.this.WellmoData = com.wellmo.reporting.WellmoData@68362b9b
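Editor's note: the $eval { var value; def set(x: Any) } object and the nested $read/$iwC wrappers above are the interpreter's standard machinery for injecting a pre-built object into the session: the host application compiles a setter, invokes it reflectively with the object, then interprets `val data = $lineNN.$eval.value`. A minimal sketch of how the wellmo-reporting driver could do this through an embedded Spark interpreter follows; the bind(...) entry point mirrors the IMain-style API that produces exactly this wrapper output, but whether the project calls it this way is an assumption, and the WellmoData value is taken as already constructed.

    import org.apache.spark.SparkContext
    import org.apache.spark.repl.SparkIMain

    // Hypothetical sketch: binding prepared objects into an embedded Spark REPL.
    // bind(name, typeName, value) generates the $eval.set(...) wrapper seen in
    // the DEBUG output; the exact entry point used by wellmo-reporting is assumed.
    def bindSession(intp: SparkIMain, sc: SparkContext,
                    data: com.wellmo.reporting.WellmoData): Unit = {
      intp.beQuietDuring {
        intp.bind("sc", "org.apache.spark.SparkContext", sc)        // echoed as "sc: ... = ..."
        intp.bind("data", "com.wellmo.reporting.WellmoData", data)  // echoed as "data: ... = ..."
      }
    }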
- scala> data.profile.count()
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: parse(" data.profile.count()
- ") Some(List(data.profile.count()))
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: 25: Apply
- 20: Select
- 12: Select
- 7: Ident
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val res0 =
- data.profile.count()
- ") Some(List(val res0 = data.profile.count()))
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
- 11: TypeTree
- 50: Apply
- 45: Select
- 37: Select
- 32: Ident
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- import org.apache.spark.SparkContext._
- class $iwC extends Serializable {
- val $VAL1 = $line13.$read.INSTANCE;
- import $VAL1.$iw.$iw.$iw.$iw.`data`;
- class $iwC extends Serializable {
- val res0 =
- data.profile.count()
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- val $VAL1 = $line13.$read.INSTANCE;
- import $VAL1.$iw.$iw.$iw.$iw.data;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- val res0 = data.profile.count()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- val $VAL1 = $line13.$read.INSTANCE;
- import $VAL1.$iw.$iw.$iw.$iw.data;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- val res0 = data.profile.count
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of res0 to <method> <stable> <accessor> val res0(): Long
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line14.$read.INSTANCE.$iw.$iw.$iw.$iw.`res0`
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw
- (""
- + "res0: scala.this.Long = " + scala.runtime.ScalaRunTime.replStringOf($line14.$read.INSTANCE.$iw.$iw.$iw.$iw.`res0`, 1000)
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line14.$read.INSTANCE.$iw.$iw.$iw.$iw.res0;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".$plus("res0: scala.this.Long = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line14.$read.INSTANCE.$iw.$iw.$iw.$iw.res0, 1000))
- }
- }))
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line14.$read.INSTANCE.$iw.$iw.$iw.$iw.res0;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".+("res0: scala.this.Long = ").+(scala.runtime.ScalaRunTime.replStringOf($line14.$read.INSTANCE.$iw.$iw.$iw.$iw.res0, 1000))
- }
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line14.$eval.$print()
- 14/03/06 11:13:15 INFO SparkContext: Starting job: count at <console>:13
- 14/03/06 11:13:15 INFO DAGScheduler: Got job 0 (count at <console>:13) with 1 output partitions (allowLocal=false)
- 14/03/06 11:13:15 INFO DAGScheduler: Final stage: Stage 0 (count at <console>:13)
- 14/03/06 11:13:15 INFO DAGScheduler: Parents of final stage: List()
- 14/03/06 11:13:15 INFO DAGScheduler: Missing parents: List()
- 14/03/06 11:13:15 DEBUG DAGScheduler: submitStage(Stage 0)
- 14/03/06 11:13:15 DEBUG DAGScheduler: missing: List()
- 14/03/06 11:13:15 INFO DAGScheduler: Submitting Stage 0 (FilteredRDD[6] at filter at Mongo.scala:28), which has no missing parents
- 14/03/06 11:13:15 DEBUG DAGScheduler: submitMissingTasks(Stage 0)
- 14/03/06 11:13:15 INFO DAGScheduler: Submitting 1 missing tasks from Stage 0 (FilteredRDD[6] at filter at Mongo.scala:28)
- 14/03/06 11:13:15 DEBUG DAGScheduler: New pending tasks: Set(ResultTask(0, 0))
- 14/03/06 11:13:15 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
- 14/03/06 11:13:15 DEBUG TaskSetManager: Epoch for TaskSet 0.0: 0
- 14/03/06 11:13:15 DEBUG TaskSetManager: Valid locality levels for TaskSet 0.0: ANY
- 14/03/06 11:13:15 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
- 14/03/06 11:13:15 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
- 14/03/06 11:13:15 INFO TaskSetManager: Starting task 0.0:0 as TID 0 on executor localhost: localhost (PROCESS_LOCAL)
- 14/03/06 11:13:15 INFO TaskSetManager: Serialized task 0.0:0 as 1798 bytes in 5 ms
- 14/03/06 11:13:15 INFO Executor: Running task ID 0
- 14/03/06 11:13:15 DEBUG BlockManager: Getting local block broadcast_2
- 14/03/06 11:13:15 DEBUG BlockManager: Level for block broadcast_2 is StorageLevel(true, true, true, 1)
- 14/03/06 11:13:15 DEBUG BlockManager: Getting block broadcast_2 from memory
- 14/03/06 11:13:15 INFO BlockManager: Found block broadcast_2 locally
- 14/03/06 11:13:15 DEBUG Executor: Task 0's epoch is 0
- 14/03/06 11:13:15 DEBUG CacheManager: Looking for partition rdd_6_0
- 14/03/06 11:13:15 DEBUG BlockManager: Getting local block rdd_6_0
- 14/03/06 11:13:15 DEBUG BlockManager: Block rdd_6_0 not registered locally
- 14/03/06 11:13:15 DEBUG BlockManager: Getting remote block rdd_6_0
- 14/03/06 11:13:15 DEBUG BlockManager: Block rdd_6_0 not found
- 14/03/06 11:13:15 INFO CacheManager: Partition rdd_6_0 not found, computing it
- 14/03/06 11:13:15 INFO NewHadoopRDD: Input split: MongoInputSplit{URI=mongodb://127.0.0.1:27017/wellness.Profile, authURI=null, min={ }, max={ }, query={ }, sort={ }, fields={ }, notimeout=false}
- 14/03/06 11:13:15 INFO MongoRecordReader: Read 2.0 documents from:
- 14/03/06 11:13:15 INFO MongoRecordReader: MongoInputSplit{URI=mongodb://127.0.0.1:27017/wellness.Profile, authURI=null, min={ }, max={ }, query={ }, sort={ }, fields={ }, notimeout=false}
- 14/03/06 11:13:15 INFO MemoryStore: ensureFreeSpace(536) called with curMem=422572, maxMem=2202324172
- 14/03/06 11:13:15 INFO MemoryStore: Block rdd_6_0 stored as values to memory (estimated size 536.0 B, free 2.1 GB)
- 14/03/06 11:13:15 INFO BlockManagerMasterActor$BlockManagerInfo: Added rdd_6_0 in memory on 172.16.168.1:43022 (size: 536.0 B, free: 2.1 GB)
- 14/03/06 11:13:15 INFO BlockManagerMaster: Updated info of block rdd_6_0
- 14/03/06 11:13:15 DEBUG BlockManager: Told master about block rdd_6_0
- 14/03/06 11:13:15 DEBUG BlockManager: Put block rdd_6_0 locally took 4 ms
- 14/03/06 11:13:15 DEBUG BlockManager: Put for block rdd_6_0 without replication took 5 ms
- 14/03/06 11:13:15 INFO Executor: Serialized size of result for 0 is 563
- 14/03/06 11:13:15 INFO Executor: Sending result for 0 directly to driver
- 14/03/06 11:13:15 INFO Executor: Finished task ID 0
- 14/03/06 11:13:15 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
- 14/03/06 11:13:15 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
- 14/03/06 11:13:15 INFO TaskSetManager: Finished TID 0 in 69 ms on localhost (progress: 0/1)
- 14/03/06 11:13:15 INFO TaskSchedulerImpl: Remove TaskSet 0.0 from pool
- 14/03/06 11:13:15 INFO DAGScheduler: Completed ResultTask(0, 0)
- 14/03/06 11:13:15 INFO DAGScheduler: Stage 0 (count at <console>:13) finished in 0.075 s
- 14/03/06 11:13:15 DEBUG DAGScheduler: After removal of stage 0, remaining stages = 0
- 14/03/06 11:13:15 INFO SparkContext: Job finished: count at <console>:13, took 0.160291812 s
- res0: scala.this.Long = 1
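Editor's note: this first action succeeds because FilteredRDD[6] comes from a filter at Mongo.scala:28, i.e. code compiled into the assembly jar, so the executor deserializes the task without needing any REPL-generated classes. The NewHadoopRDD and MongoInputSplit lines show the read going through the mongo-hadoop connector. A rough sketch of how such a profile RDD could be built is below; the connection URI and the use of the new-API input format are visible in the log, while the Profile case class, the field mapping, and the final filter are assumptions standing in for the real Mongo.scala.

    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD
    import org.bson.BSONObject
    import com.mongodb.hadoop.MongoInputFormat

    // Assumed shape; only `email` is used later in the session.
    case class Profile(email: String)

    // Hedged sketch of how data.profile might be constructed; the real
    // Mongo.scala differs, but the log's NewHadoopRDD + MongoInputSplit
    // output implies this pattern.
    def profiles(sc: SparkContext): RDD[Profile] = {
      val conf = new Configuration()
      conf.set("mongo.input.uri", "mongodb://127.0.0.1:27017/wellness.Profile")
      sc.newAPIHadoopRDD(conf, classOf[MongoInputFormat], classOf[Object], classOf[BSONObject])
        .map { case (_, doc) => Profile(String.valueOf(doc.get("email"))) }
        .filter(_.email.nonEmpty)  // stand-in for the filter at Mongo.scala:28
    }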
- scala> data.profile.filter(p => p.email == "test@example.org").count()
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: parse(" data.profile.filter(p => p.email == "test@example.org").count()
- ") Some(List(data.profile.filter(((p) => p.email.$eq$eq("test@example.org"))).count()))
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: 68: Apply
- 63: Select
- 26: Apply
- 20: Select
- 12: Select
- 7: Ident
- 29: Function
- 27: ValDef
- 27: TypeTree
- -1: EmptyTree
- 40: Apply
- 40: Select
- 34: Select
- 32: Ident
- 43: Literal
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val res1 =
- data.profile.filter(p => p.email == "test@example.org").count()
- ") Some(List(val res1 = data.profile.filter(((p) => p.email.$eq$eq("test@example.org"))).count()))
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
- 11: TypeTree
- 93: Apply
- 88: Select
- 51: Apply
- 45: Select
- 37: Select
- 32: Ident
- 54: Function
- 52: ValDef
- 52: TypeTree
- -1: EmptyTree
- 65: Apply
- 65: Select
- 59: Select
- 57: Ident
- 68: Literal
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- import org.apache.spark.SparkContext._
- class $iwC extends Serializable {
- val $VAL2 = $line13.$read.INSTANCE;
- import $VAL2.$iw.$iw.$iw.$iw.`data`;
- class $iwC extends Serializable {
- val res1 =
- data.profile.filter(p => p.email == "test@example.org").count()
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- val $VAL2 = $line13.$read.INSTANCE;
- import $VAL2.$iw.$iw.$iw.$iw.data;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- val res1 = data.profile.filter(((p) => p.email.$eq$eq("test@example.org"))).count()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- val $VAL2 = $line13.$read.INSTANCE;
- import $VAL2.$iw.$iw.$iw.$iw.data;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- val res1 = data.profile.filter(((p) => p.email.==("test@example.org"))).count
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of res1 to <method> <stable> <accessor> val res1(): Long
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line15.$read.INSTANCE.$iw.$iw.$iw.$iw.`res1`
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw
- (""
- + "res1: scala.this.Long = " + scala.runtime.ScalaRunTime.replStringOf($line15.$read.INSTANCE.$iw.$iw.$iw.$iw.`res1`, 1000)
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line15.$read.INSTANCE.$iw.$iw.$iw.$iw.res1;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".$plus("res1: scala.this.Long = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line15.$read.INSTANCE.$iw.$iw.$iw.$iw.res1, 1000))
- }
- }))
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line15.$read.INSTANCE.$iw.$iw.$iw.$iw.res1;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".+("res1: scala.this.Long = ").+(scala.runtime.ScalaRunTime.replStringOf($line15.$read.INSTANCE.$iw.$iw.$iw.$iw.res1, 1000))
- }
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line15.$eval.$print()
- 14/03/06 11:13:21 INFO SparkContext: Starting job: count at <console>:13
- 14/03/06 11:13:21 INFO DAGScheduler: Got job 1 (count at <console>:13) with 1 output partitions (allowLocal=false)
- 14/03/06 11:13:21 INFO DAGScheduler: Final stage: Stage 1 (count at <console>:13)
- 14/03/06 11:13:21 INFO DAGScheduler: Parents of final stage: List()
- 14/03/06 11:13:21 INFO DAGScheduler: Missing parents: List()
- 14/03/06 11:13:21 DEBUG DAGScheduler: submitStage(Stage 1)
- 14/03/06 11:13:21 DEBUG DAGScheduler: missing: List()
- 14/03/06 11:13:21 INFO DAGScheduler: Submitting Stage 1 (FilteredRDD[32] at filter at <console>:13), which has no missing parents
- 14/03/06 11:13:21 DEBUG DAGScheduler: submitMissingTasks(Stage 1)
- 14/03/06 11:13:21 INFO DAGScheduler: Submitting 1 missing tasks from Stage 1 (FilteredRDD[32] at filter at <console>:13)
- 14/03/06 11:13:21 DEBUG DAGScheduler: New pending tasks: Set(ResultTask(1, 0))
- 14/03/06 11:13:21 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
- 14/03/06 11:13:21 DEBUG TaskSetManager: Epoch for TaskSet 1.0: 0
- 14/03/06 11:13:21 DEBUG TaskSetManager: Valid locality levels for TaskSet 1.0: ANY
- 14/03/06 11:13:21 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
- 14/03/06 11:13:21 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
- 14/03/06 11:13:21 INFO TaskSetManager: Starting task 1.0:0 as TID 1 on executor localhost: localhost (PROCESS_LOCAL)
- 14/03/06 11:13:21 INFO TaskSetManager: Serialized task 1.0:0 as 1833 bytes in 1 ms
- 14/03/06 11:13:21 INFO Executor: Running task ID 1
- 14/03/06 11:13:21 DEBUG BlockManager: Getting local block broadcast_2
- 14/03/06 11:13:21 DEBUG BlockManager: Level for block broadcast_2 is StorageLevel(true, true, true, 1)
- 14/03/06 11:13:21 DEBUG BlockManager: Getting block broadcast_2 from memory
- 14/03/06 11:13:21 INFO BlockManager: Found block broadcast_2 locally
- 14/03/06 11:13:21 ERROR Executor: Exception in task ID 1
- java.lang.ClassNotFoundException: $line15.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1
- at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
- at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
- at java.security.AccessController.doPrivileged(Native Method)
- at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
- at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
- at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
- at java.lang.Class.forName0(Native Method)
- at java.lang.Class.forName(Class.java:270)
- at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:37)
- at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
- at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
- at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
- at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
- at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
- at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
- at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
- at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
- at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
- at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
- at org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:63)
- at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:139)
- at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1837)
- at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
- at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
- at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
- at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
- at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:62)
- at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:195)
- at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:49)
- at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
- at java.lang.Thread.run(Thread.java:744)
- 14/03/06 11:13:21 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
- 14/03/06 11:13:21 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
- 14/03/06 11:13:21 WARN TaskSetManager: Lost TID 1 (task 1.0:0)
- 14/03/06 11:13:21 WARN TaskSetManager: Loss was due to java.lang.ClassNotFoundException
- java.lang.ClassNotFoundException: $line15.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1
- at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
- at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
- at java.security.AccessController.doPrivileged(Native Method)
- at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
- at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
- at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
- at java.lang.Class.forName0(Native Method)
- at java.lang.Class.forName(Class.java:270)
- at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:37)
- at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
- at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
- at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
- at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
- at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
- at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
- at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
- at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
- at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
- at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
- at org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:63)
- at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:139)
- at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1837)
- at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
- at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
- at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
- at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
- at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:62)
- at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:195)
- at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:49)
- at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
- at java.lang.Thread.run(Thread.java:744)
- 14/03/06 11:13:21 ERROR TaskSetManager: Task 1.0:0 failed 1 times; aborting job
- 14/03/06 11:13:21 INFO TaskSchedulerImpl: Remove TaskSet 1.0 from pool
- 14/03/06 11:13:21 DEBUG DAGScheduler: Removing running stage 1
- 14/03/06 11:13:21 INFO DAGScheduler: Failed to run count at <console>:13
- 14/03/06 11:13:21 DEBUG DAGScheduler: Removing pending status for stage 1
- 14/03/06 11:13:21 DEBUG DAGScheduler: After removal of stage 1, remaining stages = 0
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- var value: java.lang.Throwable = _
- def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- <mutable> <defaultinit> var value: java.lang.Throwable = _;
- def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
- }))
- 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- <mutable> <defaultinit> var value: java.lang.Throwable = _;
- def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static void $line16.$eval.set(java.lang.Object)
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: with args: org.apache.spark.SparkException: Job aborted: Task 1.0:0 failed 1 times (most recent failure: Exception failure: java.lang.ClassNotFoundException: $line15.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1)
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: Interpreting: val lastException = $line16.$eval.value
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val lastException = $line16.$eval.value
- ") Some(List(val lastException = $line16.$eval.value))
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
- 11: TypeTree
- 41: Select
- 35: Select
- 27: Ident
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- class $read extends Serializable {
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- import org.apache.spark.SparkContext._
- class $iwC extends Serializable {
- class $iwC extends Serializable {
- val lastException = $line16.$eval.value
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- val $iw = new $iwC;
- }
- object $read {
- val INSTANCE = new $read();
- }
- ") Some(List(class $read extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>();
- ()
- };
- val lastException = $line16.$eval.value
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- };
- val $iw = new $iwC.<init>()
- }, object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- val INSTANCE = new $read.<init>()
- }))
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- import org.apache.spark.SparkContext._;
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- class $iwC extends Serializable {
- def <init>() = {
- super.<init>;
- ()
- };
- val lastException = $line16.$eval.value
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- };
- val $iw = new $iwC.<init>
- }
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- val INSTANCE = new $read.<init>
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of lastException to <method> <stable> <accessor> val lastException(): Throwable
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: parse("
- object $eval {
- lazy val $result = $line17.$read.INSTANCE.$iw.$iw.$iw.$iw.`lastException`
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw
- (""
- + "lastException: lang.this.Throwable = " + scala.runtime.ScalaRunTime.replStringOf($line17.$read.INSTANCE.$iw.$iw.$iw.$iw.`lastException`, 1000)
- )
- }
- }
- ") Some(List(object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>();
- ()
- };
- lazy val $result = $line17.$read.INSTANCE.$iw.$iw.$iw.$iw.lastException;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".$plus("lastException: lang.this.Throwable = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line17.$read.INSTANCE.$iw.$iw.$iw.$iw.lastException, 1000))
- }
- }))
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
- def <init>() = {
- super.<init>;
- ()
- };
- lazy val $result = $line17.$read.INSTANCE.$iw.$iw.$iw.$iw.lastException;
- val $print: String = {
- $read.INSTANCE.$iw.$iw.$iw.$iw;
- "".+("lastException: lang.this.Throwable = ").+(scala.runtime.ScalaRunTime.replStringOf($line17.$read.INSTANCE.$iw.$iw.$iw.$iw.lastException, 1000))
- }
- }
- [running phase parser on <console>]
- [running phase namer on <console>]
- [running phase packageobjects on <console>]
- [running phase typer on <console>]
- [running phase patmat on <console>]
- [running phase repl on <console>]
- [running phase superaccessors on <console>]
- [running phase extmethods on <console>]
- [running phase pickler on <console>]
- [running phase refchecks on <console>]
- [running phase uncurry on <console>]
- [running phase tailcalls on <console>]
- [running phase specialize on <console>]
- [running phase explicitouter on <console>]
- [running phase erasure on <console>]
- [running phase posterasure on <console>]
- [running phase lazyvals on <console>]
- [running phase lambdalift on <console>]
- [running phase constructors on <console>]
- [running phase flatten on <console>]
- [running phase mixin on <console>]
- [running phase cleanup on <console>]
- [running phase icode on <console>]
- [running phase inliner on <console>]
- [running phase inlineExceptionHandlers on <console>]
- [running phase closelim on <console>]
- [running phase dce on <console>]
- [running phase jvm on icode]
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line17.$eval.$print()
- 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: Redefining term 'lastException'
- lang.this.Throwable -> lang.this.Throwable
- org.apache.spark.SparkException: Job aborted: Task 1.0:0 failed 1 times (most recent failure: Exception failure: java.lang.ClassNotFoundException: $iwC$$iwC$$iwC$$iwC$$anonfun$1)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1028)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1026)
- at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
- at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
- at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1026)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:619)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:619)
- at scala.Option.foreach(Option.scala:236)
- at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:619)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:207)
- at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
- at akka.actor.ActorCell.invoke(ActorCell.scala:456)
- at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
- at akka.dispatch.Mailbox.run(Mailbox.scala:219)
- at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
- at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
- at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
- at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
- at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
- scala>
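Editor's note: the second action fails where the first succeeded because its closure, p => p.email == "test@example.org", is compiled by the interpreter itself into $line15.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1. That class only exists in the interpreter's output directory on the driver, so the executor's deserializer cannot resolve it and the task dies with ClassNotFoundException before running. The stock spark-shell avoids this by serving the interpreter's output directory over HTTP and pointing executors at it via spark.repl.class.uri, which an ExecutorClassLoader then consults. A hedged sketch of the corresponding wiring for an embedded REPL is below; classServerUri stands for the interpreter's class-server address (Spark 0.9's SparkIMain exposes one), and whether wellmo-reporting's interactive mode plumbs it through like this is an assumption. Another workaround in the meantime is to keep closures over RDD elements in pre-compiled code shipped in the assembly jar, as the Mongo.scala filter above already is.

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch: make REPL-compiled closure classes loadable by executors.
    // classServerUri is assumed to be the embedded interpreter's HTTP class
    // server address; spark.repl.class.uri is the property the executor-side
    // class loader reads.
    def createInteractiveContext(classServerUri: String): SparkContext = {
      val conf = new SparkConf()
        .setMaster("local")
        .setAppName("wellmo-reporting interactive")
        .set("spark.repl.class.uri", classServerUri)
      new SparkContext(conf)
    }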