Spark console logging
plaa
Mar 6th, 2014
  1. work@sampo-think ~/workspace/wellmo-reporting $ java -jar target/scala-2.10/wellmo-reporting-assembly-1.0.jar interactive
  2. 14/03/06 11:13:02 WARN Utils: Your hostname, sampo-think resolves to a loopback address: 127.0.1.1; using 172.16.168.1 instead (on interface vmnet8)
  3. 14/03/06 11:13:02 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
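The two warnings above mean Spark could not use the hostname (it resolves to a loopback address) and auto-picked the vmnet8 address instead. If that choice is wrong, the bind address can be pinned before launch; a minimal sketch (the address below is illustrative, not from the log):

```shell
# Pin Spark's bind address instead of letting it auto-detect one.
# 127.0.0.1 is only an example; substitute the interface you actually want.
export SPARK_LOCAL_IP=127.0.0.1

# then launch as above:
# java -jar target/scala-2.10/wellmo-reporting-assembly-1.0.jar interactive
```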
  4. 14/03/06 11:13:03 INFO Slf4jLogger: Slf4jLogger started
  5. 14/03/06 11:13:03 INFO Remoting: Starting remoting
  6. 14/03/06 11:13:03 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@172.16.168.1:33053]
  7. 14/03/06 11:13:03 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@172.16.168.1:33053]
  8. 14/03/06 11:13:03 INFO SparkEnv: Registering BlockManagerMaster
  9. 14/03/06 11:13:03 DEBUG DiskBlockManager: Creating local directories at root dirs '/tmp'
  10. 14/03/06 11:13:03 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20140306111303-ae28
  11. 14/03/06 11:13:03 INFO MemoryStore: MemoryStore started with capacity 2.1 GB.
  12. 14/03/06 11:13:03 INFO ConnectionManager: Bound socket to port 43022 with id = ConnectionManagerId(172.16.168.1,43022)
  13. 14/03/06 11:13:03 INFO BlockManagerMaster: Trying to register BlockManager
  14. 14/03/06 11:13:03 INFO BlockManagerMasterActor$BlockManagerInfo: Registering block manager 172.16.168.1:43022 with 2.1 GB RAM
  15. 14/03/06 11:13:03 INFO BlockManagerMaster: Registered BlockManager
  16. 14/03/06 11:13:03 INFO HttpServer: Starting HTTP Server
  17. 14/03/06 11:13:03 INFO HttpBroadcast: Broadcast server started at http://172.16.168.1:56274
  18. 14/03/06 11:13:03 INFO SparkEnv: Registering MapOutputTracker
  19. 14/03/06 11:13:03 INFO HttpFileServer: HTTP File server directory is /tmp/spark-551a7446-963c-485c-911a-191c90ddf384
  20. 14/03/06 11:13:03 INFO HttpServer: Starting HTTP Server
  21. 14/03/06 11:13:03 INFO SparkUI: Started Spark Web UI at http://172.16.168.1:4040
  22. 14/03/06 11:13:03 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
  23. 14/03/06 11:13:03 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
  24. 14/03/06 11:13:03 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
  25. 14/03/06 11:13:03 DEBUG KerberosName: Kerberos krb5 configuration not found, setting default realm to empty
  26. 14/03/06 11:13:03 DEBUG Groups: Creating new Groups object
  27. 14/03/06 11:13:03 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
  28. 14/03/06 11:13:03 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
  29. 14/03/06 11:13:03 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
  30. 14/03/06 11:13:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
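The native-hadoop warning is benign — Hadoop falls back to its built-in Java implementations — but if a compiled `libhadoop` is available it can be made visible to the JVM. A sketch assuming the conventional `$HADOOP_HOME/lib/native` layout (the path is an assumption, not from the log):

```shell
# Make libhadoop visible via the JVM's library search path
# (on Linux JVMs, LD_LIBRARY_PATH is folded into java.library.path).
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH"

# equivalently, per invocation:
# java -Djava.library.path="$HADOOP_HOME/lib/native" -jar <assembly-jar> interactive
```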
  31. 14/03/06 11:13:03 DEBUG JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
  32. 14/03/06 11:13:03 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
  33. 14/03/06 11:13:03 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
  34. 14/03/06 11:13:04 INFO MemoryStore: ensureFreeSpace(140816) called with curMem=0, maxMem=2202324172
  35. 14/03/06 11:13:04 INFO MemoryStore: Block broadcast_0 stored as values to memory (estimated size 137.5 KB, free 2.1 GB)
  36. 14/03/06 11:13:04 DEBUG BlockManager: Put block broadcast_0 locally took 74 ms
  37. 14/03/06 11:13:04 DEBUG BlockManager: Put for block broadcast_0 without replication took 74 ms
  38. 14/03/06 11:13:04 INFO MemoryStore: ensureFreeSpace(140878) called with curMem=140816, maxMem=2202324172
  39. 14/03/06 11:13:04 INFO MemoryStore: Block broadcast_1 stored as values to memory (estimated size 137.6 KB, free 2.1 GB)
  40. 14/03/06 11:13:04 DEBUG BlockManager: Put block broadcast_1 locally took 12 ms
  41. 14/03/06 11:13:04 DEBUG BlockManager: Put for block broadcast_1 without replication took 12 ms
  42. 14/03/06 11:13:04 INFO MemoryStore: ensureFreeSpace(140878) called with curMem=281694, maxMem=2202324172
  43. 14/03/06 11:13:04 INFO MemoryStore: Block broadcast_2 stored as values to memory (estimated size 137.6 KB, free 2.1 GB)
  44. 14/03/06 11:13:04 DEBUG BlockManager: Put block broadcast_2 locally took 13 ms
  45. 14/03/06 11:13:04 DEBUG BlockManager: Put for block broadcast_2 without replication took 13 ms
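The MemoryStore bookkeeping above can be checked by hand: each `ensureFreeSpace` call reports the incoming block's size and the bytes already used (`curMem`), and the "free" figure is just capacity minus usage. Reproducing the numbers from the log:

```shell
# Numbers taken directly from the log lines above.
MAX_MEM=2202324172                       # reported as "capacity 2.1 GB"
CUR_MEM=$((140816 + 140878 + 140878))    # broadcast_0..broadcast_2 sizes
FREE=$((MAX_MEM - CUR_MEM))
# Format the way Spark does (one decimal, GB = 1024^3 bytes):
awk -v b="$FREE" 'BEGIN { printf "%.1f GB free\n", b / (1024^3) }'
# prints "2.1 GB free"
```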
  46. 14/03/06 11:13:04 DEBUG UserGroupInformation: hadoop login
  47. 14/03/06 11:13:04 DEBUG UserGroupInformation: hadoop login commit
  48. 14/03/06 11:13:04 DEBUG UserGroupInformation: using local user:UnixPrincipal: work
  49. 14/03/06 11:13:04 DEBUG UserGroupInformation: UGI loginUser:work (auth:SIMPLE)
  50. 14/03/06 11:13:04 INFO MongoInputFormat: Using com.mongodb.hadoop.splitter.StandaloneMongoSplitter@50e3fa1d to calculate splits.
  51. 14/03/06 11:13:04 INFO StandaloneMongoSplitter: Running splitvector to check splits against mongodb://127.0.0.1:27017/wellness.Data
  52. 14/03/06 11:13:04 WARN StandaloneMongoSplitter: WARNING: No Input Splits were calculated by the split code. Proceeding with a *single* split. Data may be too small, try lowering 'mongo.input.split_size' if this is undesirable.
  53. 14/03/06 11:13:04 INFO MongoCollectionSplitter: Created split: min=null, max= null
  54. 14/03/06 11:13:04 INFO MongoInputFormat: Using com.mongodb.hadoop.splitter.StandaloneMongoSplitter@4cdae43e to calculate splits.
  55. 14/03/06 11:13:04 INFO StandaloneMongoSplitter: Running splitvector to check splits against mongodb://127.0.0.1:27017/wellness.Profile
  56. 14/03/06 11:13:04 WARN StandaloneMongoSplitter: WARNING: No Input Splits were calculated by the split code. Proceeding with a *single* split. Data may be too small, try lowering 'mongo.input.split_size' if this is undesirable.
  57. 14/03/06 11:13:04 INFO MongoCollectionSplitter: Created split: min=null, max= null
  58. 14/03/06 11:13:04 INFO MongoInputFormat: Using com.mongodb.hadoop.splitter.StandaloneMongoSplitter@220bd7cb to calculate splits.
  59. 14/03/06 11:13:04 INFO StandaloneMongoSplitter: Running splitvector to check splits against mongodb://127.0.0.1:27017/wellness.Analytics
  60. 14/03/06 11:13:04 WARN StandaloneMongoSplitter: WARNING: No Input Splits were calculated by the split code. Proceeding with a *single* split. Data may be too small, try lowering 'mongo.input.split_size' if this is undesirable.
  61. 14/03/06 11:13:04 INFO MongoCollectionSplitter: Created split: min=null, max= null
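The splitter warning names its own remedy: each collection fit in a single split, and `mongo.input.split_size` (in MB) can be lowered if more parallelism is wanted. A sketch of the relevant mongo-hadoop properties as they might appear on the job's Hadoop Configuration — the values are illustrative, not taken from this run:

```
# mongo-hadoop input properties (values illustrative)
mongo.input.uri = mongodb://127.0.0.1:27017/wellness.Data
mongo.input.split_size = 4
```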
  62. 14/03/06 11:13:04 DEBUG CoGroupedRDD: Adding shuffle dependency with MappedRDD[7] at map at Mongo.scala:30
  63. 14/03/06 11:13:04 DEBUG CoGroupedRDD: Adding shuffle dependency with MappedRDD[8] at map at Mongo.scala:32
  64. 14/03/06 11:13:04 DEBUG CoGroupedRDD: Adding shuffle dependency with MappedRDD[7] at map at Mongo.scala:30
  65. 14/03/06 11:13:04 DEBUG CoGroupedRDD: Adding shuffle dependency with MappedRDD[13] at map at Mongo.scala:33
  66. Starting the interactive shell
  67. 14/03/06 11:13:04 INFO HttpServer: Starting HTTP Server
  68. [running phase parser on <init>]
  69. [running phase namer on <init>]
  70. [running phase packageobjects on <init>]
  71. [running phase typer on <init>]
  72. [running phase patmat on <init>]
  73. [running phase repl on <init>]
  74. [running phase superaccessors on <init>]
  75. [running phase extmethods on <init>]
  76. [running phase pickler on <init>]
  77. [running phase refchecks on <init>]
  78. [running phase uncurry on <init>]
  79. [running phase tailcalls on <init>]
  80. [running phase specialize on <init>]
  81. [running phase explicitouter on <init>]
  82. [running phase erasure on <init>]
  83. [running phase posterasure on <init>]
  84. [running phase lazyvals on <init>]
  85. [running phase lambdalift on <init>]
  86. [running phase constructors on <init>]
  87. [running phase flatten on <init>]
  88. [running phase mixin on <init>]
  89. [running phase cleanup on <init>]
  90. [running phase icode on <init>]
  91. [running phase inliner on <init>]
  92. [running phase inlineExceptionHandlers on <init>]
  93. [running phase closelim on <init>]
  94. [running phase dce on <init>]
  95. [running phase jvm on icode]
  96. 14/03/06 11:13:06 DEBUG Repl$$anon$1: Clearing 6 thunks.
  97. 14/03/06 11:13:06 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  98. object $eval {
  99. var value: org.apache.spark.repl.SparkIMain = _
  100. def set(x: Any) = value = x.asInstanceOf[org.apache.spark.repl.SparkIMain]
  101. }
  102. ") Some(List(object $eval extends scala.AnyRef {
  103. def <init>() = {
  104. super.<init>();
  105. ()
  106. };
  107. <mutable> <defaultinit> var value: org.apache.spark.repl.SparkIMain = _;
  108. def set(x: Any) = value = x.asInstanceOf[org.apache.spark.repl.SparkIMain]
  109. }))
  110. 14/03/06 11:13:06 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
  111. def <init>() = {
  112. super.<init>;
  113. ()
  114. };
  115. <mutable> <defaultinit> var value: org.apache.spark.repl.SparkIMain = _;
  116. def set(x: Any) = value = x.asInstanceOf[org.apache.spark.repl.SparkIMain]
  117. }
  118. [running phase parser on <console>]
  119. [running phase namer on <console>]
  120. [running phase packageobjects on <console>]
  121. [running phase typer on <console>]
  122. [running phase patmat on <console>]
  123. [running phase repl on <console>]
  124. [running phase superaccessors on <console>]
  125. [running phase extmethods on <console>]
  126. [running phase pickler on <console>]
  127. [running phase refchecks on <console>]
  128. [running phase uncurry on <console>]
  129. [running phase tailcalls on <console>]
  130. [running phase specialize on <console>]
  131. [running phase explicitouter on <console>]
  132. [running phase erasure on <console>]
  133. [running phase posterasure on <console>]
  134. [running phase lazyvals on <console>]
  135. [running phase lambdalift on <console>]
  136. [running phase constructors on <console>]
  137. [running phase flatten on <console>]
  138. [running phase mixin on <console>]
  139. [running phase cleanup on <console>]
  140. [running phase icode on <console>]
  141. [running phase inliner on <console>]
  142. [running phase inlineExceptionHandlers on <console>]
  143. [running phase closelim on <console>]
  144. [running phase dce on <console>]
  145. [running phase jvm on icode]
  146. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static void $line1.$eval.set(java.lang.Object)
  147. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: with args: org.apache.spark.repl.SparkILoop$SparkILoopInterpreter@6afd37be
  148. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: Interpreting: val $intp = $line1.$eval.value
  149. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val $intp = $line1.$eval.value
  150. ") Some(List(val $intp = $line1.$eval.value))
  151. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
  152. 11: TypeTree
  153. 32: Select
  154. 26: Select
  155. 19: Ident
  156.  
  157. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  158. class $read extends Serializable {
  159. class $iwC extends Serializable {
  160. class $iwC extends Serializable {
  161. val $intp = $line1.$eval.value
  162.  
  163.  
  164.  
  165. }
  166. val $iw = new $iwC;
  167. }
  168. val $iw = new $iwC;
  169.  
  170. }
  171. object $read {
  172. val INSTANCE = new $read();
  173. }
  174.  
  175. ") Some(List(class $read extends Serializable {
  176. def <init>() = {
  177. super.<init>();
  178. ()
  179. };
  180. class $iwC extends Serializable {
  181. def <init>() = {
  182. super.<init>();
  183. ()
  184. };
  185. class $iwC extends Serializable {
  186. def <init>() = {
  187. super.<init>();
  188. ()
  189. };
  190. val $intp = $line1.$eval.value
  191. };
  192. val $iw = new $iwC.<init>()
  193. };
  194. val $iw = new $iwC.<init>()
  195. }, object $read extends scala.AnyRef {
  196. def <init>() = {
  197. super.<init>();
  198. ()
  199. };
  200. val INSTANCE = new $read.<init>()
  201. }))
  202. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
  203. def <init>() = {
  204. super.<init>;
  205. ()
  206. };
  207. class $iwC extends Serializable {
  208. def <init>() = {
  209. super.<init>;
  210. ()
  211. };
  212. class $iwC extends Serializable {
  213. def <init>() = {
  214. super.<init>;
  215. ()
  216. };
  217. val $intp = $line1.$eval.value
  218. };
  219. val $iw = new $iwC.<init>
  220. };
  221. val $iw = new $iwC.<init>
  222. }
  223. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
  224. def <init>() = {
  225. super.<init>;
  226. ()
  227. };
  228. val INSTANCE = new $read.<init>
  229. }
  230. [running phase parser on <console>]
  231. [running phase namer on <console>]
  232. [running phase packageobjects on <console>]
  233. [running phase typer on <console>]
  234. [running phase patmat on <console>]
  235. [running phase repl on <console>]
  236. [running phase superaccessors on <console>]
  237. [running phase extmethods on <console>]
  238. [running phase pickler on <console>]
  239. [running phase refchecks on <console>]
  240. [running phase uncurry on <console>]
  241. [running phase tailcalls on <console>]
  242. [running phase specialize on <console>]
  243. [running phase explicitouter on <console>]
  244. [running phase erasure on <console>]
  245. [running phase posterasure on <console>]
  246. [running phase lazyvals on <console>]
  247. [running phase lambdalift on <console>]
  248. [running phase constructors on <console>]
  249. [running phase flatten on <console>]
  250. [running phase mixin on <console>]
  251. [running phase cleanup on <console>]
  252. [running phase icode on <console>]
  253. [running phase inliner on <console>]
  254. [running phase inlineExceptionHandlers on <console>]
  255. [running phase closelim on <console>]
  256. [running phase dce on <console>]
  257. [running phase jvm on icode]
  258. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of $intp to <method> <stable> <accessor> val $intp(): org.apache.spark.repl.SparkIMain
  259. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  260. object $eval {
  261. lazy val $result = $line2.$read.INSTANCE.$iw.$iw.`$intp`
  262. val $print: String = {
  263. $read.INSTANCE.$iw.$iw
  264. (""
  265.  
  266. + "$intp: repl.this.SparkIMain = " + scala.runtime.ScalaRunTime.replStringOf($line2.$read.INSTANCE.$iw.$iw.`$intp`, 1000)
  267.  
  268. )
  269. }
  270. }
  271.  
  272. ") Some(List(object $eval extends scala.AnyRef {
  273. def <init>() = {
  274. super.<init>();
  275. ()
  276. };
  277. lazy val $result = $line2.$read.INSTANCE.$iw.$iw.$intp;
  278. val $print: String = {
  279. $read.INSTANCE.$iw.$iw;
  280. "".$plus("$intp: repl.this.SparkIMain = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line2.$read.INSTANCE.$iw.$iw.$intp, 1000))
  281. }
  282. }))
  283. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
  284. def <init>() = {
  285. super.<init>;
  286. ()
  287. };
  288. lazy val $result = $line2.$read.INSTANCE.$iw.$iw.$intp;
  289. val $print: String = {
  290. $read.INSTANCE.$iw.$iw;
  291. "".+("$intp: repl.this.SparkIMain = ").+(scala.runtime.ScalaRunTime.replStringOf($line2.$read.INSTANCE.$iw.$iw.$intp, 1000))
  292. }
  293. }
  294. [running phase parser on <console>]
  295. [running phase namer on <console>]
  296. [running phase packageobjects on <console>]
  297. [running phase typer on <console>]
  298. [running phase patmat on <console>]
  299. [running phase repl on <console>]
  300. [running phase superaccessors on <console>]
  301. [running phase extmethods on <console>]
  302. [running phase pickler on <console>]
  303. [running phase refchecks on <console>]
  304. [running phase uncurry on <console>]
  305. [running phase tailcalls on <console>]
  306. [running phase specialize on <console>]
  307. [running phase explicitouter on <console>]
  308. [running phase erasure on <console>]
  309. [running phase posterasure on <console>]
  310. [running phase lazyvals on <console>]
  311. [running phase lambdalift on <console>]
  312. [running phase constructors on <console>]
  313. [running phase flatten on <console>]
  314. [running phase mixin on <console>]
  315. [running phase cleanup on <console>]
  316. [running phase icode on <console>]
  317. [running phase inliner on <console>]
  318. [running phase inlineExceptionHandlers on <console>]
  319. [running phase closelim on <console>]
  320. [running phase dce on <console>]
  321. [running phase jvm on icode]
  322. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line2.$eval.$print()
  323. Welcome to
  324.       ____              __
  325.      / __/__  ___ _____/ /__
  326.     _\ \/ _ \/ _ `/ __/ '_/
  327.    /___/ .__/\_,_/_/ /_/\_\   version 0.9.0
  328.       /_/
  329.  
  330. Using Scala version 2.10.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_51)
  331. Type in expressions to have them evaluated.
  332. Type :help for more information.
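Most of the DEBUG noise in this log comes from log4j's root logger level (the `[running phase …]` lines are the Scala compiler's own verbose output and are controlled separately). A sketch of a `log4j.properties` that would quiet the log4j side, assuming Spark 0.9's log4j 1.x setup and a `log4j.properties` on the classpath — the appender name and logger levels below are assumptions, not the project's actual config:

```
# log4j.properties (appender name and levels are assumptions)
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# keep Spark's own startup lines, silence Hadoop's DEBUG chatter
log4j.logger.org.apache.spark=INFO
log4j.logger.org.apache.hadoop=WARN
```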
  333. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  334. @transient val sc = org.apache.spark.repl.Main.interp.createSparkContext();
  335.  
  336. ") Some(List(@new transient.<init>() val sc = org.apache.spark.repl.Main.interp.createSparkContext()))
  337. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: 39: ValDef
  338. 25: Apply
  339. 25: Select
  340. 25: New
  341. 25: Ident
  342. 39: TypeTree
  343. 96: Apply
  344. 78: Select
  345. 71: Select
  346. 66: Select
  347. 61: Select
  348. 55: Select
  349. 48: Select
  350. 44: Ident
  351.  
  352. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  353. class $read extends Serializable {
  354. class $iwC extends Serializable {
  355. class $iwC extends Serializable {
  356.  
  357. @transient val sc = org.apache.spark.repl.Main.interp.createSparkContext();
  358.  
  359.  
  360.  
  361.  
  362. }
  363. val $iw = new $iwC;
  364. }
  365. val $iw = new $iwC;
  366.  
  367. }
  368. object $read {
  369. val INSTANCE = new $read();
  370. }
  371.  
  372. ") Some(List(class $read extends Serializable {
  373. def <init>() = {
  374. super.<init>();
  375. ()
  376. };
  377. class $iwC extends Serializable {
  378. def <init>() = {
  379. super.<init>();
  380. ()
  381. };
  382. class $iwC extends Serializable {
  383. def <init>() = {
  384. super.<init>();
  385. ()
  386. };
  387. @new transient.<init>() val sc = org.apache.spark.repl.Main.interp.createSparkContext()
  388. };
  389. val $iw = new $iwC.<init>()
  390. };
  391. val $iw = new $iwC.<init>()
  392. }, object $read extends scala.AnyRef {
  393. def <init>() = {
  394. super.<init>();
  395. ()
  396. };
  397. val INSTANCE = new $read.<init>()
  398. }))
  399. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
  400. def <init>() = {
  401. super.<init>;
  402. ()
  403. };
  404. class $iwC extends Serializable {
  405. def <init>() = {
  406. super.<init>;
  407. ()
  408. };
  409. class $iwC extends Serializable {
  410. def <init>() = {
  411. super.<init>;
  412. ()
  413. };
  414. @new transient.<init>() val sc = org.apache.spark.repl.Main.interp.createSparkContext
  415. };
  416. val $iw = new $iwC.<init>
  417. };
  418. val $iw = new $iwC.<init>
  419. }
  420. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
  421. def <init>() = {
  422. super.<init>;
  423. ()
  424. };
  425. val INSTANCE = new $read.<init>
  426. }
  427. [running phase parser on <console>]
  428. [running phase namer on <console>]
  429. [running phase packageobjects on <console>]
  430. [running phase typer on <console>]
  431. [running phase patmat on <console>]
  432. [running phase repl on <console>]
  433. [running phase superaccessors on <console>]
  434. [running phase extmethods on <console>]
  435. [running phase pickler on <console>]
  436. [running phase refchecks on <console>]
  437. [running phase uncurry on <console>]
  438. [running phase tailcalls on <console>]
  439. [running phase specialize on <console>]
  440. [running phase explicitouter on <console>]
  441. [running phase erasure on <console>]
  442. [running phase posterasure on <console>]
  443. [running phase lazyvals on <console>]
  444. [running phase lambdalift on <console>]
  445. [running phase constructors on <console>]
  446. [running phase flatten on <console>]
  447. [running phase mixin on <console>]
  448. [running phase cleanup on <console>]
  449. [running phase icode on <console>]
  450. [running phase inliner on <console>]
  451. [running phase inlineExceptionHandlers on <console>]
  452. [running phase closelim on <console>]
  453. [running phase dce on <console>]
  454. [running phase jvm on icode]
  455. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of sc to <method> <stable> <accessor> val sc(): org.apache.spark.SparkContext
  456. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  457. object $eval {
  458. lazy val $result = $line3.$read.INSTANCE.$iw.$iw.`sc`
  459. val $print: String = {
  460. $read.INSTANCE.$iw.$iw
  461. (""
  462.  
  463. + "sc: spark.this.SparkContext = " + scala.runtime.ScalaRunTime.replStringOf($line3.$read.INSTANCE.$iw.$iw.`sc`, 1000)
  464.  
  465. )
  466. }
  467. }
  468.  
  469. ") Some(List(object $eval extends scala.AnyRef {
  470. def <init>() = {
  471. super.<init>();
  472. ()
  473. };
  474. lazy val $result = $line3.$read.INSTANCE.$iw.$iw.sc;
  475. val $print: String = {
  476. $read.INSTANCE.$iw.$iw;
  477. "".$plus("sc: spark.this.SparkContext = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line3.$read.INSTANCE.$iw.$iw.sc, 1000))
  478. }
  479. }))
  480. 14/03/06 11:13:07 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
  481. def <init>() = {
  482. super.<init>;
  483. ()
  484. };
  485. lazy val $result = $line3.$read.INSTANCE.$iw.$iw.sc;
  486. val $print: String = {
  487. $read.INSTANCE.$iw.$iw;
  488. "".+("sc: spark.this.SparkContext = ").+(scala.runtime.ScalaRunTime.replStringOf($line3.$read.INSTANCE.$iw.$iw.sc, 1000))
  489. }
  490. }
  491. [running phase parser on <console>]
  492. [running phase namer on <console>]
  493. [running phase packageobjects on <console>]
  494. [running phase typer on <console>]
  495. [running phase patmat on <console>]
  496. [running phase repl on <console>]
  497. [running phase superaccessors on <console>]
  498. [running phase extmethods on <console>]
  499. [running phase pickler on <console>]
  500. [running phase refchecks on <console>]
  501. [running phase uncurry on <console>]
  502. [running phase tailcalls on <console>]
  503. [running phase specialize on <console>]
  504. [running phase explicitouter on <console>]
  505. [running phase erasure on <console>]
  506. [running phase posterasure on <console>]
  507. [running phase lazyvals on <console>]
  508. [running phase lambdalift on <console>]
  509. [running phase constructors on <console>]
  510. [running phase flatten on <console>]
  511. [running phase mixin on <console>]
  512. [running phase cleanup on <console>]
  513. [running phase icode on <console>]
  514. [running phase inliner on <console>]
  515. [running phase inlineExceptionHandlers on <console>]
  516. [running phase closelim on <console>]
  517. [running phase dce on <console>]
  518. [running phase jvm on icode]
  519. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line3.$eval.$print()
  520. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  521. object $eval {
  522. var value: java.lang.Throwable = _
  523. def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
  524. }
  525. ") Some(List(object $eval extends scala.AnyRef {
  526. def <init>() = {
  527. super.<init>();
  528. ()
  529. };
  530. <mutable> <defaultinit> var value: java.lang.Throwable = _;
  531. def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
  532. }))
  533. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
  534. def <init>() = {
  535. super.<init>;
  536. ()
  537. };
  538. <mutable> <defaultinit> var value: java.lang.Throwable = _;
  539. def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
  540. }
  541. [running phase parser on <console>]
  542. [running phase namer on <console>]
  543. [running phase packageobjects on <console>]
  544. [running phase typer on <console>]
  545. [running phase patmat on <console>]
  546. [running phase repl on <console>]
  547. [running phase superaccessors on <console>]
  548. [running phase extmethods on <console>]
  549. [running phase pickler on <console>]
  550. [running phase refchecks on <console>]
  551. [running phase uncurry on <console>]
  552. [running phase tailcalls on <console>]
  553. [running phase specialize on <console>]
  554. [running phase explicitouter on <console>]
  555. [running phase erasure on <console>]
  556. [running phase posterasure on <console>]
  557. [running phase lazyvals on <console>]
  558. [running phase lambdalift on <console>]
  559. [running phase constructors on <console>]
  560. [running phase flatten on <console>]
  561. [running phase mixin on <console>]
  562. [running phase cleanup on <console>]
  563. [running phase icode on <console>]
  564. [running phase inliner on <console>]
  565. [running phase inlineExceptionHandlers on <console>]
  566. [running phase closelim on <console>]
  567. [running phase dce on <console>]
  568. [running phase jvm on icode]
  569. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static void $line4.$eval.set(java.lang.Object)
  570. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: with args: java.lang.NullPointerException
  571. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Interpreting: val lastException = $line4.$eval.value
  572. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val lastException = $line4.$eval.value
  573. ") Some(List(val lastException = $line4.$eval.value))
  574. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
  575. 11: TypeTree
  576. 40: Select
  577. 34: Select
  578. 27: Ident
  579.  
  580. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  581. class $read extends Serializable {
  582. class $iwC extends Serializable {
  583. class $iwC extends Serializable {
  584. val lastException = $line4.$eval.value
  585.  
  586.  
  587.  
  588. }
  589. val $iw = new $iwC;
  590. }
  591. val $iw = new $iwC;
  592.  
  593. }
  594. object $read {
  595. val INSTANCE = new $read();
  596. }
  597.  
  598. ") Some(List(class $read extends Serializable {
  599. def <init>() = {
  600. super.<init>();
  601. ()
  602. };
  603. class $iwC extends Serializable {
  604. def <init>() = {
  605. super.<init>();
  606. ()
  607. };
  608. class $iwC extends Serializable {
  609. def <init>() = {
  610. super.<init>();
  611. ()
  612. };
  613. val lastException = $line4.$eval.value
  614. };
  615. val $iw = new $iwC.<init>()
  616. };
  617. val $iw = new $iwC.<init>()
  618. }, object $read extends scala.AnyRef {
  619. def <init>() = {
  620. super.<init>();
  621. ()
  622. };
  623. val INSTANCE = new $read.<init>()
  624. }))
  625. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
  626. def <init>() = {
  627. super.<init>;
  628. ()
  629. };
  630. class $iwC extends Serializable {
  631. def <init>() = {
  632. super.<init>;
  633. ()
  634. };
  635. class $iwC extends Serializable {
  636. def <init>() = {
  637. super.<init>;
  638. ()
  639. };
  640. val lastException = $line4.$eval.value
  641. };
  642. val $iw = new $iwC.<init>
  643. };
  644. val $iw = new $iwC.<init>
  645. }
  646. 14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
  647. def <init>() = {
  648. super.<init>;
  649. ()
  650. };
  651. val INSTANCE = new $read.<init>
  652. }
  653. [running phase parser on <console>]
  654. [running phase namer on <console>]
  655. [running phase packageobjects on <console>]
  656. [running phase typer on <console>]
  657. [running phase patmat on <console>]
  658. [running phase repl on <console>]
  659. [running phase superaccessors on <console>]
  660. [running phase extmethods on <console>]
  661. [running phase pickler on <console>]
  662. [running phase refchecks on <console>]
  663. [running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of lastException to <method> <stable> <accessor> val lastException(): Throwable
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {
lazy val $result = $line5.$read.INSTANCE.$iw.$iw.`lastException`
val $print: String = {
$read.INSTANCE.$iw.$iw
(""

+ "lastException: lang.this.Throwable = " + scala.runtime.ScalaRunTime.replStringOf($line5.$read.INSTANCE.$iw.$iw.`lastException`, 1000)

)
}
}

") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
lazy val $result = $line5.$read.INSTANCE.$iw.$iw.lastException;
val $print: String = {
$read.INSTANCE.$iw.$iw;
"".$plus("lastException: lang.this.Throwable = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line5.$read.INSTANCE.$iw.$iw.lastException, 1000))
}
}))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
lazy val $result = $line5.$read.INSTANCE.$iw.$iw.lastException;
val $print: String = {
$read.INSTANCE.$iw.$iw;
"".+("lastException: lang.this.Throwable = ").+(scala.runtime.ScalaRunTime.replStringOf($line5.$read.INSTANCE.$iw.$iw.lastException, 1000))
}
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line5.$eval.$print()
java.lang.NullPointerException
at $iwC$$iwC.<init>(<console>:8)
at $iwC.<init>(<console>:14)
at <init>(<console>:16)
at .<init>(<console>:20)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:772)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1040)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:609)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:640)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:604)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:788)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:833)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:745)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:119)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:118)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:258)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:118)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:53)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:903)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:140)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:53)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:102)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:53)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:920)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:876)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:876)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:876)
at com.wellmo.reporting.Repl$.run(Repl.scala:31)
at com.wellmo.reporting.WellmoReporting$.run(WellmoReporting.scala:71)
at com.wellmo.reporting.WellmoReporting$$anonfun$main$1.apply(WellmoReporting.scala:49)
at com.wellmo.reporting.WellmoReporting$$anonfun$main$1.apply(WellmoReporting.scala:48)
at scala.Option.map(Option.scala:145)
at com.wellmo.reporting.WellmoReporting$.main(WellmoReporting.scala:48)
at com.wellmo.reporting.WellmoReporting.main(WellmoReporting.scala)

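The trace above shows the REPL invoking a generated `$eval.$print` wrapper via reflection; throughout this log, each interpreted line is nested inside Serializable "import wrapper" classes (the `$iwC` chain) plus a companion object that renders the printed result. As a simplified, hypothetical sketch of that shape (the names `ReadWrapper`, `EvalWrapper` and the String stand-in are illustrative, not Spark's actual generated classes):

```scala
// Sketch of the REPL's generated wrapper shape seen in this log:
// the user's line lives at the bottom of a chain of Serializable
// $iwC-style classes, reached through a singleton INSTANCE, and a
// separate eval object builds the "name: type = value" print string.
class ReadWrapper extends Serializable {
  class OuterIw extends Serializable {
    class InnerIw extends Serializable {
      // stands in for the user's line, e.g. `val sc = $line7.$eval.value`
      val sc: String = "SparkContext stand-in"
    }
    val iw = new InnerIw
  }
  val iw = new OuterIw
}

object ReadWrapper {
  val INSTANCE = new ReadWrapper()
}

object EvalWrapper {
  // mirrors the generated `lazy val $result` / `val $print` pair
  lazy val result: String = ReadWrapper.INSTANCE.iw.iw.sc
  val printed: String = "sc: " + result
}
```

Because the user's value sits inside instance classes, any failure in an enclosing wrapper's constructor surfaces as an NPE at `$iwC$$iwC.<init>`, exactly as in the stack trace above.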
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" import org.apache.spark.SparkContext._
") Some(List(import org.apache.spark.SparkContext._))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 14: Import
31: Select
25: Select
18: Select
14: Ident

14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
class $read extends Serializable {
class $iwC extends Serializable {
class $iwC extends Serializable {
import org.apache.spark.SparkContext._



}
val $iw = new $iwC;
}
val $iw = new $iwC;

}
object $read {
val INSTANCE = new $read();
}

") Some(List(class $read extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
import org.apache.spark.SparkContext._
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
}, object $read extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
val INSTANCE = new $read.<init>()
}))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
import org.apache.spark.SparkContext._
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
}
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
val INSTANCE = new $read.<init>
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {

val $print: String = {
$read.INSTANCE.$iw.$iw
(""

+ "import org.apache.spark.SparkContext._" + "\u000A"


)
}
}

") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
val $print: String = {
$read.INSTANCE.$iw.$iw;
"".$plus("import org.apache.spark.SparkContext._").$plus("\n")
}
}))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
val $print: String = {
$read.INSTANCE.$iw.$iw;
"".+("import org.apache.spark.SparkContext._").+("\n")
}
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line6.$eval.$print()
Spark context available as sc.
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {
var value: org.apache.spark.SparkContext = _
def set(x: Any) = value = x.asInstanceOf[org.apache.spark.SparkContext]
}
") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
<mutable> <defaultinit> var value: org.apache.spark.SparkContext = _;
def set(x: Any) = value = x.asInstanceOf[org.apache.spark.SparkContext]
}))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
<mutable> <defaultinit> var value: org.apache.spark.SparkContext = _;
def set(x: Any) = value = x.asInstanceOf[org.apache.spark.SparkContext]
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static void $line7.$eval.set(java.lang.Object)
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: with args: org.apache.spark.SparkContext@6345a68f
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Interpreting: val sc = $line7.$eval.value
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val sc = $line7.$eval.value
") Some(List(val sc = $line7.$eval.value))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
11: TypeTree
29: Select
23: Select
16: Ident

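The `$line7.$eval` block just above shows how the REPL injects a live object into interpreted code: it compiles a holder with a `var value` and a `def set(x: Any)` cast, calls `set` reflectively with the real SparkContext, then interprets `val sc = $line7.$eval.value`. A minimal sketch of that binding trick, with a hypothetical `Holder` object and a String standing in for the SparkContext:

```scala
// Sketch of the REPL binding trick logged above: an untyped set(Any)
// entry point, an asInstanceOf cast to the target type, and a typed
// read-back through `value`. Holder and the String stand-in are
// illustrative only; Spark generates the real holder per REPL line.
object Holder {
  var value: String = _
  def set(x: Any): Unit = value = x.asInstanceOf[String]
}

object BindDemo {
  def bind(): String = {
    // the REPL would pass the live SparkContext here via reflection
    Holder.set("org.apache.spark.SparkContext@6345a68f (stand-in)")
    Holder.value // what `val sc = $line7.$eval.value` would then see
  }
}
```

The `set(x: Any)` signature matters: reflection hands the object over as a plain `java.lang.Object`, and the generated cast restores the static type on the interpreted side.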
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" def $ires0 = {
org.apache.spark.SparkContext
}
") Some(List(def $ires0 = org.apache.spark.SparkContext))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 11: DefDef
11: TypeTree
46: Select
40: Select
33: Select
29: Ident

14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" def $ires1 = {
org.apache.spark.SparkContext
}
") Some(List(def $ires1 = org.apache.spark.SparkContext))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 11: DefDef
11: TypeTree
46: Select
40: Select
33: Select
29: Ident

14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse(" def $ires2 = {
org.apache.spark.SparkContext
}
") Some(List(def $ires2 = org.apache.spark.SparkContext))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: 11: DefDef
11: TypeTree
46: Select
40: Select
33: Select
29: Ident

14/03/06 11:13:08 DEBUG SparkIMain$exprTyper: Terminating typeOfExpression recursion for expression: org.apache.spark.SparkContext
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
class $read extends Serializable {
class $iwC extends Serializable {
class $iwC extends Serializable {
import org.apache.spark.SparkContext._
class $iwC extends Serializable {
class $iwC extends Serializable {
def $ires2 = {
org.apache.spark.SparkContext
}



}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;

}
object $read {
val INSTANCE = new $read();
}

") Some(List(class $read extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
def $ires2 = org.apache.spark.SparkContext
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
}, object $read extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
val INSTANCE = new $read.<init>()
}))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
def $ires2 = org.apache.spark.SparkContext
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
}
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
val INSTANCE = new $read.<init>
}
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of $ires2 to <method> def $ires2(): org.apache.spark.SparkContext.type
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {
lazy val $result = $line11.$read.INSTANCE.$iw.$iw.$iw.$iw.`$ires2`
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw
(""

+ "$ires2" + ": " + "<root>.this.org.apache.spark.SparkContext.type" + "\u000A"

)
}
}

") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
lazy val $result = $line11.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires2;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".$plus("$ires2").$plus(": ").$plus("<root>.this.org.apache.spark.SparkContext.type").$plus("\n")
}
}))
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
lazy val $result = $line11.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires2;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".+("$ires2").+(": ").+("<root>.this.org.apache.spark.SparkContext.type").+("\n")
}
}
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line11.$eval.$print()
14/03/06 11:13:08 DEBUG SparkILoop$SparkILoopInterpreter: parse("
class $read extends Serializable {
class $iwC extends Serializable {
class $iwC extends Serializable {
import org.apache.spark.SparkContext._
class $iwC extends Serializable {
class $iwC extends Serializable {
def $ires1 = {
org.apache.spark.SparkContext
}



}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;

}
object $read {
val INSTANCE = new $read();
}

") Some(List(class $read extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
def $ires1 = org.apache.spark.SparkContext
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
}, object $read extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
val INSTANCE = new $read.<init>()
}))
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
def $ires1 = org.apache.spark.SparkContext
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
}
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
val INSTANCE = new $read.<init>
}
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of $ires1 to <method> def $ires1(): org.apache.spark.SparkContext.type
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {
lazy val $result = $line10.$read.INSTANCE.$iw.$iw.$iw.$iw.`$ires1`
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw
(""

+ "$ires1" + ": " + "<root>.this.org.apache.spark.SparkContext.type" + "\u000A"

)
}
}

") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
lazy val $result = $line10.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires1;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".$plus("$ires1").$plus(": ").$plus("<root>.this.org.apache.spark.SparkContext.type").$plus("\n")
}
}))
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
lazy val $result = $line10.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires1;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".+("$ires1").+(": ").+("<root>.this.org.apache.spark.SparkContext.type").+("\n")
}
}
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line10.$eval.$print()
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
class $read extends Serializable {
class $iwC extends Serializable {
class $iwC extends Serializable {
import org.apache.spark.SparkContext._
class $iwC extends Serializable {
class $iwC extends Serializable {
def $ires0 = {
org.apache.spark.SparkContext
}



}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;

}
object $read {
val INSTANCE = new $read();
}

") Some(List(class $read extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
def $ires0 = org.apache.spark.SparkContext
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
}, object $read extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
val INSTANCE = new $read.<init>()
}))
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
def $ires0 = org.apache.spark.SparkContext
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
}
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
val INSTANCE = new $read.<init>
}
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of $ires0 to <method> def $ires0(): org.apache.spark.SparkContext.type
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {
lazy val $result = $line9.$read.INSTANCE.$iw.$iw.$iw.$iw.`$ires0`
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw
(""

+ "$ires0" + ": " + "<root>.this.org.apache.spark.SparkContext.type" + "\u000A"

)
}
}

") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
lazy val $result = $line9.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires0;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".$plus("$ires0").$plus(": ").$plus("<root>.this.org.apache.spark.SparkContext.type").$plus("\n")
}
}))
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
lazy val $result = $line9.$read.INSTANCE.$iw.$iw.$iw.$iw.$ires0;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".+("$ires0").+(": ").+("<root>.this.org.apache.spark.SparkContext.type").+("\n")
}
}
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line9.$eval.$print()
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
class $read extends Serializable {
class $iwC extends Serializable {
class $iwC extends Serializable {
import org.apache.spark.SparkContext._
class $iwC extends Serializable {
class $iwC extends Serializable {
val sc = $line7.$eval.value



}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;

}
object $read {
val INSTANCE = new $read();
}

") Some(List(class $read extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
val sc = $line7.$eval.value
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
}, object $read extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
val INSTANCE = new $read.<init>()
}))
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
val sc = $line7.$eval.value
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
}
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
val INSTANCE = new $read.<init>
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of sc to <method> <stable> <accessor> val sc(): org.apache.spark.SparkContext
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {
lazy val $result = $line8.$read.INSTANCE.$iw.$iw.$iw.$iw.`sc`
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw
(""

+ "sc: spark.this.SparkContext = " + scala.runtime.ScalaRunTime.replStringOf($line8.$read.INSTANCE.$iw.$iw.$iw.$iw.`sc`, 1000)

)
}
}

") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
lazy val $result = $line8.$read.INSTANCE.$iw.$iw.$iw.$iw.sc;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".$plus("sc: spark.this.SparkContext = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line8.$read.INSTANCE.$iw.$iw.$iw.$iw.sc, 1000))
}
}))
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
lazy val $result = $line8.$read.INSTANCE.$iw.$iw.$iw.$iw.sc;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".+("sc: spark.this.SparkContext = ").+(scala.runtime.ScalaRunTime.replStringOf($line8.$read.INSTANCE.$iw.$iw.$iw.$iw.sc, 1000))
}
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
  1696. [running phase constructors on <console>]
  1697. [running phase flatten on <console>]
  1698. [running phase mixin on <console>]
  1699. [running phase cleanup on <console>]
  1700. [running phase icode on <console>]
  1701. [running phase inliner on <console>]
  1702. [running phase inlineExceptionHandlers on <console>]
  1703. [running phase closelim on <console>]
  1704. [running phase dce on <console>]
  1705. [running phase jvm on icode]
  1706. 14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line8.$eval.$print()
sc: spark.this.SparkContext = org.apache.spark.SparkContext@6345a68f
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {
var value: com.wellmo.reporting.WellmoData = _
def set(x: Any) = value = x.asInstanceOf[com.wellmo.reporting.WellmoData]
}
") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
<mutable> <defaultinit> var value: com.wellmo.reporting.WellmoData = _;
def set(x: Any) = value = x.asInstanceOf[com.wellmo.reporting.WellmoData]
}))
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
<mutable> <defaultinit> var value: com.wellmo.reporting.WellmoData = _;
def set(x: Any) = value = x.asInstanceOf[com.wellmo.reporting.WellmoData]
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static void $line12.$eval.set(java.lang.Object)
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: with args: com.wellmo.reporting.WellmoData@68362b9b
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Interpreting: val data = $line12.$eval.value
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val data = $line12.$eval.value
") Some(List(val data = $line12.$eval.value))
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
11: TypeTree
32: Select
26: Select
18: Ident

14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
class $read extends Serializable {
class $iwC extends Serializable {
class $iwC extends Serializable {
import org.apache.spark.SparkContext._
class $iwC extends Serializable {
class $iwC extends Serializable {
val data = $line12.$eval.value



}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;

}
object $read {
val INSTANCE = new $read();
}

") Some(List(class $read extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
val data = $line12.$eval.value
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
}, object $read extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
val INSTANCE = new $read.<init>()
}))
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
val data = $line12.$eval.value
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
}
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
val INSTANCE = new $read.<init>
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of data to <method> <stable> <accessor> val data(): com.wellmo.reporting.WellmoData
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {
lazy val $result = $line13.$read.INSTANCE.$iw.$iw.$iw.$iw.`data`
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw
(""

+ "data: reporting.this.WellmoData = " + scala.runtime.ScalaRunTime.replStringOf($line13.$read.INSTANCE.$iw.$iw.$iw.$iw.`data`, 1000)

)
}
}

") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
lazy val $result = $line13.$read.INSTANCE.$iw.$iw.$iw.$iw.data;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".$plus("data: reporting.this.WellmoData = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line13.$read.INSTANCE.$iw.$iw.$iw.$iw.data, 1000))
}
}))
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
lazy val $result = $line13.$read.INSTANCE.$iw.$iw.$iw.$iw.data;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".+("data: reporting.this.WellmoData = ").+(scala.runtime.ScalaRunTime.replStringOf($line13.$read.INSTANCE.$iw.$iw.$iw.$iw.data, 1000))
}
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:09 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line13.$eval.$print()
data: reporting.this.WellmoData = com.wellmo.reporting.WellmoData@68362b9b

scala> data.profile.count()
14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: parse(" data.profile.count()
") Some(List(data.profile.count()))
14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: 25: Apply
20: Select
12: Select
7: Ident

14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val res0 =
data.profile.count()
") Some(List(val res0 = data.profile.count()))
14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
11: TypeTree
50: Apply
45: Select
37: Select
32: Ident

14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: parse("
class $read extends Serializable {
class $iwC extends Serializable {
class $iwC extends Serializable {
import org.apache.spark.SparkContext._
class $iwC extends Serializable {
val $VAL1 = $line13.$read.INSTANCE;
import $VAL1.$iw.$iw.$iw.$iw.`data`;
class $iwC extends Serializable {
val res0 =
data.profile.count()



}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;

}
object $read {
val INSTANCE = new $read();
}

") Some(List(class $read extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
val $VAL1 = $line13.$read.INSTANCE;
import $VAL1.$iw.$iw.$iw.$iw.data;
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
val res0 = data.profile.count()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
}, object $read extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
val INSTANCE = new $read.<init>()
}))
14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
val $VAL1 = $line13.$read.INSTANCE;
import $VAL1.$iw.$iw.$iw.$iw.data;
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
val res0 = data.profile.count
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
}
14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
val INSTANCE = new $read.<init>
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of res0 to <method> <stable> <accessor> val res0(): Long
14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {
lazy val $result = $line14.$read.INSTANCE.$iw.$iw.$iw.$iw.`res0`
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw
(""

+ "res0: scala.this.Long = " + scala.runtime.ScalaRunTime.replStringOf($line14.$read.INSTANCE.$iw.$iw.$iw.$iw.`res0`, 1000)

)
}
}

") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
lazy val $result = $line14.$read.INSTANCE.$iw.$iw.$iw.$iw.res0;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".$plus("res0: scala.this.Long = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line14.$read.INSTANCE.$iw.$iw.$iw.$iw.res0, 1000))
}
}))
14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
lazy val $result = $line14.$read.INSTANCE.$iw.$iw.$iw.$iw.res0;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".+("res0: scala.this.Long = ").+(scala.runtime.ScalaRunTime.replStringOf($line14.$read.INSTANCE.$iw.$iw.$iw.$iw.res0, 1000))
}
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:15 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line14.$eval.$print()
14/03/06 11:13:15 INFO SparkContext: Starting job: count at <console>:13
14/03/06 11:13:15 INFO DAGScheduler: Got job 0 (count at <console>:13) with 1 output partitions (allowLocal=false)
14/03/06 11:13:15 INFO DAGScheduler: Final stage: Stage 0 (count at <console>:13)
14/03/06 11:13:15 INFO DAGScheduler: Parents of final stage: List()
14/03/06 11:13:15 INFO DAGScheduler: Missing parents: List()
14/03/06 11:13:15 DEBUG DAGScheduler: submitStage(Stage 0)
14/03/06 11:13:15 DEBUG DAGScheduler: missing: List()
14/03/06 11:13:15 INFO DAGScheduler: Submitting Stage 0 (FilteredRDD[6] at filter at Mongo.scala:28), which has no missing parents
14/03/06 11:13:15 DEBUG DAGScheduler: submitMissingTasks(Stage 0)
14/03/06 11:13:15 INFO DAGScheduler: Submitting 1 missing tasks from Stage 0 (FilteredRDD[6] at filter at Mongo.scala:28)
14/03/06 11:13:15 DEBUG DAGScheduler: New pending tasks: Set(ResultTask(0, 0))
14/03/06 11:13:15 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
14/03/06 11:13:15 DEBUG TaskSetManager: Epoch for TaskSet 0.0: 0
14/03/06 11:13:15 DEBUG TaskSetManager: Valid locality levels for TaskSet 0.0: ANY
14/03/06 11:13:15 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
14/03/06 11:13:15 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
14/03/06 11:13:15 INFO TaskSetManager: Starting task 0.0:0 as TID 0 on executor localhost: localhost (PROCESS_LOCAL)
14/03/06 11:13:15 INFO TaskSetManager: Serialized task 0.0:0 as 1798 bytes in 5 ms
14/03/06 11:13:15 INFO Executor: Running task ID 0
14/03/06 11:13:15 DEBUG BlockManager: Getting local block broadcast_2
14/03/06 11:13:15 DEBUG BlockManager: Level for block broadcast_2 is StorageLevel(true, true, true, 1)
14/03/06 11:13:15 DEBUG BlockManager: Getting block broadcast_2 from memory
14/03/06 11:13:15 INFO BlockManager: Found block broadcast_2 locally
14/03/06 11:13:15 DEBUG Executor: Task 0's epoch is 0
14/03/06 11:13:15 DEBUG CacheManager: Looking for partition rdd_6_0
14/03/06 11:13:15 DEBUG BlockManager: Getting local block rdd_6_0
14/03/06 11:13:15 DEBUG BlockManager: Block rdd_6_0 not registered locally
14/03/06 11:13:15 DEBUG BlockManager: Getting remote block rdd_6_0
14/03/06 11:13:15 DEBUG BlockManager: Block rdd_6_0 not found
14/03/06 11:13:15 INFO CacheManager: Partition rdd_6_0 not found, computing it
14/03/06 11:13:15 INFO NewHadoopRDD: Input split: MongoInputSplit{URI=mongodb://127.0.0.1:27017/wellness.Profile, authURI=null, min={ }, max={ }, query={ }, sort={ }, fields={ }, notimeout=false}
14/03/06 11:13:15 INFO MongoRecordReader: Read 2.0 documents from:
14/03/06 11:13:15 INFO MongoRecordReader: MongoInputSplit{URI=mongodb://127.0.0.1:27017/wellness.Profile, authURI=null, min={ }, max={ }, query={ }, sort={ }, fields={ }, notimeout=false}
14/03/06 11:13:15 INFO MemoryStore: ensureFreeSpace(536) called with curMem=422572, maxMem=2202324172
14/03/06 11:13:15 INFO MemoryStore: Block rdd_6_0 stored as values to memory (estimated size 536.0 B, free 2.1 GB)
14/03/06 11:13:15 INFO BlockManagerMasterActor$BlockManagerInfo: Added rdd_6_0 in memory on 172.16.168.1:43022 (size: 536.0 B, free: 2.1 GB)
14/03/06 11:13:15 INFO BlockManagerMaster: Updated info of block rdd_6_0
14/03/06 11:13:15 DEBUG BlockManager: Told master about block rdd_6_0
14/03/06 11:13:15 DEBUG BlockManager: Put block rdd_6_0 locally took 4 ms
14/03/06 11:13:15 DEBUG BlockManager: Put for block rdd_6_0 without replication took 5 ms
14/03/06 11:13:15 INFO Executor: Serialized size of result for 0 is 563
14/03/06 11:13:15 INFO Executor: Sending result for 0 directly to driver
14/03/06 11:13:15 INFO Executor: Finished task ID 0
14/03/06 11:13:15 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
14/03/06 11:13:15 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
14/03/06 11:13:15 INFO TaskSetManager: Finished TID 0 in 69 ms on localhost (progress: 0/1)
14/03/06 11:13:15 INFO TaskSchedulerImpl: Remove TaskSet 0.0 from pool
14/03/06 11:13:15 INFO DAGScheduler: Completed ResultTask(0, 0)
14/03/06 11:13:15 INFO DAGScheduler: Stage 0 (count at <console>:13) finished in 0.075 s
14/03/06 11:13:15 DEBUG DAGScheduler: After removal of stage 0, remaining stages = 0
14/03/06 11:13:15 INFO SparkContext: Job finished: count at <console>:13, took 0.160291812 s
res0: scala.this.Long = 1

scala> data.profile.filter(p => p.email == "test@example.org").count()
14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: parse(" data.profile.filter(p => p.email == "test@example.org").count()
") Some(List(data.profile.filter(((p) => p.email.$eq$eq("test@example.org"))).count()))
14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: 68: Apply
63: Select
26: Apply
20: Select
12: Select
7: Ident
29: Function
27: ValDef
27: TypeTree
-1: EmptyTree
40: Apply
40: Select
34: Select
32: Ident
43: Literal

14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val res1 =
data.profile.filter(p => p.email == "test@example.org").count()
") Some(List(val res1 = data.profile.filter(((p) => p.email.$eq$eq("test@example.org"))).count()))
14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
11: TypeTree
93: Apply
88: Select
51: Apply
45: Select
37: Select
32: Ident
54: Function
52: ValDef
52: TypeTree
-1: EmptyTree
65: Apply
65: Select
59: Select
57: Ident
68: Literal

14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: parse("
class $read extends Serializable {
class $iwC extends Serializable {
class $iwC extends Serializable {
import org.apache.spark.SparkContext._
class $iwC extends Serializable {
val $VAL2 = $line13.$read.INSTANCE;
import $VAL2.$iw.$iw.$iw.$iw.`data`;
class $iwC extends Serializable {
val res1 =
data.profile.filter(p => p.email == "test@example.org").count()



}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;
}
val $iw = new $iwC;

}
object $read {
val INSTANCE = new $read();
}

") Some(List(class $read extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
val $VAL2 = $line13.$read.INSTANCE;
import $VAL2.$iw.$iw.$iw.$iw.data;
class $iwC extends Serializable {
def <init>() = {
super.<init>();
()
};
val res1 = data.profile.filter(((p) => p.email.$eq$eq("test@example.org"))).count()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
};
val $iw = new $iwC.<init>()
}, object $read extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
val INSTANCE = new $read.<init>()
}))
14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
import org.apache.spark.SparkContext._;
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
val $VAL2 = $line13.$read.INSTANCE;
import $VAL2.$iw.$iw.$iw.$iw.data;
class $iwC extends Serializable {
def <init>() = {
super.<init>;
()
};
val res1 = data.profile.filter(((p) => p.email.==("test@example.org"))).count
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
};
val $iw = new $iwC.<init>
}
14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
val INSTANCE = new $read.<init>
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of res1 to <method> <stable> <accessor> val res1(): Long
14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: parse("
object $eval {
lazy val $result = $line15.$read.INSTANCE.$iw.$iw.$iw.$iw.`res1`
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw
(""

+ "res1: scala.this.Long = " + scala.runtime.ScalaRunTime.replStringOf($line15.$read.INSTANCE.$iw.$iw.$iw.$iw.`res1`, 1000)

)
}
}

") Some(List(object $eval extends scala.AnyRef {
def <init>() = {
super.<init>();
()
};
lazy val $result = $line15.$read.INSTANCE.$iw.$iw.$iw.$iw.res1;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".$plus("res1: scala.this.Long = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line15.$read.INSTANCE.$iw.$iw.$iw.$iw.res1, 1000))
}
}))
14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
def <init>() = {
super.<init>;
()
};
lazy val $result = $line15.$read.INSTANCE.$iw.$iw.$iw.$iw.res1;
val $print: String = {
$read.INSTANCE.$iw.$iw.$iw.$iw;
"".+("res1: scala.this.Long = ").+(scala.runtime.ScalaRunTime.replStringOf($line15.$read.INSTANCE.$iw.$iw.$iw.$iw.res1, 1000))
}
}
[running phase parser on <console>]
[running phase namer on <console>]
[running phase packageobjects on <console>]
[running phase typer on <console>]
[running phase patmat on <console>]
[running phase repl on <console>]
[running phase superaccessors on <console>]
[running phase extmethods on <console>]
[running phase pickler on <console>]
[running phase refchecks on <console>]
[running phase uncurry on <console>]
[running phase tailcalls on <console>]
[running phase specialize on <console>]
[running phase explicitouter on <console>]
[running phase erasure on <console>]
[running phase posterasure on <console>]
[running phase lazyvals on <console>]
[running phase lambdalift on <console>]
[running phase constructors on <console>]
[running phase flatten on <console>]
[running phase mixin on <console>]
[running phase cleanup on <console>]
[running phase icode on <console>]
[running phase inliner on <console>]
[running phase inlineExceptionHandlers on <console>]
[running phase closelim on <console>]
[running phase dce on <console>]
[running phase jvm on icode]
  2503. 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line15.$eval.$print()
  2504. 14/03/06 11:13:21 INFO SparkContext: Starting job: count at <console>:13
  2505. 14/03/06 11:13:21 INFO DAGScheduler: Got job 1 (count at <console>:13) with 1 output partitions (allowLocal=false)
  2506. 14/03/06 11:13:21 INFO DAGScheduler: Final stage: Stage 1 (count at <console>:13)
  2507. 14/03/06 11:13:21 INFO DAGScheduler: Parents of final stage: List()
  2508. 14/03/06 11:13:21 INFO DAGScheduler: Missing parents: List()
  2509. 14/03/06 11:13:21 DEBUG DAGScheduler: submitStage(Stage 1)
  2510. 14/03/06 11:13:21 DEBUG DAGScheduler: missing: List()
  2511. 14/03/06 11:13:21 INFO DAGScheduler: Submitting Stage 1 (FilteredRDD[32] at filter at <console>:13), which has no missing parents
  2512. 14/03/06 11:13:21 DEBUG DAGScheduler: submitMissingTasks(Stage 1)
  2513. 14/03/06 11:13:21 INFO DAGScheduler: Submitting 1 missing tasks from Stage 1 (FilteredRDD[32] at filter at <console>:13)
  2514. 14/03/06 11:13:21 DEBUG DAGScheduler: New pending tasks: Set(ResultTask(1, 0))
  2515. 14/03/06 11:13:21 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
  2516. 14/03/06 11:13:21 DEBUG TaskSetManager: Epoch for TaskSet 1.0: 0
  2517. 14/03/06 11:13:21 DEBUG TaskSetManager: Valid locality levels for TaskSet 1.0: ANY
  2518. 14/03/06 11:13:21 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
  2519. 14/03/06 11:13:21 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
  2520. 14/03/06 11:13:21 INFO TaskSetManager: Starting task 1.0:0 as TID 1 on executor localhost: localhost (PROCESS_LOCAL)
  2521. 14/03/06 11:13:21 INFO TaskSetManager: Serialized task 1.0:0 as 1833 bytes in 1 ms
  2522. 14/03/06 11:13:21 INFO Executor: Running task ID 1
  2523. 14/03/06 11:13:21 DEBUG BlockManager: Getting local block broadcast_2
  2524. 14/03/06 11:13:21 DEBUG BlockManager: Level for block broadcast_2 is StorageLevel(true, true, true, 1)
  2525. 14/03/06 11:13:21 DEBUG BlockManager: Getting block broadcast_2 from memory
  2526. 14/03/06 11:13:21 INFO BlockManager: Found block broadcast_2 locally
  2527. 14/03/06 11:13:21 ERROR Executor: Exception in task ID 1
  2528. java.lang.ClassNotFoundException: $line15.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1
  2529. at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
  2530. at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
  2531. at java.security.AccessController.doPrivileged(Native Method)
  2532. at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
  2533. at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
  2534. at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
  2535. at java.lang.Class.forName0(Native Method)
  2536. at java.lang.Class.forName(Class.java:270)
  2537. at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:37)
  2538. at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
  2539. at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
  2540. at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
  2541. at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
  2542. at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
  2543. at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
  2544. at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
  2545. at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
  2546. at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
  2547. at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
  2548. at org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:63)
  2549. at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:139)
  2550. at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1837)
  2551. at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
  2552. at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
  2553. at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
  2554. at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
  2555. at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:62)
  2556. at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:195)
  2557. at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:49)
  2558. at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
  2559. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
  2560. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  2561. at java.lang.Thread.run(Thread.java:744)
  2562. 14/03/06 11:13:21 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
  2563. 14/03/06 11:13:21 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
  2564. 14/03/06 11:13:21 WARN TaskSetManager: Lost TID 1 (task 1.0:0)
  2565. 14/03/06 11:13:21 WARN TaskSetManager: Loss was due to java.lang.ClassNotFoundException
  2566. java.lang.ClassNotFoundException: $line15.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1
  2567. at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
  2568. at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
  2569. at java.security.AccessController.doPrivileged(Native Method)
  2570. at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
  2571. at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
  2572. at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
  2573. at java.lang.Class.forName0(Native Method)
  2574. at java.lang.Class.forName(Class.java:270)
  2575. at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:37)
  2576. at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
  2577. at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
  2578. at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
  2579. at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
  2580. at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
  2581. at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
  2582. at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
  2583. at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
  2584. at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
  2585. at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
  2586. at org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:63)
  2587. at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:139)
  2588. at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1837)
  2589. at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
  2590. at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
  2591. at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
  2592. at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
  2593. at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:62)
  2594. at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:195)
  2595. at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:49)
  2596. at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
  2597. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
  2598. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  2599. at java.lang.Thread.run(Thread.java:744)
  2600. 14/03/06 11:13:21 ERROR TaskSetManager: Task 1.0:0 failed 1 times; aborting job
  2601. 14/03/06 11:13:21 INFO TaskSchedulerImpl: Remove TaskSet 1.0 from pool
  2602. 14/03/06 11:13:21 DEBUG DAGScheduler: Removing running stage 1
  2603. 14/03/06 11:13:21 INFO DAGScheduler: Failed to run count at <console>:13
  2604. 14/03/06 11:13:21 DEBUG DAGScheduler: Removing pending status for stage 1
  2605. 14/03/06 11:13:21 DEBUG DAGScheduler: After removal of stage 1, remaining stages = 0
  2606. 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  2607. object $eval {
  2608. var value: java.lang.Throwable = _
  2609. def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
  2610. }
  2611. ") Some(List(object $eval extends scala.AnyRef {
  2612. def <init>() = {
  2613. super.<init>();
  2614. ()
  2615. };
  2616. <mutable> <defaultinit> var value: java.lang.Throwable = _;
  2617. def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
  2618. }))
  2619. 14/03/06 11:13:21 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
  2620. def <init>() = {
  2621. super.<init>;
  2622. ()
  2623. };
  2624. <mutable> <defaultinit> var value: java.lang.Throwable = _;
  2625. def set(x: Any) = value = x.asInstanceOf[java.lang.Throwable]
  2626. }
  2627. [running phase parser on <console>]
  2628. [running phase namer on <console>]
  2629. [running phase packageobjects on <console>]
  2630. [running phase typer on <console>]
  2631. [running phase patmat on <console>]
  2632. [running phase repl on <console>]
  2633. [running phase superaccessors on <console>]
  2634. [running phase extmethods on <console>]
  2635. [running phase pickler on <console>]
  2636. [running phase refchecks on <console>]
  2637. [running phase uncurry on <console>]
  2638. [running phase tailcalls on <console>]
  2639. [running phase specialize on <console>]
  2640. [running phase explicitouter on <console>]
  2641. [running phase erasure on <console>]
  2642. [running phase posterasure on <console>]
  2643. [running phase lazyvals on <console>]
  2644. [running phase lambdalift on <console>]
  2645. [running phase constructors on <console>]
  2646. [running phase flatten on <console>]
  2647. [running phase mixin on <console>]
  2648. [running phase cleanup on <console>]
  2649. [running phase icode on <console>]
  2650. [running phase inliner on <console>]
  2651. [running phase inlineExceptionHandlers on <console>]
  2652. [running phase closelim on <console>]
  2653. [running phase dce on <console>]
  2654. [running phase jvm on icode]
  2655. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static void $line16.$eval.set(java.lang.Object)
  2656. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: with args: org.apache.spark.SparkException: Job aborted: Task 1.0:0 failed 1 times (most recent failure: Exception failure: java.lang.ClassNotFoundException: $line15.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1)
  2657. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: Interpreting: val lastException = $line16.$eval.value
  2658. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: parse(" val lastException = $line16.$eval.value
  2659. ") Some(List(val lastException = $line16.$eval.value))
  2660. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: 11: ValDef
  2661. 11: TypeTree
  2662. 41: Select
  2663. 35: Select
  2664. 27: Ident
  2665.  
  2666. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  2667. class $read extends Serializable {
  2668. class $iwC extends Serializable {
  2669. class $iwC extends Serializable {
  2670. import org.apache.spark.SparkContext._
  2671. class $iwC extends Serializable {
  2672. class $iwC extends Serializable {
  2673. val lastException = $line16.$eval.value
  2674.  
  2675.  
  2676.  
  2677. }
  2678. val $iw = new $iwC;
  2679. }
  2680. val $iw = new $iwC;
  2681. }
  2682. val $iw = new $iwC;
  2683. }
  2684. val $iw = new $iwC;
  2685.  
  2686. }
  2687. object $read {
  2688. val INSTANCE = new $read();
  2689. }
  2690.  
  2691. ") Some(List(class $read extends Serializable {
  2692. def <init>() = {
  2693. super.<init>();
  2694. ()
  2695. };
  2696. class $iwC extends Serializable {
  2697. def <init>() = {
  2698. super.<init>();
  2699. ()
  2700. };
  2701. class $iwC extends Serializable {
  2702. def <init>() = {
  2703. super.<init>();
  2704. ()
  2705. };
  2706. import org.apache.spark.SparkContext._;
  2707. class $iwC extends Serializable {
  2708. def <init>() = {
  2709. super.<init>();
  2710. ()
  2711. };
  2712. class $iwC extends Serializable {
  2713. def <init>() = {
  2714. super.<init>();
  2715. ()
  2716. };
  2717. val lastException = $line16.$eval.value
  2718. };
  2719. val $iw = new $iwC.<init>()
  2720. };
  2721. val $iw = new $iwC.<init>()
  2722. };
  2723. val $iw = new $iwC.<init>()
  2724. };
  2725. val $iw = new $iwC.<init>()
  2726. }, object $read extends scala.AnyRef {
  2727. def <init>() = {
  2728. super.<init>();
  2729. ()
  2730. };
  2731. val INSTANCE = new $read.<init>()
  2732. }))
  2733. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: class $read extends Serializable {
  2734. def <init>() = {
  2735. super.<init>;
  2736. ()
  2737. };
  2738. class $iwC extends Serializable {
  2739. def <init>() = {
  2740. super.<init>;
  2741. ()
  2742. };
  2743. class $iwC extends Serializable {
  2744. def <init>() = {
  2745. super.<init>;
  2746. ()
  2747. };
  2748. import org.apache.spark.SparkContext._;
  2749. class $iwC extends Serializable {
  2750. def <init>() = {
  2751. super.<init>;
  2752. ()
  2753. };
  2754. class $iwC extends Serializable {
  2755. def <init>() = {
  2756. super.<init>;
  2757. ()
  2758. };
  2759. val lastException = $line16.$eval.value
  2760. };
  2761. val $iw = new $iwC.<init>
  2762. };
  2763. val $iw = new $iwC.<init>
  2764. };
  2765. val $iw = new $iwC.<init>
  2766. };
  2767. val $iw = new $iwC.<init>
  2768. }
  2769. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: object $read extends scala.AnyRef {
  2770. def <init>() = {
  2771. super.<init>;
  2772. ()
  2773. };
  2774. val INSTANCE = new $read.<init>
  2775. }
  2776. [running phase parser on <console>]
  2777. [running phase namer on <console>]
  2778. [running phase packageobjects on <console>]
  2779. [running phase typer on <console>]
  2780. [running phase patmat on <console>]
  2781. [running phase repl on <console>]
  2782. [running phase superaccessors on <console>]
  2783. [running phase extmethods on <console>]
  2784. [running phase pickler on <console>]
  2785. [running phase refchecks on <console>]
  2786. [running phase uncurry on <console>]
  2787. [running phase tailcalls on <console>]
  2788. [running phase specialize on <console>]
  2789. [running phase explicitouter on <console>]
  2790. [running phase erasure on <console>]
  2791. [running phase posterasure on <console>]
  2792. [running phase lazyvals on <console>]
  2793. [running phase lambdalift on <console>]
  2794. [running phase constructors on <console>]
  2795. [running phase flatten on <console>]
  2796. [running phase mixin on <console>]
  2797. [running phase cleanup on <console>]
  2798. [running phase icode on <console>]
  2799. [running phase inliner on <console>]
  2800. [running phase inlineExceptionHandlers on <console>]
  2801. [running phase closelim on <console>]
  2802. [running phase dce on <console>]
  2803. [running phase jvm on icode]
  2804. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: Set symbol of lastException to <method> <stable> <accessor> val lastException(): Throwable
  2805. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: parse("
  2806. object $eval {
  2807. lazy val $result = $line17.$read.INSTANCE.$iw.$iw.$iw.$iw.`lastException`
  2808. val $print: String = {
  2809. $read.INSTANCE.$iw.$iw.$iw.$iw
  2810. (""
  2811.  
  2812. + "lastException: lang.this.Throwable = " + scala.runtime.ScalaRunTime.replStringOf($line17.$read.INSTANCE.$iw.$iw.$iw.$iw.`lastException`, 1000)
  2813.  
  2814. )
  2815. }
  2816. }
  2817.  
  2818. ") Some(List(object $eval extends scala.AnyRef {
  2819. def <init>() = {
  2820. super.<init>();
  2821. ()
  2822. };
  2823. lazy val $result = $line17.$read.INSTANCE.$iw.$iw.$iw.$iw.lastException;
  2824. val $print: String = {
  2825. $read.INSTANCE.$iw.$iw.$iw.$iw;
  2826. "".$plus("lastException: lang.this.Throwable = ").$plus(scala.runtime.ScalaRunTime.replStringOf($line17.$read.INSTANCE.$iw.$iw.$iw.$iw.lastException, 1000))
  2827. }
  2828. }))
  2829. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: object $eval extends scala.AnyRef {
  2830. def <init>() = {
  2831. super.<init>;
  2832. ()
  2833. };
  2834. lazy val $result = $line17.$read.INSTANCE.$iw.$iw.$iw.$iw.lastException;
  2835. val $print: String = {
  2836. $read.INSTANCE.$iw.$iw.$iw.$iw;
  2837. "".+("lastException: lang.this.Throwable = ").+(scala.runtime.ScalaRunTime.replStringOf($line17.$read.INSTANCE.$iw.$iw.$iw.$iw.lastException, 1000))
  2838. }
  2839. }
  2840. [running phase parser on <console>]
  2841. [running phase namer on <console>]
  2842. [running phase packageobjects on <console>]
  2843. [running phase typer on <console>]
  2844. [running phase patmat on <console>]
  2845. [running phase repl on <console>]
  2846. [running phase superaccessors on <console>]
  2847. [running phase extmethods on <console>]
  2848. [running phase pickler on <console>]
  2849. [running phase refchecks on <console>]
  2850. [running phase uncurry on <console>]
  2851. [running phase tailcalls on <console>]
  2852. [running phase specialize on <console>]
  2853. [running phase explicitouter on <console>]
  2854. [running phase erasure on <console>]
  2855. [running phase posterasure on <console>]
  2856. [running phase lazyvals on <console>]
  2857. [running phase lambdalift on <console>]
  2858. [running phase constructors on <console>]
  2859. [running phase flatten on <console>]
  2860. [running phase mixin on <console>]
  2861. [running phase cleanup on <console>]
  2862. [running phase icode on <console>]
  2863. [running phase inliner on <console>]
  2864. [running phase inlineExceptionHandlers on <console>]
  2865. [running phase closelim on <console>]
  2866. [running phase dce on <console>]
  2867. [running phase jvm on icode]
  2868. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line17.$eval.$print()
  2869. 14/03/06 11:13:22 DEBUG SparkILoop$SparkILoopInterpreter: Redefining term 'lastException'
  2870. lang.this.Throwable -> lang.this.Throwable
  2871. org.apache.spark.SparkException: Job aborted: Task 1.0:0 failed 1 times (most recent failure: Exception failure: java.lang.ClassNotFoundException: $iwC$$iwC$$iwC$$iwC$$anonfun$1)
  2872. at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1028)
  2873. at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1026)
  2874. at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  2875. at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
  2876. at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1026)
  2877. at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:619)
  2878. at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:619)
  2879. at scala.Option.foreach(Option.scala:236)
  2880. at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:619)
  2881. at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:207)
  2882. at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
  2883. at akka.actor.ActorCell.invoke(ActorCell.scala:456)
  2884. at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
  2885. at akka.dispatch.Mailbox.run(Mailbox.scala:219)
  2886. at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
  2887. at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
  2888. at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
  2889. at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
  2890. at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
  2891.  
  2892.  
  2893. scala>