strace

[root@ip-172-31-10-160 lda]$ ~/bin/sbt run
[info] Loading project definition from /root/lda/lda/project/project
[info] Loading project definition from /root/lda/lda/project
[info] Set current project to org/lda (in build file:/root/lda/lda/)
[info] Running Main
14/05/04 01:55:44 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/05/04 01:55:44 INFO Remoting: Starting remoting
14/05/04 01:55:44 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:51789]
14/05/04 01:55:44 INFO Remoting: Remoting now listens on addresses: [akka.tcp://[email protected]:51789]
14/05/04 01:55:44 INFO spark.SparkEnv: Registering BlockManagerMaster
14/05/04 01:55:44 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20140504015544-5752
14/05/04 01:55:44 INFO storage.MemoryStore: MemoryStore started with capacity 819.3 MB.
14/05/04 01:55:44 INFO network.ConnectionManager: Bound socket to port 53610 with id = ConnectionManagerId(ip-172-31-10-160.ec2.internal,53610)
14/05/04 01:55:44 INFO storage.BlockManagerMaster: Trying to register BlockManager
14/05/04 01:55:44 INFO storage.BlockManagerMasterActor$BlockManagerInfo: Registering block manager ip-172-31-10-160.ec2.internal:53610 with 819.3 MB RAM
14/05/04 01:55:44 INFO storage.BlockManagerMaster: Registered BlockManager
14/05/04 01:55:44 INFO spark.HttpServer: Starting HTTP Server
14/05/04 01:55:44 INFO server.Server: jetty-7.6.8.v20121106
14/05/04 01:55:44 INFO server.AbstractConnector: Started [email protected]:33786
14/05/04 01:55:44 INFO broadcast.HttpBroadcast: Broadcast server started at http://172.31.10.160:33786
14/05/04 01:55:44 INFO spark.SparkEnv: Registering MapOutputTracker
14/05/04 01:55:44 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-1d1a3ac0-4d69-4c06-bf10-d218437d4240
14/05/04 01:55:44 INFO spark.HttpServer: Starting HTTP Server
14/05/04 01:55:44 INFO server.Server: jetty-7.6.8.v20121106
14/05/04 01:55:44 INFO server.AbstractConnector: Started [email protected]:54526
14/05/04 01:55:45 INFO server.Server: jetty-7.6.8.v20121106
14/05/04 01:55:45 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/storage/rdd,null}
14/05/04 01:55:45 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/storage,null}
14/05/04 01:55:45 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages/stage,null}
14/05/04 01:55:45 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages/pool,null}
14/05/04 01:55:45 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/stages,null}
14/05/04 01:55:45 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/environment,null}
14/05/04 01:55:45 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/executors,null}
14/05/04 01:55:45 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/metrics/json,null}
14/05/04 01:55:45 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/static,null}
14/05/04 01:55:45 INFO handler.ContextHandler: started o.e.j.s.h.ContextHandler{/,null}
14/05/04 01:55:45 INFO server.AbstractConnector: Started [email protected]:4040
14/05/04 01:55:45 INFO ui.SparkUI: Started Spark Web UI at http://ip-172-31-10-160.ec2.internal:4040
14/05/04 01:55:45 INFO spark.SparkContext: Added JAR target/scala-2.10/org-lda_2.10-1.0.jar at http://172.31.10.160:54526/jars/org-lda_2.10-1.0.jar with timestamp 1399168545528
14/05/04 01:55:45 INFO client.AppClient$ClientActor: Connecting to master spark://ec2-54-86-18-95.compute-1.amazonaws.com:7077...
14/05/04 01:55:46 INFO storage.MemoryStore: ensureFreeSpace(32856) called with curMem=0, maxMem=859098316
14/05/04 01:55:46 INFO storage.MemoryStore: Block broadcast_0 stored as values to memory (estimated size 32.1 KB, free 819.3 MB)
14/05/04 01:55:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/05/04 01:55:46 WARN snappy.LoadSnappy: Snappy native library not loaded
14/05/04 01:55:46 INFO mapred.FileInputFormat: Total input paths to process : 1
14/05/04 01:55:46 INFO spark.SparkContext: Starting job: count at LatentDirichletAllocation.scala:38
14/05/04 01:55:46 INFO scheduler.DAGScheduler: Got job 0 (count at LatentDirichletAllocation.scala:38) with 1 output partitions (allowLocal=false)
14/05/04 01:55:46 INFO scheduler.DAGScheduler: Final stage: Stage 0 (count at LatentDirichletAllocation.scala:38)
14/05/04 01:55:46 INFO scheduler.DAGScheduler: Parents of final stage: List()
14/05/04 01:55:46 INFO scheduler.DAGScheduler: Missing parents: List()
14/05/04 01:55:46 INFO scheduler.DAGScheduler: Submitting Stage 0 (MappedRDD[1] at textFile at LatentDirichletAllocation.scala:37), which has no missing parents
14/05/04 01:55:46 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from Stage 0 (MappedRDD[1] at textFile at LatentDirichletAllocation.scala:37)
14/05/04 01:55:46 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
14/05/04 01:56:01 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
14/05/04 01:56:05 INFO client.AppClient$ClientActor: Connecting to master spark://ec2-54-86-18-95.compute-1.amazonaws.com:7077...
14/05/04 01:56:16 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
14/05/04 01:56:25 INFO client.AppClient$ClientActor: Connecting to master spark://ec2-54-86-18-95.compute-1.amazonaws.com:7077...
14/05/04 01:56:31 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
14/05/04 01:56:45 ERROR client.AppClient$ClientActor: All masters are unresponsive! Giving up.
14/05/04 01:56:45 ERROR cluster.SparkDeploySchedulerBackend: Spark cluster looks dead, giving up.
14/05/04 01:56:45 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
14/05/04 01:56:45 INFO scheduler.DAGScheduler: Failed to run count at LatentDirichletAllocation.scala:38
[error] (run-main-0) org.apache.spark.SparkException: Job aborted: Spark cluster looks down
org.apache.spark.SparkException: Job aborted: Spark cluster looks down
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1020)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1018)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1018)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:604)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:190)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
        at akka.actor.ActorCell.invoke(ActorCell.scala:456)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
        at akka.dispatch.Mailbox.run(Mailbox.scala:219)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[trace] Stack trace suppressed: run last compile:run for the full output.
14/05/04 01:56:45 INFO network.ConnectionManager: Selector thread was interrupted!
java.lang.RuntimeException: Nonzero exit code: 1
        at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 64 s, completed May 4, 2014 1:56:46 AM
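
For context, the two call sites named in the log, "textFile at LatentDirichletAllocation.scala:37" and "count at LatentDirichletAllocation.scala:38", correspond to a driver program along the lines of the sketch below. This is a minimal reconstruction, not the actual source: the entry point name (Main), the master URL, and the application JAR path are taken from the log, while the object layout, the input path, and all variable names are assumptions.

import org.apache.spark.SparkContext

// Hypothetical reconstruction of the driver behind the log above (Spark 0.9-style API).
object Main {
  def main(args: Array[String]): Unit = LatentDirichletAllocation.run()
}

object LatentDirichletAllocation {
  def run(): Unit = {
    // Master URL and application JAR as they appear in the log; SPARK_HOME from the environment.
    val master = "spark://ec2-54-86-18-95.compute-1.amazonaws.com:7077"
    val appJar = "target/scala-2.10/org-lda_2.10-1.0.jar"
    val sc = new SparkContext(master, "lda", System.getenv("SPARK_HOME"), Seq(appJar))

    // textFile (LatentDirichletAllocation.scala:37 in the log) only defines MappedRDD[1];
    // no work runs yet.
    val corpus = sc.textFile("/path/to/corpus")  // hypothetical input path

    // count (LatentDirichletAllocation.scala:38 in the log) is the first action: it submits
    // Stage 0, which then waits for executor resources that never arrive before the
    // AppClient gives up on the master.
    println(corpus.count())

    sc.stop()
  }
}

The failure is not in this code path itself. The repeated "Connecting to master spark://ec2-54-86-18-95.compute-1.amazonaws.com:7077..." attempts ending in "All masters are unresponsive! Giving up." mean the driver never got an acknowledgement from the standalone master, so the first action was aborted with "Job aborted: Spark cluster looks down". Typical causes include a master process that is not actually running on that host and port, a security group or firewall blocking port 7077 from the driver machine, a master URL that does not exactly match the one the master advertises in its own log, or a Spark version mismatch between the application and the cluster.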