- 10-Mar-2016 20:08:39,067 INFO org.apache.spark.SparkContext:58 - Running Spark version 1.6.0
- 10-Mar-2016 20:08:39,268 WARN org.apache.hadoop.util.NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- 10-Mar-2016 20:08:39,442 WARN org.apache.spark.util.Utils:70 - Your hostname, shams-machine resolves to a loopback address: 127.0.1.1; using 192.168.2.103 instead (on interface eth0)
- 10-Mar-2016 20:08:39,443 WARN org.apache.spark.util.Utils:70 - Set SPARK_LOCAL_IP if you need to bind to another address
- 10-Mar-2016 20:08:39,459 INFO org.apache.spark.SecurityManager:58 - Changing view acls to: shams
- 10-Mar-2016 20:08:39,459 INFO org.apache.spark.SecurityManager:58 - Changing modify acls to: shams
- 10-Mar-2016 20:08:39,460 INFO org.apache.spark.SecurityManager:58 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(shams); users with modify permissions: Set(shams)
- 10-Mar-2016 20:08:39,845 INFO org.apache.spark.util.Utils:58 - Successfully started service 'sparkDriver' on port 41856.
- 10-Mar-2016 20:08:40,103 INFO akka.event.slf4j.Slf4jLogger:80 - Slf4jLogger started
- 10-Mar-2016 20:08:40,138 INFO Remoting:74 - Starting remoting
- 10-Mar-2016 20:08:40,271 INFO Remoting:74 - Remoting started; listening on addresses :[akka.tcp://[email protected]:55750]
- 10-Mar-2016 20:08:40,277 INFO org.apache.spark.util.Utils:58 - Successfully started service 'sparkDriverActorSystem' on port 55750.
- 10-Mar-2016 20:08:40,289 INFO org.apache.spark.SparkEnv:58 - Registering MapOutputTracker
- 10-Mar-2016 20:08:40,307 INFO org.apache.spark.SparkEnv:58 - Registering BlockManagerMaster
- 10-Mar-2016 20:08:40,322 INFO org.apache.spark.storage.DiskBlockManager:58 - Created local directory at /tmp/blockmgr-ed1c2b65-34ad-49ac-a032-4766c0bd34d2
- 10-Mar-2016 20:08:40,330 INFO org.apache.spark.storage.MemoryStore:58 - MemoryStore started with capacity 1087.1 MB
- 10-Mar-2016 20:08:40,393 INFO org.apache.spark.SparkEnv:58 - Registering OutputCommitCoordinator
- 10-Mar-2016 20:08:40,713 INFO org.spark-project.jetty.server.Server:272 - jetty-8.y.z-SNAPSHOT
- 10-Mar-2016 20:08:40,754 INFO org.spark-project.jetty.server.AbstractConnector:338 - Started [email protected]:4040
- 10-Mar-2016 20:08:40,755 INFO org.apache.spark.util.Utils:58 - Successfully started service 'SparkUI' on port 4040.
- 10-Mar-2016 20:08:40,757 INFO org.apache.spark.ui.SparkUI:58 - Started SparkUI at http://192.168.2.103:4040
- 10-Mar-2016 20:08:40,859 INFO org.apache.spark.deploy.client.AppClient$ClientEndpoint:58 - Connecting to master spark://shams-machine:7077...
- 10-Mar-2016 20:08:41,162 INFO org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend:58 - Connected to Spark cluster with app ID app-20160310200841-0000
- 10-Mar-2016 20:08:41,169 INFO org.apache.spark.util.Utils:58 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 47151.
- 10-Mar-2016 20:08:41,170 INFO org.apache.spark.network.netty.NettyBlockTransferService:58 - Server created on 47151
- 10-Mar-2016 20:08:41,171 INFO org.apache.spark.storage.BlockManagerMaster:58 - Trying to register BlockManager
- 10-Mar-2016 20:08:41,175 INFO org.apache.spark.storage.BlockManagerMasterEndpoint:58 - Registering block manager 192.168.2.103:47151 with 1087.1 MB RAM, BlockManagerId(driver, 192.168.2.103, 47151)
- 10-Mar-2016 20:08:41,177 INFO org.apache.spark.storage.BlockManagerMaster:58 - Registered BlockManager
- 10-Mar-2016 20:08:41,360 INFO org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend:58 - SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
- 10-Mar-2016 20:08:41,852 INFO org.apache.spark.SparkContext:58 - Starting job: start at EmailCombinerRealtime.java:162
- 10-Mar-2016 20:08:41,874 INFO org.apache.spark.scheduler.DAGScheduler:58 - Registering RDD 1 (start at EmailCombinerRealtime.java:162)
- 10-Mar-2016 20:08:41,876 INFO org.apache.spark.scheduler.DAGScheduler:58 - Got job 0 (start at EmailCombinerRealtime.java:162) with 20 output partitions
- 10-Mar-2016 20:08:41,877 INFO org.apache.spark.scheduler.DAGScheduler:58 - Final stage: ResultStage 1 (start at EmailCombinerRealtime.java:162)
- 10-Mar-2016 20:08:41,877 INFO org.apache.spark.scheduler.DAGScheduler:58 - Parents of final stage: List(ShuffleMapStage 0)
- 10-Mar-2016 20:08:41,879 INFO org.apache.spark.scheduler.DAGScheduler:58 - Missing parents: List(ShuffleMapStage 0)
- 10-Mar-2016 20:08:41,890 INFO org.apache.spark.scheduler.DAGScheduler:58 - Submitting ShuffleMapStage 0 (MapPartitionsRDD[1] at start at EmailCombinerRealtime.java:162), which has no missing parents
- 10-Mar-2016 20:08:42,024 INFO org.apache.spark.storage.MemoryStore:58 - Block broadcast_0 stored as values in memory (estimated size 2.7 KB, free 2.7 KB)
- 10-Mar-2016 20:08:42,041 INFO org.apache.spark.storage.MemoryStore:58 - Block broadcast_0_piece0 stored as bytes in memory (estimated size 1704.0 B, free 4.4 KB)
- 10-Mar-2016 20:08:42,043 INFO org.apache.spark.storage.BlockManagerInfo:58 - Added broadcast_0_piece0 in memory on 192.168.2.103:47151 (size: 1704.0 B, free: 1087.1 MB)
- 10-Mar-2016 20:08:42,046 INFO org.apache.spark.SparkContext:58 - Created broadcast 0 from broadcast at DAGScheduler.scala:1006
- 10-Mar-2016 20:08:42,051 INFO org.apache.spark.scheduler.DAGScheduler:58 - Submitting 50 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[1] at start at EmailCombinerRealtime.java:162)
- 10-Mar-2016 20:08:42,053 INFO org.apache.spark.scheduler.TaskSchedulerImpl:58 - Adding task set 0.0 with 50 tasks
- 10-Mar-2016 20:08:57,068 WARN org.apache.spark.scheduler.TaskSchedulerImpl:70 - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
- 10-Mar-2016 20:09:27,068 WARN org.apache.spark.scheduler.TaskSchedulerImpl:70 - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
- 10-Mar-2016 20:09:42,069 WARN org.apache.spark.scheduler.TaskSchedulerImpl:70 - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
- 10-Mar-2016 20:09:57,068 WARN org.apache.spark.scheduler.TaskSchedulerImpl:70 - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
- 10-Mar-2016 20:10:12,068 WARN org.apache.spark.scheduler.TaskSchedulerImpl:70 - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
- 10-Mar-2016 20:10:27,068 WARN org.apache.spark.scheduler.TaskSchedulerImpl:70 - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
- 10-Mar-2016 20:10:42,069 WARN org.apache.spark.scheduler.TaskSchedulerImpl:70 - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
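The repeated `TaskSchedulerImpl` warning above typically means the standalone master at spark://shams-machine:7077 has no registered workers, or every registered worker has less free memory or fewer free cores than the application requests. A minimal sketch of a resubmission that caps the resource request so it can fit on a small worker (the fully-qualified class name and jar path are assumptions; the master URL is taken from the log):

```shell
# Check the master web UI first (http://shams-machine:8080 by default) to
# confirm at least one worker shows as ALIVE, and note its free memory/cores.

# Resubmit with an explicit, smaller resource request so the job can be
# scheduled on the available worker(s).
spark-submit \
  --master spark://shams-machine:7077 \
  --class com.example.EmailCombinerRealtime \
  --executor-memory 512m \
  --total-executor-cores 2 \
  email-combiner.jar
```

Here `--executor-memory` must fit within a single worker's free memory and `--total-executor-cores` must not exceed the cluster's free cores. If no workers appear on the master UI at all, start one on each worker host with `start-slave.sh spark://shams-machine:7077` before resubmitting.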