br@debian-jessie:~/spark-1.4.0-bin-hadoop1/bin$ ./spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.conf.Configuration).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/06/18 13:45:02 INFO SecurityManager: Changing view acls to: br
15/06/18 13:45:02 INFO SecurityManager: Changing modify acls to: br
15/06/18 13:45:02 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(br); users with modify permissions: Set(br)
15/06/18 13:45:02 INFO HttpServer: Starting HTTP Server
15/06/18 13:45:02 INFO Utils: Successfully started service 'HTTP class server' on port 38592.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
15/06/18 13:45:05 INFO SparkContext: Running Spark version 1.4.0
15/06/18 13:45:05 WARN SparkConf:
SPARK_CLASSPATH was detected (set to '/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar:').
This is deprecated in Spark 1.0+.

Please instead use:
- ./spark-submit with --driver-class-path to augment the driver classpath
- spark.executor.extraClassPath to augment the executor classpath

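The warning block above is the real subject of this session: SPARK_CLASSPATH is deprecated in favor of --driver-class-path on spark-submit plus spark.executor.extraClassPath. As a minimal sketch (not part of the original session), the executor-side equivalent expressed on a SparkConf; the driver classpath must be known when the JVM launches, so for the driver the spark-submit flag or spark-defaults.conf is the right place rather than code:

    import org.apache.spark.SparkConf
    // The same colon-separated jar list the warning above detected in SPARK_CLASSPATH.
    val addons = Seq(
      "/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar",
      "/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar",
      "/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar",
      "/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar").mkString(":")
    // Prepended to each executor's classpath (the paths must exist on the worker machines).
    val conf = new SparkConf().set("spark.executor.extraClassPath", addons)

In the shell itself Spark applies both settings automatically, as the next two WARN lines show.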
15/06/18 13:45:05 WARN SparkConf: Setting 'spark.executor.extraClassPath' to '/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar:' as a work-around.
15/06/18 13:45:05 WARN SparkConf: Setting 'spark.driver.extraClassPath' to '/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar:' as a work-around.
15/06/18 13:45:05 INFO SecurityManager: Changing view acls to: br
15/06/18 13:45:05 INFO SecurityManager: Changing modify acls to: br
15/06/18 13:45:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(br); users with modify permissions: Set(br)
15/06/18 13:45:05 INFO Slf4jLogger: Slf4jLogger started
15/06/18 13:45:05 INFO Remoting: Starting remoting
15/06/18 13:45:05 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.168.130:50650]
15/06/18 13:45:05 INFO Utils: Successfully started service 'sparkDriver' on port 50650.
15/06/18 13:45:05 INFO SparkEnv: Registering MapOutputTracker
15/06/18 13:45:05 INFO SparkEnv: Registering BlockManagerMaster
15/06/18 13:45:06 INFO DiskBlockManager: Created local directory at /tmp/spark-7058feff-0c15-4af6-aeaf-47a21ec594ec/blockmgr-7c0070f8-0534-4156-963f-af56c5326a70
15/06/18 13:45:06 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
15/06/18 13:45:06 INFO HttpFileServer: HTTP File server directory is /tmp/spark-7058feff-0c15-4af6-aeaf-47a21ec594ec/httpd-7d913234-bf93-434f-a351-09abc2a45dc0
15/06/18 13:45:06 INFO HttpServer: Starting HTTP Server
15/06/18 13:45:06 INFO Utils: Successfully started service 'HTTP file server' on port 35480.
15/06/18 13:45:06 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/18 13:45:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/06/18 13:45:06 INFO SparkUI: Started SparkUI at http://192.168.168.130:4040
15/06/18 13:45:06 INFO Executor: Starting executor ID driver on host localhost
15/06/18 13:45:06 INFO Executor: Using REPL class URI: http://192.168.168.130:38592
15/06/18 13:45:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 49412.
15/06/18 13:45:06 INFO NettyBlockTransferService: Server created on 49412
15/06/18 13:45:06 INFO BlockManagerMaster: Trying to register BlockManager
15/06/18 13:45:06 INFO BlockManagerMasterEndpoint: Registering block manager localhost:49412 with 265.1 MB RAM, BlockManagerId(driver, localhost, 49412)
15/06/18 13:45:06 INFO BlockManagerMaster: Registered BlockManager
15/06/18 13:45:06 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/06/18 13:45:06 INFO HiveContext: Initializing execution hive, version 0.13.1
15/06/18 13:45:07 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/06/18 13:45:07 INFO ObjectStore: ObjectStore, initialize called
15/06/18 13:45:07 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/06/18 13:45:07 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/06/18 13:45:07 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/06/18 13:45:07 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/06/18 13:45:08 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/06/18 13:45:08 INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "@" (64), after : "".
15/06/18 13:45:09 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/06/18 13:45:09 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/06/18 13:45:10 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/06/18 13:45:10 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/06/18 13:45:10 INFO ObjectStore: Initialized ObjectStore
15/06/18 13:45:10 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.1aa
15/06/18 13:45:10 INFO HiveMetaStore: Added admin role in metastore
15/06/18 13:45:10 INFO HiveMetaStore: Added public role in metastore
15/06/18 13:45:10 INFO HiveMetaStore: No user is added in admin role, since config is empty
15/06/18 13:45:10 INFO SessionState: No Tez session required at this point. hive.execution.engine=mr.
15/06/18 13:45:10 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.

scala> 15/06/18 13:45:15 INFO SparkContext: Invoking stop() from shutdown hook
15/06/18 13:45:15 INFO SparkUI: Stopped Spark web UI at http://192.168.168.130:4040
15/06/18 13:45:15 INFO DAGScheduler: Stopping DAGScheduler
15/06/18 13:45:15 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/06/18 13:45:15 INFO Utils: path = /tmp/spark-7058feff-0c15-4af6-aeaf-47a21ec594ec/blockmgr-7c0070f8-0534-4156-963f-af56c5326a70, already present as root for deletion.
15/06/18 13:45:15 INFO MemoryStore: MemoryStore cleared
15/06/18 13:45:15 INFO BlockManager: BlockManager stopped
15/06/18 13:45:15 INFO BlockManagerMaster: BlockManagerMaster stopped
15/06/18 13:45:15 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/06/18 13:45:15 INFO SparkContext: Successfully stopped SparkContext
15/06/18 13:45:15 INFO Utils: Shutdown hook called
15/06/18 13:45:15 INFO Utils: Deleting directory /tmp/spark-7058feff-0c15-4af6-aeaf-47a21ec594ec
15/06/18 13:45:15 INFO Utils: Deleting directory /tmp/spark-3e878219-2a0d-4843-8dbd-f1205ce43bd2
15/06/18 13:45:15 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/06/18 13:45:15 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/06/18 13:45:15 INFO Utils: Deleting directory /tmp/spark-ed5c9445-63cb-46e5-be13-18cbb4352f79
br@debian-jessie:~/spark-1.4.0-bin-hadoop1/bin$ export SPARK_CLASSPATH="$(echo /home/br/spark-1.4.0-bin-hadoop1/lib/addons/*.jar |sed 's/ /:/g'):"
br@debian-jessie:~/spark-1.4.0-bin-hadoop1/bin$ ./spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.conf.Configuration).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/06/18 13:45:20 INFO SecurityManager: Changing view acls to: br
15/06/18 13:45:20 INFO SecurityManager: Changing modify acls to: br
15/06/18 13:45:20 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(br); users with modify permissions: Set(br)
15/06/18 13:45:20 INFO HttpServer: Starting HTTP Server
15/06/18 13:45:20 INFO Utils: Successfully started service 'HTTP class server' on port 53798.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
15/06/18 13:45:22 INFO SparkContext: Running Spark version 1.4.0
15/06/18 13:45:22 WARN SparkConf:
SPARK_CLASSPATH was detected (set to '/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar:').
This is deprecated in Spark 1.0+.

Please instead use:
- ./spark-submit with --driver-class-path to augment the driver classpath
- spark.executor.extraClassPath to augment the executor classpath

15/06/18 13:45:22 WARN SparkConf: Setting 'spark.executor.extraClassPath' to '/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar:' as a work-around.
15/06/18 13:45:22 WARN SparkConf: Setting 'spark.driver.extraClassPath' to '/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar:' as a work-around.
15/06/18 13:45:22 INFO SecurityManager: Changing view acls to: br
15/06/18 13:45:22 INFO SecurityManager: Changing modify acls to: br
15/06/18 13:45:22 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(br); users with modify permissions: Set(br)
15/06/18 13:45:22 INFO Slf4jLogger: Slf4jLogger started
15/06/18 13:45:22 INFO Remoting: Starting remoting
15/06/18 13:45:23 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.168.130:41375]
15/06/18 13:45:23 INFO Utils: Successfully started service 'sparkDriver' on port 41375.
15/06/18 13:45:23 INFO SparkEnv: Registering MapOutputTracker
15/06/18 13:45:23 INFO SparkEnv: Registering BlockManagerMaster
15/06/18 13:45:23 INFO DiskBlockManager: Created local directory at /tmp/spark-97102888-de72-4084-b2b5-494b1c630545/blockmgr-bccea169-b161-43b3-bb60-85027a9b0f70
15/06/18 13:45:23 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
15/06/18 13:45:23 INFO HttpFileServer: HTTP File server directory is /tmp/spark-97102888-de72-4084-b2b5-494b1c630545/httpd-fa627bcc-f130-4881-a8e9-9f5b85c8aca6
15/06/18 13:45:23 INFO HttpServer: Starting HTTP Server
15/06/18 13:45:23 INFO Utils: Successfully started service 'HTTP file server' on port 48632.
15/06/18 13:45:23 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/18 13:45:23 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/06/18 13:45:23 INFO SparkUI: Started SparkUI at http://192.168.168.130:4040
15/06/18 13:45:23 INFO Executor: Starting executor ID driver on host localhost
15/06/18 13:45:23 INFO Executor: Using REPL class URI: http://192.168.168.130:53798
15/06/18 13:45:23 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 59971.
15/06/18 13:45:23 INFO NettyBlockTransferService: Server created on 59971
15/06/18 13:45:23 INFO BlockManagerMaster: Trying to register BlockManager
15/06/18 13:45:23 INFO BlockManagerMasterEndpoint: Registering block manager localhost:59971 with 265.1 MB RAM, BlockManagerId(driver, localhost, 59971)
15/06/18 13:45:23 INFO BlockManagerMaster: Registered BlockManager
15/06/18 13:45:23 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/06/18 13:45:24 INFO HiveContext: Initializing execution hive, version 0.13.1
15/06/18 13:45:24 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/06/18 13:45:24 INFO ObjectStore: ObjectStore, initialize called
15/06/18 13:45:24 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/06/18 13:45:24 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/06/18 13:45:24 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/06/18 13:45:25 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/06/18 13:45:25 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/06/18 13:45:26 INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "@" (64), after : "".
15/06/18 13:45:26 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/06/18 13:45:26 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/06/18 13:45:27 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/06/18 13:45:27 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/06/18 13:45:27 INFO ObjectStore: Initialized ObjectStore
15/06/18 13:45:27 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.1aa
15/06/18 13:45:27 INFO HiveMetaStore: Added admin role in metastore
15/06/18 13:45:27 INFO HiveMetaStore: Added public role in metastore
15/06/18 13:45:27 INFO HiveMetaStore: No user is added in admin role, since config is empty
15/06/18 13:45:27 INFO SessionState: No Tez session required at this point. hive.execution.engine=mr.
15/06/18 13:45:27 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.

scala> sc.stop
15/06/18 13:45:30 INFO SparkUI: Stopped Spark web UI at http://192.168.168.130:4040
15/06/18 13:45:30 INFO DAGScheduler: Stopping DAGScheduler
15/06/18 13:45:30 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/06/18 13:45:30 INFO Utils: path = /tmp/spark-97102888-de72-4084-b2b5-494b1c630545/blockmgr-bccea169-b161-43b3-bb60-85027a9b0f70, already present as root for deletion.
15/06/18 13:45:30 INFO MemoryStore: MemoryStore cleared
15/06/18 13:45:30 INFO BlockManager: BlockManager stopped
15/06/18 13:45:30 INFO BlockManagerMaster: BlockManagerMaster stopped
15/06/18 13:45:30 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/06/18 13:45:30 INFO SparkContext: Successfully stopped SparkContext
scala> import org.apache.spark.SparkConf
15/06/18 13:45:30 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/06/18 13:45:30 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/06/18 13:45:30 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
import org.apache.spark.SparkConf

scala> import org.apache.spark.SparkContext._
import org.apache.spark.SparkContext._

scala> import kafka.serializer.DefaultDecoder
import kafka.serializer.DefaultDecoder

scala> import org.apache.spark.streaming._
import org.apache.spark.streaming._

scala> import org.apache.spark.streaming.kafka._
import org.apache.spark.streaming.kafka._

scala> import org.apache.spark.storage.StorageLevel
import org.apache.spark.storage.StorageLevel

scala> val sparkConf = new SparkConf().setAppName("Summarizer").setMaster("local")
sparkConf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@4537c9f8

scala> val ssc = new StreamingContext(sparkConf, Seconds(10))
15/06/18 13:45:32 INFO SparkContext: Running Spark version 1.4.0
15/06/18 13:45:32 WARN SparkConf:
SPARK_CLASSPATH was detected (set to '/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar:').
This is deprecated in Spark 1.0+.

Please instead use:
- ./spark-submit with --driver-class-path to augment the driver classpath
- spark.executor.extraClassPath to augment the executor classpath

15/06/18 13:45:32 WARN SparkConf: Setting 'spark.executor.extraClassPath' to '/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar:' as a work-around.
15/06/18 13:45:32 WARN SparkConf: Setting 'spark.driver.extraClassPath' to '/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar:' as a work-around.
15/06/18 13:45:32 INFO SecurityManager: Changing view acls to: br
15/06/18 13:45:32 INFO SecurityManager: Changing modify acls to: br
15/06/18 13:45:32 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(br); users with modify permissions: Set(br)
15/06/18 13:45:32 INFO Slf4jLogger: Slf4jLogger started
15/06/18 13:45:32 INFO Remoting: Starting remoting
15/06/18 13:45:32 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:52075]
15/06/18 13:45:32 INFO Utils: Successfully started service 'sparkDriver' on port 52075.
15/06/18 13:45:32 INFO SparkEnv: Registering MapOutputTracker
15/06/18 13:45:32 INFO SparkEnv: Registering BlockManagerMaster
15/06/18 13:45:32 INFO DiskBlockManager: Created local directory at /tmp/spark-97102888-de72-4084-b2b5-494b1c630545/blockmgr-05b26ff9-10d2-4fa7-a1d7-e7a76dabdc34
15/06/18 13:45:32 INFO MemoryStore: MemoryStore started with capacity 247.3 MB
15/06/18 13:45:32 INFO HttpFileServer: HTTP File server directory is /tmp/spark-97102888-de72-4084-b2b5-494b1c630545/httpd-a86723a3-1932-4e4b-a641-e6fb484e2594
15/06/18 13:45:32 INFO HttpServer: Starting HTTP Server
15/06/18 13:45:32 INFO Utils: Successfully started service 'HTTP file server' on port 44790.
15/06/18 13:45:32 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/18 13:45:32 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/06/18 13:45:32 INFO SparkUI: Started SparkUI at http://localhost:4040
15/06/18 13:45:32 INFO Executor: Starting executor ID driver on host localhost
15/06/18 13:45:32 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 49443.
15/06/18 13:45:32 INFO NettyBlockTransferService: Server created on 49443
15/06/18 13:45:32 INFO BlockManagerMaster: Trying to register BlockManager
15/06/18 13:45:32 INFO BlockManagerMasterEndpoint: Registering block manager localhost:49443 with 247.3 MB RAM, BlockManagerId(driver, localhost, 49443)
15/06/18 13:45:32 INFO BlockManagerMaster: Registered BlockManager
15/06/18 13:45:32 WARN StreamingContext: spark.master should be set as local[n], n > 1 in local mode if you have receivers to get data, otherwise Spark jobs will not get resources to process the received data.
ssc: org.apache.spark.streaming.StreamingContext = org.apache.spark.streaming.StreamingContext@4fe03c06

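The StreamingContext warning just above is the crux of what follows: with master "local" there is a single core, the Kafka receiver will occupy it as soon as the context starts, and nothing is left to run the batch jobs. A minimal sketch of the fix, assuming at least two local cores are available (only the setMaster argument changes from the session):

    // local[2]: one core for the Kafka receiver, at least one for processing.
    val sparkConf = new SparkConf().setAppName("Summarizer").setMaster("local[2]")
    val ssc = new StreamingContext(sparkConf, Seconds(10))
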
scala> val kafkaParams = Map[String, String]("zookeeper.connect" -> "127.0.0.1:2181", "group.id" -> "test")
kafkaParams: scala.collection.immutable.Map[String,String] = Map(zookeeper.connect -> 127.0.0.1:2181, group.id -> test)

scala> val messages = KafkaUtils.createStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder](ssc, kafkaParams, Map("test" -> 1), StorageLevel.MEMORY_ONLY_SER).map(_._2)
messages: org.apache.spark.streaming.dstream.DStream[Array[Byte]] = org.apache.spark.streaming.dstream.MappedDStream@10b40e87

scala>

scala> messages.foreachRDD { pairRDD =>
| println(s"DataListener.listen() [pairRDD = ${pairRDD}]")
| println(s"DataListener.listen() [pairRDD.count = ${pairRDD.count()}]")
| pairRDD.foreach(row => println(s"DataListener.listen() [row = ${row}]"))
| }

scala>

scala> ssc.start()
15/06/18 13:45:34 INFO ReceiverTracker: ReceiverTracker started
15/06/18 13:45:34 INFO ForEachDStream: metadataCleanupDelay = -1
15/06/18 13:45:34 INFO MappedDStream: metadataCleanupDelay = -1
15/06/18 13:45:34 INFO KafkaInputDStream: metadataCleanupDelay = -1
15/06/18 13:45:34 INFO KafkaInputDStream: Slide time = 10000 ms
15/06/18 13:45:34 INFO KafkaInputDStream: Storage level = StorageLevel(false, false, false, false, 1)
15/06/18 13:45:34 INFO KafkaInputDStream: Checkpoint interval = null
15/06/18 13:45:34 INFO KafkaInputDStream: Remember duration = 10000 ms
15/06/18 13:45:34 INFO KafkaInputDStream: Initialized and validated org.apache.spark.streaming.kafka.KafkaInputDStream@2e00032d
15/06/18 13:45:34 INFO MappedDStream: Slide time = 10000 ms
15/06/18 13:45:34 INFO MappedDStream: Storage level = StorageLevel(false, false, false, false, 1)
15/06/18 13:45:34 INFO MappedDStream: Checkpoint interval = null
15/06/18 13:45:34 INFO MappedDStream: Remember duration = 10000 ms
15/06/18 13:45:34 INFO MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@10b40e87
15/06/18 13:45:34 INFO ForEachDStream: Slide time = 10000 ms
15/06/18 13:45:34 INFO ForEachDStream: Storage level = StorageLevel(false, false, false, false, 1)
15/06/18 13:45:34 INFO ForEachDStream: Checkpoint interval = null
15/06/18 13:45:34 INFO ForEachDStream: Remember duration = 10000 ms
15/06/18 13:45:34 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@3934574f
15/06/18 13:45:34 INFO ReceiverTracker: Starting 1 receivers
15/06/18 13:45:34 INFO SparkContext: Starting job: start at <console>:36
15/06/18 13:45:34 INFO DAGScheduler: Got job 0 (start at <console>:36) with 1 output partitions (allowLocal=false)
15/06/18 13:45:34 INFO DAGScheduler: Final stage: ResultStage 0(start at <console>:36)
15/06/18 13:45:34 INFO DAGScheduler: Parents of final stage: List()
15/06/18 13:45:34 INFO DAGScheduler: Missing parents: List()
15/06/18 13:45:34 INFO DAGScheduler: Submitting ResultStage 0 (ParallelCollectionRDD[0] at start at <console>:36), which has no missing parents
15/06/18 13:45:34 INFO RecurringTimer: Started timer for JobGenerator at time 1434627940000
15/06/18 13:45:34 INFO JobGenerator: Started JobGenerator at 1434627940000 ms
15/06/18 13:45:34 INFO JobScheduler: Started JobScheduler
15/06/18 13:45:34 INFO StreamingContext: StreamingContext started

scala> ssc.awaitTermination()
15/06/18 13:45:34 INFO MemoryStore: ensureFreeSpace(7952) called with curMem=0, maxMem=259333816
15/06/18 13:45:34 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 7.8 KB, free 247.3 MB)
15/06/18 13:45:34 INFO MemoryStore: ensureFreeSpace(4451) called with curMem=7952, maxMem=259333816
15/06/18 13:45:34 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 4.3 KB, free 247.3 MB)
15/06/18 13:45:34 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:49443 (size: 4.3 KB, free: 247.3 MB)
15/06/18 13:45:34 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:874
15/06/18 13:45:34 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (ParallelCollectionRDD[0] at start at <console>:36)
15/06/18 13:45:34 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
15/06/18 13:45:34 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 2313 bytes)
15/06/18 13:45:34 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/06/18 13:45:35 INFO RecurringTimer: Started timer for BlockGenerator at time 1434627935200
15/06/18 13:45:35 INFO BlockGenerator: Started BlockGenerator
15/06/18 13:45:35 INFO ReceiverSupervisorImpl: Starting receiver
15/06/18 13:45:35 INFO KafkaReceiver: Starting Kafka Consumer Stream with group: test
15/06/18 13:45:35 INFO KafkaReceiver: Connecting to Zookeeper: 127.0.0.1:2181
15/06/18 13:45:35 INFO BlockGenerator: Started block pushing thread
15/06/18 13:45:35 INFO VerifiableProperties: Verifying properties
15/06/18 13:45:35 INFO VerifiableProperties: Property group.id is overridden to test
15/06/18 13:45:35 INFO VerifiableProperties: Property zookeeper.connect is overridden to 127.0.0.1:2181
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], Connecting to zookeeper instance at 127.0.0.1:2181
15/06/18 13:45:35 INFO ZooKeeper: Client environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT
15/06/18 13:45:35 INFO ZooKeeper: Client environment:host.name=debian-jessie
15/06/18 13:45:35 INFO ZooKeeper: Client environment:java.version=1.8.0_45
15/06/18 13:45:35 INFO ZooKeeper: Client environment:java.vendor=Oracle Corporation
15/06/18 13:45:35 INFO ZooKeeper: Client environment:java.home=/usr/lib/jvm/java-8-oracle/jre
15/06/18 13:45:35 INFO ZooKeeper: Client environment:java.class.path=/home/br/spark-1.4.0-bin-hadoop1/lib/addons/kafka_2.10-0.8.1.1.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/metrics-core-2.2.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/spark-streaming-kafka_2.10-1.4.0.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/addons/zkclient-0.5.jar:/home/br/spark-1.4.0-bin-hadoop1/conf/:/home/br/spark-1.4.0-bin-hadoop1/lib/spark-assembly-1.4.0-hadoop1.0.4.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/datanucleus-core-3.2.10.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/datanucleus-api-jdo-3.2.6.jar:/home/br/spark-1.4.0-bin-hadoop1/lib/datanucleus-rdbms-3.2.9.jar
15/06/18 13:45:35 INFO ZooKeeper: Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
15/06/18 13:45:35 INFO ZooKeeper: Client environment:java.io.tmpdir=/tmp
15/06/18 13:45:35 INFO ZooKeeper: Client environment:java.compiler=<NA>
15/06/18 13:45:35 INFO ZooKeeper: Client environment:os.name=Linux
15/06/18 13:45:35 INFO ZooKeeper: Client environment:os.arch=amd64
15/06/18 13:45:35 INFO ZooKeeper: Client environment:os.version=3.16.0-4-amd64
15/06/18 13:45:35 INFO ZooKeeper: Client environment:user.name=br
15/06/18 13:45:35 INFO ZooKeeper: Client environment:user.home=/home/br
15/06/18 13:45:35 INFO ZooKeeper: Client environment:user.dir=/home/br/spark-1.4.0-bin-hadoop1/bin
15/06/18 13:45:35 INFO ZooKeeper: Initiating client connection, connectString=127.0.0.1:2181 sessionTimeout=6000 watcher=org.I0Itec.zkclient.ZkClient@277025ec
15/06/18 13:45:35 INFO ZkEventThread: Starting ZkClient event thread.
15/06/18 13:45:35 INFO ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
15/06/18 13:45:35 INFO ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session
15/06/18 13:45:35 WARN ClientCnxnSocket: Connected to an old server; r-o mode will be unavailable
15/06/18 13:45:35 INFO ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x14e067c68f10001, negotiated timeout = 6000
15/06/18 13:45:35 INFO ZkClient: zookeeper state changed (SyncConnected)
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], starting auto committer every 60000 ms
15/06/18 13:45:35 INFO KafkaReceiver: Connected to 127.0.0.1:2181
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], begin registering consumer test_debian-jessie-1434627935036-fa2d3110 in ZK
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], end registering consumer test_debian-jessie-1434627935036-fa2d3110 in ZK
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], starting watcher executor thread for consumer test_debian-jessie-1434627935036-fa2d3110
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], begin rebalancing consumer test_debian-jessie-1434627935036-fa2d3110 try #0
15/06/18 13:45:35 INFO ConsumerFetcherManager: [ConsumerFetcherManager-1434627935098] Stopping leader finder thread
15/06/18 13:45:35 INFO ConsumerFetcherManager: [ConsumerFetcherManager-1434627935098] Stopping all fetchers
15/06/18 13:45:35 INFO ConsumerFetcherManager: [ConsumerFetcherManager-1434627935098] All connections stopped
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], Cleared all relevant queues for this fetcher
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], Cleared the data chunks in all the consumer message iterators
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], Committing all offsets after clearing the fetcher queues
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], Releasing partition ownership
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], Consumer test_debian-jessie-1434627935036-fa2d3110 rebalancing the following partitions: ArrayBuffer(0) for topic test with consumers: List(test_debian-jessie-1434627935036-fa2d3110-0)
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], test_debian-jessie-1434627935036-fa2d3110-0 attempting to claim partition 0
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], test_debian-jessie-1434627935036-fa2d3110-0 successfully owned partition 0 for topic test
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], Updating the cache
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], Consumer test_debian-jessie-1434627935036-fa2d3110 selected partitions : test:0: fetched offset = 104349: consumed offset = 104349
15/06/18 13:45:35 INFO ZookeeperConsumerConnector: [test_debian-jessie-1434627935036-fa2d3110], end rebalancing consumer test_debian-jessie-1434627935036-fa2d3110 try #0
15/06/18 13:45:35 INFO ReceiverSupervisorImpl: Called receiver onStart
15/06/18 13:45:35 INFO ReceiverTracker: Registered receiver for stream 0 from localhost:52075
15/06/18 13:45:35 INFO KafkaReceiver: Starting MessageHandler.
15/06/18 13:45:35 INFO ConsumerFetcherManager$LeaderFinderThread: [test_debian-jessie-1434627935036-fa2d3110-leader-finder-thread], Starting
15/06/18 13:45:35 INFO VerifiableProperties: Verifying properties
15/06/18 13:45:35 INFO VerifiableProperties: Property client.id is overridden to test
15/06/18 13:45:35 INFO VerifiableProperties: Property metadata.broker.list is overridden to debian-jessie:9092
15/06/18 13:45:35 INFO VerifiableProperties: Property request.timeout.ms is overridden to 30000
15/06/18 13:45:35 INFO ClientUtils$: Fetching metadata from broker id:0,host:debian-jessie,port:9092 with correlation id 0 for 1 topic(s) Set(test)
15/06/18 13:45:35 INFO SyncProducer: Connected to debian-jessie:9092 for producing
15/06/18 13:45:35 INFO SyncProducer: Disconnecting from debian-jessie:9092
15/06/18 13:45:35 INFO ConsumerFetcherManager: [ConsumerFetcherManager-1434627935098] Added fetcher for partitions ArrayBuffer([[test,0], initOffset 104349 to broker id:0,host:debian-jessie,port:9092] )
15/06/18 13:45:35 INFO ConsumerFetcherThread: [ConsumerFetcherThread-test_debian-jessie-1434627935036-fa2d3110-0-0], Starting
15/06/18 13:45:40 INFO JobScheduler: Added jobs for time 1434627940000 ms
15/06/18 13:45:40 INFO JobScheduler: Starting job streaming job 1434627940000 ms.0 from job set of time 1434627940000 ms
DataListener.listen() [pairRDD = MapPartitionsRDD[2] at map at <console>:37]
15/06/18 13:45:40 INFO SparkContext: Starting job: foreachRDD at <console>:40
15/06/18 13:45:40 INFO DAGScheduler: Job 1 finished: foreachRDD at <console>:40, took 0.000688 s
DataListener.listen() [pairRDD.count = 0]
15/06/18 13:45:40 INFO SparkContext: Starting job: foreachRDD at <console>:40
15/06/18 13:45:40 INFO DAGScheduler: Job 2 finished: foreachRDD at <console>:40, took 0.000184 s
15/06/18 13:45:40 INFO JobScheduler: Finished job streaming job 1434627940000 ms.0 from job set of time 1434627940000 ms
15/06/18 13:45:40 INFO JobScheduler: Total delay: 0.051 s for time 1434627940000 ms (execution: 0.015 s)
15/06/18 13:45:40 INFO ReceivedBlockTracker: Deleting batches ArrayBuffer()
15/06/18 13:45:40 INFO InputInfoTracker: remove old batch metadata:
15/06/18 13:45:44 INFO MemoryStore: ensureFreeSpace(2426) called with curMem=12403, maxMem=259333816
15/06/18 13:45:44 INFO MemoryStore: Block input-0-1434627943800 stored as bytes in memory (estimated size 2.4 KB, free 247.3 MB)
15/06/18 13:45:44 INFO BlockManagerInfo: Added input-0-1434627943800 in memory on localhost:49443 (size: 2.4 KB, free: 247.3 MB)
15/06/18 13:45:44 INFO BlockGenerator: Pushed block input-0-1434627943800
15/06/18 13:45:44 INFO MemoryStore: ensureFreeSpace(526) called with curMem=14829, maxMem=259333816
15/06/18 13:45:44 INFO MemoryStore: Block input-0-1434627944000 stored as bytes in memory (estimated size 526.0 B, free 247.3 MB)
15/06/18 13:45:44 INFO BlockManagerInfo: Added input-0-1434627944000 in memory on localhost:49443 (size: 526.0 B, free: 247.3 MB)
15/06/18 13:45:44 INFO BlockGenerator: Pushed block input-0-1434627944000
15/06/18 13:45:44 INFO MemoryStore: ensureFreeSpace(10568) called with curMem=15355, maxMem=259333816
15/06/18 13:45:44 INFO MemoryStore: Block input-0-1434627944400 stored as bytes in memory (estimated size 10.3 KB, free 247.3 MB)
15/06/18 13:45:44 INFO BlockManagerInfo: Added input-0-1434627944400 in memory on localhost:49443 (size: 10.3 KB, free: 247.3 MB)
15/06/18 13:45:44 INFO BlockGenerator: Pushed block input-0-1434627944400
15/06/18 13:45:44 INFO MemoryStore: ensureFreeSpace(3899) called with curMem=25923, maxMem=259333816
15/06/18 13:45:44 INFO MemoryStore: Block input-0-1434627944600 stored as bytes in memory (estimated size 3.8 KB, free 247.3 MB)
15/06/18 13:45:44 INFO BlockManagerInfo: Added input-0-1434627944600 in memory on localhost:49443 (size: 3.8 KB, free: 247.3 MB)
15/06/18 13:45:44 INFO BlockGenerator: Pushed block input-0-1434627944600
15/06/18 13:45:45 INFO MemoryStore: ensureFreeSpace(3249) called with curMem=29822, maxMem=259333816
15/06/18 13:45:45 INFO MemoryStore: Block input-0-1434627944800 stored as bytes in memory (estimated size 3.2 KB, free 247.3 MB)
15/06/18 13:45:45 INFO BlockManagerInfo: Added input-0-1434627944800 in memory on localhost:49443 (size: 3.2 KB, free: 247.3 MB)
15/06/18 13:45:45 INFO BlockGenerator: Pushed block input-0-1434627944800
15/06/18 13:45:45 INFO MemoryStore: ensureFreeSpace(3199) called with curMem=33071, maxMem=259333816
15/06/18 13:45:45 INFO MemoryStore: Block input-0-1434627945000 stored as bytes in memory (estimated size 3.1 KB, free 247.3 MB)
15/06/18 13:45:45 INFO BlockManagerInfo: Added input-0-1434627945000 in memory on localhost:49443 (size: 3.1 KB, free: 247.3 MB)
15/06/18 13:45:45 INFO BlockGenerator: Pushed block input-0-1434627945000
15/06/18 13:45:45 INFO MemoryStore: ensureFreeSpace(3249) called with curMem=36270, maxMem=259333816
15/06/18 13:45:45 INFO MemoryStore: Block input-0-1434627945200 stored as bytes in memory (estimated size 3.2 KB, free 247.3 MB)
15/06/18 13:45:45 INFO BlockManagerInfo: Added input-0-1434627945200 in memory on localhost:49443 (size: 3.2 KB, free: 247.3 MB)
15/06/18 13:45:45 INFO BlockGenerator: Pushed block input-0-1434627945200
15/06/18 13:45:45 INFO MemoryStore: ensureFreeSpace(3249) called with curMem=39519, maxMem=259333816
15/06/18 13:45:45 INFO MemoryStore: Block input-0-1434627945400 stored as bytes in memory (estimated size 3.2 KB, free 247.3 MB)
15/06/18 13:45:45 INFO BlockManagerInfo: Added input-0-1434627945400 in memory on localhost:49443 (size: 3.2 KB, free: 247.3 MB)
15/06/18 13:45:45 INFO BlockGenerator: Pushed block input-0-1434627945400
15/06/18 13:45:45 INFO MemoryStore: ensureFreeSpace(3249) called with curMem=42768, maxMem=259333816
15/06/18 13:45:45 INFO MemoryStore: Block input-0-1434627945600 stored as bytes in memory (estimated size 3.2 KB, free 247.3 MB)
15/06/18 13:45:45 INFO BlockManagerInfo: Added input-0-1434627945600 in memory on localhost:49443 (size: 3.2 KB, free: 247.3 MB)
15/06/18 13:45:45 INFO BlockGenerator: Pushed block input-0-1434627945600
15/06/18 13:45:46 INFO MemoryStore: ensureFreeSpace(3299) called with curMem=46017, maxMem=259333816
15/06/18 13:45:46 INFO MemoryStore: Block input-0-1434627945800 stored as bytes in memory (estimated size 3.2 KB, free 247.3 MB)
15/06/18 13:45:46 INFO BlockManagerInfo: Added input-0-1434627945800 in memory on localhost:49443 (size: 3.2 KB, free: 247.3 MB)
15/06/18 13:45:46 INFO BlockGenerator: Pushed block input-0-1434627945800
15/06/18 13:45:46 INFO MemoryStore: ensureFreeSpace(2076) called with curMem=49316, maxMem=259333816
15/06/18 13:45:46 INFO MemoryStore: Block input-0-1434627946000 stored as bytes in memory (estimated size 2.0 KB, free 247.3 MB)
15/06/18 13:45:46 INFO BlockManagerInfo: Added input-0-1434627946000 in memory on localhost:49443 (size: 2.0 KB, free: 247.3 MB)
15/06/18 13:45:46 INFO BlockGenerator: Pushed block input-0-1434627946000
15/06/18 13:45:50 INFO JobScheduler: Starting job streaming job 1434627950000 ms.0 from job set of time 1434627950000 ms
DataListener.listen() [pairRDD = MapPartitionsRDD[4] at map at <console>:37]
15/06/18 13:45:50 INFO SparkContext: Starting job: foreachRDD at <console>:40
15/06/18 13:45:50 INFO JobScheduler: Added jobs for time 1434627950000 ms
15/06/18 13:45:50 INFO DAGScheduler: Got job 3 (foreachRDD at <console>:40) with 11 output partitions (allowLocal=false)
15/06/18 13:45:50 INFO DAGScheduler: Final stage: ResultStage 1(foreachRDD at <console>:40)
15/06/18 13:45:50 INFO DAGScheduler: Parents of final stage: List()
15/06/18 13:45:50 INFO DAGScheduler: Missing parents: List()
15/06/18 13:45:50 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[4] at map at <console>:37), which has no missing parents
15/06/18 13:45:50 INFO MemoryStore: ensureFreeSpace(1608) called with curMem=51392, maxMem=259333816
15/06/18 13:45:50 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 1608.0 B, free 247.3 MB)
15/06/18 13:45:50 INFO MemoryStore: ensureFreeSpace(1046) called with curMem=53000, maxMem=259333816
15/06/18 13:45:50 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1046.0 B, free 247.3 MB)
15/06/18 13:45:50 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:49443 (size: 1046.0 B, free: 247.3 MB)
15/06/18 13:45:50 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:874
15/06/18 13:45:50 INFO DAGScheduler: Submitting 11 missing tasks from ResultStage 1 (MapPartitionsRDD[4] at map at <console>:37)
15/06/18 13:45:50 INFO TaskSchedulerImpl: Adding task set 1.0 with 11 tasks
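This is where the job stalls: task set 1.0 (the 11 tasks of the count/foreach stage) is added, but no "Starting task ... in stage 1.0" line ever follows, because the single core granted by master "local" is held by the Kafka receiver started earlier, exactly as the StreamingContext warning predicted; the scheduler can only keep queueing batches below. Apart from setMaster("local[2]"), a receiver-less alternative available since Spark 1.3 is the direct Kafka stream, sketched here under the assumption that the broker runs at debian-jessie:9092 as reported by the metadata.broker.list line above:

    // Direct stream: pulls from the Kafka brokers without a long-running
    // receiver task, so it does not pin a core in local mode.
    val directParams = Map("metadata.broker.list" -> "debian-jessie:9092")
    val direct = KafkaUtils.createDirectStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder](
      ssc, directParams, Set("test")).map(_._2)
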
15/06/18 13:46:00 INFO JobScheduler: Added jobs for time 1434627960000 ms
15/06/18 13:46:10 INFO JobScheduler: Added jobs for time 1434627970000 ms
15/06/18 13:46:20 INFO JobScheduler: Added jobs for time 1434627980000 ms
15/06/18 13:46:30 INFO JobScheduler: Added jobs for time 1434627990000 ms
15/06/18 13:46:40 INFO JobScheduler: Added jobs for time 1434628000000 ms
15/06/18 13:46:50 INFO JobScheduler: Added jobs for time 1434628010000 ms
15/06/18 13:47:00 INFO JobScheduler: Added jobs for time 1434628020000 ms
15/06/18 13:47:10 INFO JobScheduler: Added jobs for time 1434628030000 ms
15/06/18 13:47:20 INFO JobScheduler: Added jobs for time 1434628040000 ms
15/06/18 13:47:30 INFO JobScheduler: Added jobs for time 1434628050000 ms
15/06/18 13:47:40 INFO JobScheduler: Added jobs for time 1434628060000 ms
15/06/18 13:47:50 INFO JobScheduler: Added jobs for time 1434628070000 ms
15/06/18 13:48:00 INFO JobScheduler: Added jobs for time 1434628080000 ms
15/06/18 13:48:10 INFO JobScheduler: Added jobs for time 1434628090000 ms
15/06/18 13:48:20 INFO JobScheduler: Added jobs for time 1434628100000 ms
15/06/18 13:48:30 INFO JobScheduler: Added jobs for time 1434628110000 ms
15/06/18 13:48:40 INFO JobScheduler: Added jobs for time 1434628120000 ms
15/06/18 13:48:50 INFO JobScheduler: Added jobs for time 1434628130000 ms
15/06/18 13:49:00 INFO JobScheduler: Added jobs for time 1434628140000 ms
15/06/18 13:49:05 INFO MemoryStore: ensureFreeSpace(2899) called with curMem=54046, maxMem=259333816
15/06/18 13:49:05 INFO MemoryStore: Block input-0-1434628145400 stored as bytes in memory (estimated size 2.8 KB, free 247.3 MB)
15/06/18 13:49:05 INFO BlockManagerInfo: Added input-0-1434628145400 in memory on localhost:49443 (size: 2.8 KB, free: 247.3 MB)
15/06/18 13:49:05 INFO BlockGenerator: Pushed block input-0-1434628145400
15/06/18 13:49:05 INFO MemoryStore: ensureFreeSpace(3099) called with curMem=56945, maxMem=259333816
15/06/18 13:49:05 INFO MemoryStore: Block input-0-1434628145600 stored as bytes in memory (estimated size 3.0 KB, free 247.3 MB)
15/06/18 13:49:05 INFO BlockManagerInfo: Added input-0-1434628145600 in memory on localhost:49443 (size: 3.0 KB, free: 247.3 MB)
15/06/18 13:49:05 INFO BlockGenerator: Pushed block input-0-1434628145600
15/06/18 13:49:06 INFO MemoryStore: ensureFreeSpace(3449) called with curMem=60044, maxMem=259333816
15/06/18 13:49:06 INFO MemoryStore: Block input-0-1434628145800 stored as bytes in memory (estimated size 3.4 KB, free 247.3 MB)
15/06/18 13:49:06 INFO BlockManagerInfo: Added input-0-1434628145800 in memory on localhost:49443 (size: 3.4 KB, free: 247.3 MB)
15/06/18 13:49:06 INFO BlockGenerator: Pushed block input-0-1434628145800
15/06/18 13:49:06 INFO MemoryStore: ensureFreeSpace(3199) called with curMem=63493, maxMem=259333816
15/06/18 13:49:06 INFO MemoryStore: Block input-0-1434628146000 stored as bytes in memory (estimated size 3.1 KB, free 247.3 MB)
15/06/18 13:49:06 INFO BlockManagerInfo: Added input-0-1434628146000 in memory on localhost:49443 (size: 3.1 KB, free: 247.3 MB)
15/06/18 13:49:06 INFO BlockGenerator: Pushed block input-0-1434628146000
15/06/18 13:49:06 INFO MemoryStore: ensureFreeSpace(3049) called with curMem=66692, maxMem=259333816
15/06/18 13:49:06 INFO MemoryStore: Block input-0-1434628146200 stored as bytes in memory (estimated size 3.0 KB, free 247.3 MB)
15/06/18 13:49:06 INFO BlockManagerInfo: Added input-0-1434628146200 in memory on localhost:49443 (size: 3.0 KB, free: 247.3 MB)
15/06/18 13:49:06 INFO BlockGenerator: Pushed block input-0-1434628146200
15/06/18 13:49:06 INFO MemoryStore: ensureFreeSpace(3149) called with curMem=69741, maxMem=259333816
15/06/18 13:49:06 INFO MemoryStore: Block input-0-1434628146400 stored as bytes in memory (estimated size 3.1 KB, free 247.3 MB)
15/06/18 13:49:06 INFO BlockManagerInfo: Added input-0-1434628146400 in memory on localhost:49443 (size: 3.1 KB, free: 247.3 MB)
15/06/18 13:49:06 INFO BlockGenerator: Pushed block input-0-1434628146400
15/06/18 13:49:06 INFO MemoryStore: ensureFreeSpace(3049) called with curMem=72890, maxMem=259333816
15/06/18 13:49:06 INFO MemoryStore: Block input-0-1434628146600 stored as bytes in memory (estimated size 3.0 KB, free 247.2 MB)
15/06/18 13:49:06 INFO BlockManagerInfo: Added input-0-1434628146600 in memory on localhost:49443 (size: 3.0 KB, free: 247.3 MB)
15/06/18 13:49:06 INFO BlockGenerator: Pushed block input-0-1434628146600
15/06/18 13:49:07 INFO MemoryStore: ensureFreeSpace(3249) called with curMem=75939, maxMem=259333816
15/06/18 13:49:07 INFO MemoryStore: Block input-0-1434628146800 stored as bytes in memory (estimated size 3.2 KB, free 247.2 MB)
15/06/18 13:49:07 INFO BlockManagerInfo: Added input-0-1434628146800 in memory on localhost:49443 (size: 3.2 KB, free: 247.3 MB)
15/06/18 13:49:07 INFO BlockGenerator: Pushed block input-0-1434628146800
15/06/18 13:49:07 INFO MemoryStore: ensureFreeSpace(2476) called with curMem=79188, maxMem=259333816
15/06/18 13:49:07 INFO MemoryStore: Block input-0-1434628147000 stored as bytes in memory (estimated size 2.4 KB, free 247.2 MB)
15/06/18 13:49:07 INFO BlockManagerInfo: Added input-0-1434628147000 in memory on localhost:49443 (size: 2.4 KB, free: 247.3 MB)
15/06/18 13:49:07 INFO BlockGenerator: Pushed block input-0-1434628147000
15/06/18 13:49:10 INFO JobScheduler: Added jobs for time 1434628150000 ms
15/06/18 13:49:20 INFO JobScheduler: Added jobs for time 1434628160000 ms