scala> val salgrp = salname.groupByKey
salgrp: org.apache.spark.rdd.RDD[(Int, Iterable[String])] = ShuffledRDD[11] at groupByKey at <console>:41

scala> salgrp.collect
18/09/04 20:51:06 INFO DAGScheduler: Job 0 finished: collect at <console>:44, took 1.723661 s
res0: Array[(Int, Iterable[String])] = Array((50000,CompactBuffer(Bhupesh, Tejas, Dinesh, Lokesh)), (10000,CompactBuffer(Sheela, Kumar, Venkat)), (45000,CompactBuffer(Pavan, Ratan, Amit)))
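The transcript above never shows how `salname` was built; it must be a pair RDD of `(salary, name)` tuples for `groupByKey` to collect names per salary. A plain-Scala sketch of the same grouping (no Spark needed, sample names taken from the output above) shows what `groupByKey` computes:

```scala
// Plain-Scala equivalent of groupByKey on (salary, name) pairs.
// groupBy buckets the tuples by salary; the map then keeps only the names.
object GroupBySketch extends App {
  val employees = List(
    (50000, "Bhupesh"), (10000, "Sheela"), (45000, "Pavan"),
    (50000, "Tejas"), (10000, "Kumar")
  )
  val grouped: Map[Int, List[String]] =
    employees.groupBy(_._1).map { case (sal, pairs) => (sal, pairs.map(_._2)) }
  println(grouped(50000)) // names sharing salary 50000, in input order
}
```

In Spark the equivalent input would be `sc.parallelize(employees)`, and `groupByKey` would return the per-key names as a `CompactBuffer` inside an `RDD[(Int, Iterable[String])]`, as the `res0` output shows.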
// Flatten the List value out of each (key, list) pair and print every element.
// sparkContext is the active SparkContext (called sc in spark-shell).
val data = List((1, List("one", "two", "three")))
val rdd = sparkContext.parallelize(data)
rdd.flatMap(v => v._2).foreach(println)
one
two
three
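The reason `flatMap` is used above rather than `map` is that `map` would keep the nested list intact, while `flatMap` unrolls it into individual elements. A plain-Scala sketch on the same data (no Spark needed) makes the difference visible:

```scala
// map vs flatMap on the same (key, list) data as the Spark example above.
object FlatMapSketch extends App {
  val data = List((1, List("one", "two", "three")))
  val mapped = data.map(_._2)     // one nested element: List(List("one", "two", "three"))
  val flat   = data.flatMap(_._2) // flattened: List("one", "two", "three")
  println(mapped)
  println(flat)
}
```

The Spark RDD versions behave the same way: `rdd.map(v => v._2)` yields an `RDD[List[String]]` with one list per record, while `rdd.flatMap(v => v._2)` yields an `RDD[String]` of the individual words, which is why `foreach(println)` prints them one per line.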