myDf.show(30)
myDf.write.mode(SaveMode.Append)
  .jdbc("jdbc:postgresql://<connectionString>", "apptest", props)
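For context, the `props` argument passed to `.jdbc(...)` is a `java.util.Properties` carrying the JDBC driver and credentials. The paste does not show how it is built; a minimal sketch, with placeholder values, might look like:

```scala
import java.util.Properties

// Hypothetical connection properties for the Postgres sink.
// The actual user, password, and driver values are not shown in the paste.
val props = new Properties()
props.setProperty("driver", "org.postgresql.Driver")
props.setProperty("user", "myUser")         // placeholder
props.setProperty("password", "myPassword") // placeholder
```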
+---------+------+-----------+-----+
|countryid|kindid|publisherid| rank|
+---------+------+-----------+-----+
|        2|     5| 1440370637|  143|
|        2|     5| 2114144780| 1558|
|        2|     8|  199559784| 3684|
|        2|     5| 1522258372| 1016|
|        2|     9|  918480276| 1051|
|        2|     5|       8788|  619|
|        2|     5| 2760330013|  612|
|        2|     5| 1733022508| 3427|
|        2|    10| 1733020347| 2010|
|        2|     8|      11867| 9431|
|        2|     7|  512462197| 5089|
|        2|     8| 1219783067|   83|
|        2|     5| 1459628648| 4990|
|        2|     5| 2447786955|  155|
|        2|     5| 1985613413|  651|
|        2|     5| 1560140870|  404|
|        2|     5| 1445788622| 4737|
|        2|     7| 2352155047|19288|
|        2|     5| 1667560844|   87|
|        2|     5| 1932266564|  764|
|        2|     5| 1470914641|   31|
|        2|     8| 2601082735|  341|
|        2|     5| 1985796127|   21|
|        2|     8|  510919182|  994|
|        2|     5|  338483845| 3292|
|        2|     5| 2303279333|  809|
|        2|     5| 2454300331|  620|
|        2|     8|  379213322|  570|
|        2|     7|  260492638|20219|
|        2|     5| 1435774759|   78|
+---------+------+-----------+-----+
18/05/14 16:04:05 INFO MemoryStore: Block broadcast_9 stored as values in memory (estimated size 276.3 KB, free 363.7 MB)
18/05/14 16:04:05 INFO MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 23.2 KB, free 363.7 MB)
18/05/14 16:04:05 INFO BlockManagerInfo: Added broadcast_9_piece0 in memory on 192.168.0.17:62045 (size: 23.2 KB, free: 366.1 MB)
18/05/14 16:04:05 INFO SparkContext: Created broadcast 9 from newAPIHadoopFile at RedshiftRelation.scala:146
18/05/14 16:04:05 INFO MemoryStore: Block broadcast_10 stored as values in memory (estimated size 276.3 KB, free 363.4 MB)
18/05/14 16:04:06 INFO MemoryStore: Block broadcast_10_piece0 stored as bytes in memory (estimated size 23.2 KB, free 363.4 MB)
18/05/14 16:04:06 INFO BlockManagerInfo: Added broadcast_10_piece0 in memory on 192.168.0.17:62045 (size: 23.2 KB, free: 366.1 MB)
18/05/14 16:04:06 INFO SparkContext: Created broadcast 10 from newAPIHadoopFile at RedshiftRelation.scala:146
18/05/14 16:04:06 INFO MemoryStore: Block broadcast_11 stored as values in memory (estimated size 276.3 KB, free 363.1 MB)
18/05/14 16:04:06 INFO MemoryStore: Block broadcast_11_piece0 stored as bytes in memory (estimated size 23.2 KB, free 363.1 MB)
18/05/14 16:04:06 INFO BlockManagerInfo: Added broadcast_11_piece0 in memory on 192.168.0.17:62045 (size: 23.2 KB, free: 366.0 MB)
18/05/14 16:04:06 INFO SparkContext: Created broadcast 11 from newAPIHadoopFile at RedshiftRelation.scala:146
18/05/14 16:04:06 INFO MemoryStore: Block broadcast_12 stored as values in memory (estimated size 276.3 KB, free 362.8 MB)
18/05/14 16:04:06 INFO MemoryStore: Block broadcast_12_piece0 stored as bytes in memory (estimated size 23.2 KB, free 362.8 MB)
18/05/14 16:04:06 INFO BlockManagerInfo: Added broadcast_12_piece0 in memory on 192.168.0.17:62045 (size: 23.2 KB, free: 366.0 MB)
18/05/14 16:04:06 INFO SparkContext: Created broadcast 12 from newAPIHadoopFile at RedshiftRelation.scala:146
18/05/14 16:04:06 INFO FileInputFormat: Total input paths to process : 1
18/05/14 16:04:06 INFO FileInputFormat: Total input paths to process : 1
18/05/14 16:04:06 INFO FileInputFormat: Total input paths to process : 1
18/05/14 16:04:06 INFO FileInputFormat: Total input paths to process : 1
18/05/14 16:04:06 INFO SparkContext: Starting job: jdbc at MyAppWriter.scala:29
18/05/14 16:04:06 INFO DAGScheduler: Got job 1 (jdbc at MyAppWriter.scala:29) with 4 output partitions
18/05/14 16:04:06 INFO DAGScheduler: Final stage: ResultStage 1 (jdbc at MyAppWriter.scala:29)
18/05/14 16:04:06 INFO DAGScheduler: Parents of final stage: List()
18/05/14 16:04:06 INFO DAGScheduler: Missing parents: List()
18/05/14 16:04:06 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[28] at jdbc at MyAppWriter.scala:29), which has no missing parents
18/05/14 16:04:06 INFO MemoryStore: Block broadcast_13 stored as values in memory (estimated size 13.2 KB, free 362.8 MB)
18/05/14 16:04:06 INFO MemoryStore: Block broadcast_13_piece0 stored as bytes in memory (estimated size 6.8 KB, free 362.8 MB)
18/05/14 16:04:06 INFO BlockManagerInfo: Added broadcast_13_piece0 in memory on 192.168.0.17:62045 (size: 6.8 KB, free: 366.0 MB)
18/05/14 16:04:06 INFO SparkContext: Created broadcast 13 from broadcast at DAGScheduler.scala:1006
18/05/14 16:04:06 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 1 (MapPartitionsRDD[28] at jdbc at MyAppWriter.scala:29) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
18/05/14 16:04:06 INFO TaskSchedulerImpl: Adding task set 1.0 with 4 tasks
18/05/14 16:04:06 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, PROCESS_LOCAL, 5072 bytes)
18/05/14 16:04:06 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 2, localhost, executor driver, partition 1, PROCESS_LOCAL, 5072 bytes)
18/05/14 16:04:06 INFO TaskSetManager: Starting task 2.0 in stage 1.0 (TID 3, localhost, executor driver, partition 2, PROCESS_LOCAL, 5072 bytes)
18/05/14 16:04:06 INFO TaskSetManager: Starting task 3.0 in stage 1.0 (TID 4, localhost, executor driver, partition 3, PROCESS_LOCAL, 5072 bytes)
18/05/14 16:04:06 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
18/05/14 16:04:06 INFO Executor: Running task 1.0 in stage 1.0 (TID 2)
18/05/14 16:04:06 INFO Executor: Running task 2.0 in stage 1.0 (TID 3)
18/05/14 16:04:06 INFO Executor: Running task 3.0 in stage 1.0 (TID 4)
18/05/14 16:04:06 INFO NewHadoopRDD: Input split: s3n://mybucket-example/mydata/spark/temp/38c92e24-8f9d-486a-b84a-cfdb5964cb15/0000_part_00:0+55124
18/05/14 16:04:06 INFO NewHadoopRDD: Input split: s3n://mybucket-example/mydata/spark/temp/38c92e24-8f9d-486a-b84a-cfdb5964cb15/0002_part_00:0+54686
18/05/14 16:04:06 INFO NewHadoopRDD: Input split: s3n://mybucket-example/mydata/spark/temp/38c92e24-8f9d-486a-b84a-cfdb5964cb15/0003_part_00:0+53651
18/05/14 16:04:06 INFO NewHadoopRDD: Input split: s3n://mybucket-example/mydata/spark/temp/38c92e24-8f9d-486a-b84a-cfdb5964cb15/0001_part_00:0+53475
18/05/14 16:04:06 INFO NativeS3FileSystem: Opening 's3n://mybucket-example/mydata/spark/temp/38c92e24-8f9d-486a-b84a-cfdb5964cb15/0000_part_00' for reading
18/05/14 16:04:06 INFO NativeS3FileSystem: Opening 's3n://mybucket-example/mydata/spark/temp/38c92e24-8f9d-486a-b84a-cfdb5964cb15/0001_part_00' for reading
18/05/14 16:04:06 INFO NativeS3FileSystem: Opening 's3n://mybucket-example/mydata/spark/temp/38c92e24-8f9d-486a-b84a-cfdb5964cb15/0002_part_00' for reading
18/05/14 16:04:06 INFO NativeS3FileSystem: Opening 's3n://mybucket-example/mydata/spark/temp/38c92e24-8f9d-486a-b84a-cfdb5964cb15/0003_part_00' for reading
18/05/14 16:07:47 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1236 bytes result sent to driver
18/05/14 16:07:47 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 220999 ms on localhost (executor driver) (1/4)
18/05/14 16:10:40 INFO Executor: Finished task 2.0 in stage 1.0 (TID 3). 1193 bytes result sent to driver
18/05/14 16:10:40 INFO TaskSetManager: Finished task 2.0 in stage 1.0 (TID 3) in 393573 ms on localhost (executor driver) (2/4)
18/05/14 16:13:35 INFO Executor: Finished task 1.0 in stage 1.0 (TID 2). 1193 bytes result sent to driver
18/05/14 16:13:35 INFO TaskSetManager: Finished task 1.0 in stage 1.0 (TID 2) in 569130 ms on localhost (executor driver) (3/4)
18/05/14 16:16:30 INFO Executor: Finished task 3.0 in stage 1.0 (TID 4). 1193 bytes result sent to driver
18/05/14 16:16:30 INFO TaskSetManager: Removed TaskSet 1.0, whose tasks have all completed, from pool
18/05/14 16:16:30 INFO TaskSetManager: Finished task 3.0 in stage 1.0 (TID 4) in 744012 ms on localhost (executor driver) (4/4)
18/05/14 16:16:30 INFO DAGScheduler: ResultStage 1 (jdbc at MyAppWriter.scala:29) finished in 744.014 s
18/05/14 16:16:30 INFO DAGScheduler: Job 1 finished: jdbc at MyAppWriter.scala:29, took 744.064359 s