# boilerplate when writing a Python Spark script
import sys

from pyspark import SparkContext, SparkConf

if __name__ == "__main__":

    conf = (SparkConf()
            .setAppName("The App")
            .setMaster("local")
            .set("spark.executor.memory", "1g"))
    sc = SparkContext(conf=conf)

    # boilerplate is finished
    # finally, there is some useful work to do

    # usually, you want to use the arguments passed to the script,
    # but remember that the first argument is the script itself:
    # sys.argv[0] == 'your_running_py_script.py'
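
    # example: a minimal sketch assuming sys.argv[1] is an input file path;
    # both the path argument and the line count below are only placeholders
    # to show how the remaining arguments are typically consumed
    if len(sys.argv) > 1:
        input_path = sys.argv[1]            # first real argument
        lines = sc.textFile(input_path)     # load the file as an RDD of lines
        print("number of lines:", lines.count())

    sc.stop()                               # release resources when finished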