import pyspark

conf = pyspark.SparkConf()
# Set the number of executors, memory per executor, and cores per executor
# (note: the valid property is 'spark.executor.instances', not 'spark.num.executors')
conf.set('spark.executor.instances', 3)
conf.set('spark.executor.memory', '12g')
conf.set('spark.executor.cores', 3)

# Create a Spark context connected to the standalone cluster master
sc = pyspark.SparkContext(master='spark://ip-172-31-8-174.ec2.internal:7077',
                          appName='featuretools', conf=conf)