from pyspark import SparkContext
import random

# Create the SparkContext (the original paste used `sc` without defining it,
# as in the interactive pyspark shell).
sc = SparkContext("local[*]", "PiEstimate")

def inside(p):
    # Pick a random point in the unit square; keep it if it falls
    # inside the quarter circle of radius 1.
    x, y = random.random(), random.random()
    return x * x + y * y < 1

NUM_SAMPLES = 1000

# Monte Carlo estimate: the fraction of sampled points inside the
# quarter circle approximates pi/4.
count = sc.parallelize(range(0, NUM_SAMPLES)).filter(inside).count()
print("Pi is roughly %f" % (4.0 * count / NUM_SAMPLES))