Untitled

a guest
Jun 28th, 2017
"""
Given any starting point `x`, gradient descent should be
able to find the value of `x` that minimizes the cost
function `f` defined below.
"""
import random


def f(x):
    """
    Quadratic function.

    It's easy to see the minimum value of the function
    is 5 when x=0.
    """
    return x**2 + 5


def df(x):
    """
    Derivative of `f` with respect to `x`.
    """
    return 2*x


def gradient_descent_update(x, gradx, learning_rate):
    """
    Performs a gradient descent update.
    """
    # Step against the gradient, scaled by the learning rate:
    # x = x - learning_rate * gradient_of_x
    x = x - (gradx * learning_rate)

    # Return the new value for x
    return x


# Random number between 0 and 10,000. Feel free to set x to whatever you like.
x = random.randint(0, 10000)
# The learning rate controls the step size of each update.
learning_rate = 0.1
epochs = 100

for i in range(epochs + 1):
    cost = f(x)
    gradx = df(x)
    print("EPOCH {}: Cost = {:.3f}, x = {:.3f}".format(i, cost, x))
    x = gradient_descent_update(x, gradx, learning_rate)
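Since df(x) = 2x, each update here multiplies x by (1 - 2 * learning_rate), so this cost function makes it easy to see how the learning rate affects convergence. A minimal sketch of that sensitivity (the `run` helper is illustrative, not part of the paste above):

```python
def f(x):
    return x**2 + 5

def df(x):
    return 2*x

def gradient_descent_update(x, gradx, learning_rate):
    return x - gradx * learning_rate

def run(x, learning_rate, epochs=100):
    """Run gradient descent for `epochs` steps and return the final x."""
    for _ in range(epochs):
        x = gradient_descent_update(x, df(x), learning_rate)
    return x

# For f(x) = x**2 + 5, each step multiplies x by (1 - 2 * learning_rate),
# so gradient descent converges only when 0 < learning_rate < 1.
print(run(100.0, 0.1))   # factor 0.8 per step: x shrinks toward 0
print(run(100.0, 0.5))   # factor 0 per step: lands on the minimum immediately
print(run(100.0, 1.1))   # factor -1.2 per step: |x| grows, descent diverges
```

With learning_rate = 0.1, as in the paste, the multiplicative factor is 0.8, so x decays geometrically toward the minimizer x = 0 and the cost approaches 5.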