# Untitled

import numpy as np
from numpy import log

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent_iterate(alpha, x, y, ep, max_iter):

    # Set-up of variables
    converged = False
    iter = 0
    m = x.shape[0]
    t0 = np.random.random(x.shape[1])
    t1 = np.random.random(x.shape[1])

    # Calculate the cost (cross-entropy for logistic regression)
    J = -(1.0/m)*sum([y[i]*log(sigmoid(t0 + t1*x[i]))
                      + (1 - y[i])*log(1 - sigmoid(t0 + t1*x[i]))
                      for i in range(m)])

    # Calculate and update the gradient
    while not converged:
        # For each iteration, compute the logistic-regression gradient
        grad0 = (1.0/m) * sum([(sigmoid(t0 + t1*x[i]) - y[i]) for i in range(m)])
        grad1 = (1.0/m) * sum([(sigmoid(t0 + t1*x[i]) - y[i])*x[i] for i in range(m)])
        # Save to temp so both parameters are updated simultaneously
        temp0 = t0 - alpha*grad0
        temp1 = t1 - alpha*grad1
        # Update theta
        t0 = temp0
        t1 = temp1

        # Recompute the cross-entropy cost after the update
        e = -(1.0/m)*sum([y[i]*log(sigmoid(t0 + t1*x[i]))
                          + (1 - y[i])*log(1 - sigmoid(t0 + t1*x[i]))
                          for i in range(m)])
        # Check for early convergence
        if abs(J - e) <= ep:
            print('Optimizer has converged, iterations', iter, '!')
            converged = True
        # Update error
        J = e
        iter = iter + 1
        # End algorithm after max_iter iterations
        if iter == max_iter:
            print('Maximum iterations exceeded')
            converged = True
    return t0, t1
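A minimal, self-contained sketch of how a routine like the one above might be exercised on toy data. The helper `fit_logistic`, its deterministic zero initialization, and the vectorized NumPy updates are assumptions of this sketch (the original uses random initialization and per-element loops); it implements the same logistic-regression gradient-descent update.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(x, y, alpha=0.5, ep=1e-9, max_iter=10000):
    """Hypothetical vectorized counterpart of gradient_descent_iterate."""
    J = np.inf                           # previous cost, for the convergence test
    t0, t1 = 0.0, 0.0                    # deterministic init for reproducibility
    for _ in range(max_iter):
        h = sigmoid(t0 + t1 * x)         # predictions for all m points at once
        # cross-entropy cost
        e = -(y * np.log(h) + (1 - y) * np.log(1 - h)).mean()
        if abs(J - e) <= ep:             # early-convergence check
            break
        J = e
        grad0 = (h - y).mean()           # d(cost)/d(theta0)
        grad1 = ((h - y) * x).mean()     # d(cost)/d(theta1)
        t0 -= alpha * grad0
        t1 -= alpha * grad1
    return t0, t1

# Toy 1-D data: negative x -> class 0, positive x -> class 1
x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([0.0, 0.0, 1.0, 1.0])
t0, t1 = fit_logistic(x, y)
```

On separable data like this the learned slope should come out positive, so `sigmoid(t0 + t1*x)` is above 0.5 for the positive points and below it for the negative ones.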
Optimizer has converged, iterations 200 !
Theta0 =  [ 0.46931187] , Theta1 =  [ 0.44905758]

0.46931187 + 0.44905758 * -0.4708 ≈ 0.26, which is not < 0
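For a logistic model the decision boundary is at theta0 + theta1*x = 0, which is exactly where the sigmoid equals 0.5; the "< 0" check above is therefore equivalent to asking whether the predicted probability is below one half. A quick check with the reported parameters:

```python
import numpy as np

t0, t1 = 0.46931187, 0.44905758    # reported Theta0, Theta1
x_new = -0.4708

z = t0 + t1 * x_new                # linear score
prob = 1.0 / (1.0 + np.exp(-z))    # sigmoid of the score

# z > 0 is equivalent to prob > 0.5, so this point falls on the
# positive (class 1) side of the decision boundary
print(z, prob)
```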
