a guest
Oct 18th, 2019
import numpy as np

def compute_cost_with_regularization(A3, Y, parameters, lambd):
    """
    Implement the cost function with L2 regularization. See formula (2) above.

    Arguments:
    A3 -- post-activation, output of forward propagation, of shape (output size, number of examples)
    Y -- "true" labels vector, of shape (output size, number of examples)
    parameters -- python dictionary containing parameters of the model
    lambd -- regularization hyperparameter, scalar

    Returns:
    cost -- value of the regularized loss function (formula (2))
    """
    m = Y.shape[1]
    W1 = parameters["W1"]
    W2 = parameters["W2"]
    W3 = parameters["W3"]

    cross_entropy_cost = compute_cost(A3, Y)  # This gives you the cross-entropy part of the cost

    ### START CODE HERE ### (approx. 1 line)
    L2_regularization_cost = (lambd / (2 * m)) * (np.sum(np.square(W1)) + np.sum(np.square(W2)) + np.sum(np.square(W3)))
    ### END CODE HERE ###

    cost = cross_entropy_cost + L2_regularization_cost

    return cost
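The paste does not include the `compute_cost` helper it calls, so here is a minimal self-contained sketch of how the function fits together. The `compute_cost` body below is an assumed binary cross-entropy stand-in, and the inputs (shapes, weight values, `lambd = 0.1`) are made up for illustration; only the L2 term `(lambd / (2*m)) * sum of squared weights` mirrors the paste.

```python
import numpy as np

def compute_cost(A3, Y):
    # Assumed cross-entropy cost; the original paste does not define it.
    m = Y.shape[1]
    return -np.sum(Y * np.log(A3) + (1 - Y) * np.log(1 - A3)) / m

def compute_cost_with_regularization(A3, Y, parameters, lambd):
    m = Y.shape[1]
    W1, W2, W3 = parameters["W1"], parameters["W2"], parameters["W3"]
    cross_entropy_cost = compute_cost(A3, Y)
    # L2 penalty: (lambd / 2m) * sum of squared entries of every weight matrix
    L2_regularization_cost = (lambd / (2 * m)) * (
        np.sum(np.square(W1)) + np.sum(np.square(W2)) + np.sum(np.square(W3))
    )
    return cross_entropy_cost + L2_regularization_cost

# Tiny made-up inputs: 1 output unit, m = 2 examples, all-ones weights.
A3 = np.array([[0.8, 0.3]])
Y = np.array([[1.0, 0.0]])
parameters = {
    "W1": np.ones((2, 2)),  # sum of squares = 4
    "W2": np.ones((2, 2)),  # sum of squares = 4
    "W3": np.ones((1, 2)),  # sum of squares = 2
}
lambd = 0.1
cost = compute_cost_with_regularization(A3, Y, parameters, lambd)
# L2 term alone = 0.1 / (2 * 2) * (4 + 4 + 2) = 0.25
```

Note that only the weights `W1..W3` are penalized; biases are left out of the sum, which matches the usual convention for L2 regularization.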