import numpy as np

def compute_cost_with_regularization(A3, Y, parameters, lambd):
    """
    Implement the cost function with L2 regularization. See formula (2) above.

    Arguments:
    A3 -- post-activation, output of forward propagation, of shape (output size, number of examples)
    Y -- "true" labels vector, of shape (output size, number of examples)
    parameters -- python dictionary containing the parameters of the model
    lambd -- regularization hyperparameter, scalar

    Returns:
    cost -- value of the regularized loss function (formula (2))
    """
    m = Y.shape[1]
    W1 = parameters["W1"]
    W2 = parameters["W2"]
    W3 = parameters["W3"]

    # compute_cost (plain cross-entropy) is assumed to be defined elsewhere in the notebook.
    cross_entropy_cost = compute_cost(A3, Y)  # This gives you the cross-entropy part of the cost

    ### START CODE HERE ### (approx. 1 line)
    L2_regularization_cost = (lambd / (2 * m)) * (np.sum(np.square(W1)) + np.sum(np.square(W2)) + np.sum(np.square(W3)))
    ### END CODE HERE ###

    cost = cross_entropy_cost + L2_regularization_cost
    return cost
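
The "formula (2)" referenced in the docstring is not reproduced in this paste; as a reconstruction in the usual notation for this exercise, the regularized cost being computed is:

J_regularized = -\frac{1}{m} \sum_{i=1}^{m} \left( y^{(i)} \log a^{[L](i)} + (1 - y^{(i)}) \log\left(1 - a^{[L](i)}\right) \right) + \frac{\lambda}{2m} \sum_{l} \sum_{k} \sum_{j} \left( W_{k,j}^{[l]} \right)^2

The first term is the cross-entropy cost returned by compute_cost; the second is the L2 penalty over all weight matrices.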
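
A minimal usage sketch follows. The compute_cost helper below is only a stand-in for the one the original notebook provides, and the toy weight matrices, shapes, and lambd value are hypothetical:

def compute_cost(A3, Y):
    # Stand-in cross-entropy cost; the notebook supplies its own version.
    m = Y.shape[1]
    logprobs = np.multiply(-np.log(A3), Y) + np.multiply(-np.log(1 - A3), 1 - Y)
    return float(np.nansum(logprobs) / m)

# Toy parameters: three small weight matrices, since the function expects W1, W2, W3.
rng = np.random.default_rng(0)
parameters = {"W1": rng.standard_normal((4, 3)),
              "W2": rng.standard_normal((3, 4)),
              "W3": rng.standard_normal((1, 3))}
A3 = np.array([[0.9, 0.2, 0.8]])   # predicted probabilities, shape (1, m)
Y  = np.array([[1, 0, 1]])         # true labels, shape (1, m)

print(compute_cost_with_regularization(A3, Y, parameters, lambd=0.7))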