function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.
% Initialize some useful values
m = length(y);   % number of training examples
n = size(X, 2);  % number of features

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta.
% ==== Evaluating the cost ====
% First part: the unregularized cost, accumulated one example at a time
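% The quantity computed below is the standard regularized logistic
% regression cost:
%   J(theta) = -(1/m) * sum_{i=1..m} [ y(i)*log(h(i)) + (1 - y(i))*log(1 - h(i)) ]
%              + (lambda/(2*m)) * sum_{j=2..n} theta(j)^2
% where h(i) = sigmoid(theta' * x(i)) and the bias term theta(1) is not penalized.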
for i = 1:m
  hypothesis = sigmoid(theta' * X(i, :)');  % h_theta(x^(i)), a scalar
  J = J + y(i) * log(hypothesis) + (1 - y(i)) * log(1 - hypothesis);
end
J = -J / m;
% Regularization: penalize the parameters, skipping the bias term theta(1)
secondPart = 0;
for j = 2:n
  secondPart = secondPart + theta(j) ^ 2;
end
secondPart = secondPart * lambda / (2 * m);
J = J + secondPart;
% ==== Evaluating the gradient ====
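% The partial derivatives computed below are:
%   grad(1) = (1/m) * sum_i (h(i) - y(i)) * X(i, 1)                        (bias, unregularized)
%   grad(j) = (1/m) * sum_i (h(i) - y(i)) * X(i, j) + (lambda/m)*theta(j)  for j >= 2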
% The bias gradient grad(1) gets no regularization term
for i = 1:m
  hypothesis = sigmoid(theta' * X(i, :)');
  grad(1) = grad(1) + (hypothesis - y(i)) * X(i, 1);
end
grad(1) = grad(1) / m;

% The remaining components are regularized with (lambda/m) * theta(j)
for j = 2:n
  for i = 1:m
    hypothesis = sigmoid(theta' * X(i, :)');
    grad(j) = grad(j) + (hypothesis - y(i)) * X(i, j);
  end
  grad(j) = grad(j) / m;
  grad(j) = grad(j) + (lambda * theta(j)) / m;
end
% =============================================================
end
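
% ---- Vectorized alternative ----
% The loops above can be collapsed into matrix operations. The subfunction
% below is a minimal sketch of that idea (the name costFunctionRegVectorized
% is made up here, and it assumes the same sigmoid helper used above is on
% the path); it is not part of the original assignment code.
function [J, grad] = costFunctionRegVectorized(theta, X, y, lambda)
m = length(y);                        % number of training examples
h = sigmoid(X * theta);               % m x 1 vector of hypotheses
reg = (lambda / (2 * m)) * sum(theta(2:end) .^ 2);   % skip the bias term
J = -(y' * log(h) + (1 - y)' * log(1 - h)) / m + reg;
grad = (X' * (h - y)) / m;            % unregularized gradient for every j
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end

% Example call with toy data (values chosen only to illustrate the shapes):
%   X = [1 1 2; 1 3 4; 1 5 6];   % 3 examples, intercept column first
%   y = [0; 1; 1];
%   theta = zeros(3, 1);
%   [J, grad] = costFunctionReg(theta, X, y, 1);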