function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
% GRADIENTDESCENT Batch gradient descent for linear regression.
%   Runs num_iters update steps with learning rate alpha and returns the
%   learned theta along with the cost after each step in J_history.
m = length(y);                  % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    error = X * theta - y;                       % residuals (m x 1), recomputed every iteration
    theta = theta - (alpha / m) * (X' * error);  % simultaneous update of all parameters
    J_history(iter) = sum((X * theta - y) .^ 2) / (2 * m);  % squared-error cost after the update
end
end
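The same vectorized update can be sketched in NumPy as a quick sanity check. This is an illustrative equivalent, not part of the original paste; the toy dataset (y = 2x with an intercept column) and the names gradient_descent, J_history are assumptions for the example:

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent for linear regression (mirrors the Octave code)."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        error = X @ theta - y                        # residuals, shape (m,)
        theta = theta - (alpha / m) * (X.T @ error)  # simultaneous parameter update
        resid = X @ theta - y
        J_history[it] = (resid @ resid) / (2 * m)    # squared-error cost after the update
    return theta, J_history

# Toy data: y = 2x, with a column of ones for the intercept term
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
theta, J_hist = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
```

With a small enough learning rate the cost in J_hist should decrease monotonically and theta should approach [0, 2], which is a cheap way to verify the update rule before running it on real data.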