
Untitled

a guest | Oct 30th, 2014
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
% GRADIENTDESCENT performs batch gradient descent to learn theta.

m = length(y);                     % number of training examples
n = length(theta);                 % number of features
J_history = zeros(num_iters, 1);   % cost recorded at each iteration

for iter = 1:num_iters
    % Scaled prediction error, recomputed every iteration;
    % note that (theta' * X')' is just X * theta.
    error = (X * theta - y) * (alpha / m);

    descent = zeros(size(theta));  % fresh update vector each iteration
    for i = 1:n
        descent(i) = sum(error .* X(:, i));  % gradient term for feature i
    end

    theta = theta - descent;       % simultaneous update of all parameters

    % Record the cost (assumes a computeCost(X, y, theta) helper is available).
    J_history(iter) = computeCost(X, y, theta);
end

end
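
For reference, a minimal usage sketch with made-up data (the column of ones supplies the intercept term; the learning rate and iteration count are arbitrary illustration values, not from the paste, and J_history relies on the computeCost helper assumed above):

X = [ones(5, 1), (1:5)'];   % 5 training examples: intercept column plus one feature
y = 2 + 3 * (1:5)';         % targets generated from y = 2 + 3x
theta = zeros(2, 1);        % initial parameters
[theta, J_history] = gradientDescent(X, y, theta, 0.01, 1500);
% theta moves toward [2; 3] and J_history should decrease over iterations.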