function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    % Reset the gradient accumulator each iteration; initializing it once
    % outside the loop would let old gradients leak into every later step.
    grad = zeros(size(theta));
    for i = 1:m
        grad = grad + (theta'*X(i,:)' - y(i)) * X(i,:)';
    end
    theta = theta - (alpha/m)*grad;

    % Equivalent vectorized form:
    %grad = X' * (X*theta - y);
    %theta = theta - (alpha/m) * grad;

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end
end
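The function relies on computeCost, which is not included in this paste. Below is a minimal sketch of it, assuming the standard squared-error cost that pairs with this update rule, plus a small usage example; the synthetic data, alpha = 0.01, and num_iters = 1500 are illustrative assumptions rather than values from the original code.

% computeCost.m -- assumed implementation, not from the original paste
function J = computeCost(X, y, theta)
%COMPUTECOST Squared-error cost for linear regression
m = length(y);                      % number of training examples
J = sum((X*theta - y).^2) / (2*m);  % average halved squared residual
end

% Illustrative usage on tiny synthetic data (y is roughly 1 + 2*x)
x = [1; 2; 3; 4; 5];
X = [ones(5, 1), x];                % prepend a column of ones for the intercept
y = [3.1; 4.9; 7.2; 9.1; 10.8];
theta = zeros(2, 1);
[theta, J_history] = gradientDescent(X, y, theta, 0.01, 1500);
plot(J_history);                    % cost should decrease monotonically

If J_history ever increases across iterations, the learning rate alpha is likely too large for the data.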