function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    % Prediction error of the current hypothesis h(x) = X * theta.
    % This must be recomputed on every iteration with the current theta;
    % computing it once before the loop would reuse the initial error
    % on every step and the parameters would never improve.
    deviation = X * theta - y;   % m x 1 vector of errors

    % Simultaneous update of all parameters:
    % theta_j := theta_j - (alpha / m) * sum(deviation .* X(:, j))
    for i = 1:length(theta)
        theta(i) = theta(i) - (alpha / m) * sum(deviation .* X(:, i));
    end

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
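
The function above calls computeCost, which lives in a separate file in the same exercise. A minimal sketch, assuming the usual squared-error cost J(theta) = 1/(2*m) * sum((X*theta - y).^2), saved as computeCost.m:

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the (halved) mean squared error
%   of using theta to fit the data points in X and y.
m = length(y);
J = sum((X * theta - y) .^ 2) / (2 * m);
end

A usage sketch with made-up data (the feature values, learning rate, and iteration count are illustrative, not from the original paste); it assumes gradientDescent.m and computeCost.m are both on the Octave/MATLAB path:

% Toy data roughly following y = 1 + 2*x, with an intercept column in X.
x = (1:10)';
y = 1 + 2 * x + 0.1 * randn(10, 1);
X = [ones(10, 1), x];

theta0    = zeros(2, 1);   % initial parameters
alpha     = 0.02;          % learning rate
num_iters = 5000;          % gradient steps

[theta, J_history] = gradientDescent(X, y, theta0, alpha, num_iters);

fprintf('Learned theta: %.3f  %.3f\n', theta(1), theta(2));   % should approach 1 and 2
plot(1:num_iters, J_history);   % cost should decrease monotonically
xlabel('Iteration'); ylabel('Cost J');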