Untitled

a guest
Feb 21st, 2020
import numpy as np

def linearRegression_Loss(X, t):
    # Max iterations, or step count, for gradient descent
    maxIters = 100
    # Step size, or learning rate
    epsilon = .01

    # Initialize beta as an array with one entry per feature of X, all weights zero
    beta = np.zeros(X.shape[1])

    print("Features length: " + str(X.shape[1]))
    print("Weights length: " + str(len(beta)))

    return beta
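The function above sets up maxIters and epsilon but never actually runs the gradient-descent updates before returning beta. A minimal sketch of the completed loop, assuming a mean-squared-error loss and the synthetic target weights [3.0, 1.0] chosen here purely for illustration:

```python
import numpy as np

def linearRegression_Loss(X, t, maxIters=100, epsilon=.01):
    # beta starts as a zero vector, one weight per feature of X
    beta = np.zeros(X.shape[1])
    for _ in range(maxIters):
        # Gradient of the mean squared error: (2/n) * X^T (X beta - t)
        grad = 2.0 / X.shape[0] * X.T @ (X @ beta - t)
        # Step against the gradient, scaled by the learning rate
        beta -= epsilon * grad
    return beta

# Quick check on noise-free synthetic data with true weights [3.0, 1.0]
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
t = X @ np.array([3.0, 1.0])
beta = linearRegression_Loss(X, t, maxIters=500, epsilon=0.05)
print(beta)
```

With noise-free data the loop recovers the true weights closely; on real data you would expect beta to approach the least-squares solution instead.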