Untitled (Abracax, Oct 9th, 2019)
function STOCHASTIC GRADIENT DESCENT(L(), f(), x, y) returns θ
# where: L is the loss function
#        f is a function parameterized by θ
#        x is the set of training inputs x(1), x(2), ..., x(n)
#        y is the set of training outputs (labels) y(1), y(2), ..., y(n)
θ ← 0
repeat T times
    For each training tuple (x(i), y(i)) (in random order)
        Compute ŷ(i) = f(x(i); θ)              # What is our estimated output ŷ?
        Compute the loss L(ŷ(i), y(i))         # How far off is ŷ(i) from the true output y(i)?
        g ← ∇θ L(f(x(i); θ), y(i))             # How should we move θ to maximize loss?
        θ ← θ - η g                            # go the other way instead
return θ
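The pseudocode above is the generic SGD loop. Below is a minimal runnable sketch of that loop in Python/NumPy, assuming a linear model f(x; θ) = θ·x with squared loss purely for illustration; the choice of model, loss, analytic gradient, learning rate η, and epoch count T are assumptions of this sketch, not part of the original paste.

# Minimal SGD sketch matching the pseudocode above (illustrative assumptions noted in comments).
import numpy as np

def f(x_i, theta):
    # Assumed model parameterized by theta: a simple linear predictor theta . x.
    return np.dot(theta, x_i)

def loss(y_hat, y_i):
    # Assumed loss L: squared error between the prediction and the true label.
    return 0.5 * (y_hat - y_i) ** 2

def grad_loss(x_i, y_i, theta):
    # Gradient of L(f(x; theta), y) with respect to theta for the choices above:
    # d/dtheta [ 0.5 * (theta.x - y)^2 ] = (theta.x - y) * x
    return (f(x_i, theta) - y_i) * x_i

def stochastic_gradient_descent(x, y, eta=0.01, T=100, seed=0):
    # x: (n, d) array of training inputs; y: (n,) array of training labels.
    rng = np.random.default_rng(seed)
    n, d = x.shape
    theta = np.zeros(d)                        # theta <- 0
    for _ in range(T):                         # repeat T times
        for i in rng.permutation(n):           # each tuple (x(i), y(i)) in random order
            g = grad_loss(x[i], y[i], theta)   # g <- grad_theta L(f(x(i); theta), y(i))
            theta = theta - eta * g            # go the other way instead
    return theta

# Usage: recover the weights of a noiseless linear target.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    true_theta = np.array([2.0, -1.0, 0.5])
    Y = X @ true_theta
    print(stochastic_gradient_descent(X, Y, eta=0.05, T=50))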