Jesorx

Neural network activations

Apr 9th, 2020
import numpy


def sigmoid(x):
    # Logistic sigmoid: maps inputs to the range (0, 1).
    return 1 / (1 + numpy.exp(-x))


def tanh(x):
    # Hyperbolic tangent: maps inputs to the range (-1, 1).
    return numpy.tanh(x)


def relu(x):
    # Rectified linear unit: zero for negative inputs, identity for positive.
    return numpy.maximum(0, x)


def sigmoid_derivative(x):
    # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
    sig = sigmoid(x)
    return sig * (1 - sig)


def tanh_derivative(x):
    # d/dx tanh(x) = 1 - tanh(x)**2
    tan = tanh(x)
    return 1 - tan ** 2


def relu_derivative(x):
    # d/dx relu(x) = 1 where x > 0, else 0 (taking 0 at x = 0 by convention).
    y = numpy.zeros_like(x)
    y[x > 0] = 1
    return y
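
A minimal usage sketch, assuming the functions above are in scope and that inputs are NumPy float arrays: it checks each analytic derivative against a central finite-difference estimate (the sample points avoid 0, where relu is not differentiable). This is illustrative only and not part of the original paste.

import numpy

x = numpy.array([-2.0, -0.5, 0.5, 2.0])
eps = 1e-6

# Compare each analytic derivative with a central finite-difference estimate.
for fn, dfn in [(sigmoid, sigmoid_derivative),
                (tanh, tanh_derivative),
                (relu, relu_derivative)]:
    analytic = dfn(x)
    numeric = (fn(x + eps) - fn(x - eps)) / (2 * eps)
    print(fn.__name__, numpy.allclose(analytic, numeric, atol=1e-4))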