Untitled

a guest
Oct 16th, 2019
def feedForward(self, training_example):
    # The values calculated during forward propagation are stored as
    # attributes so they can be reused during backpropagation.
    # np refers to numpy, assumed to be imported as np at module level.

    # Extract the activations for the first layer from the training_example list
    self.a1 = np.array((training_example[0][0], training_example[0][1]))

    # Calculate the weighted inputs and activations for all other layers in the network
    self.z2 = np.dot(self.W1, self.a1.reshape(-1, 1))
    self.a2 = self.sigmoid(self.z2)
    self.z3 = np.dot(self.W2, self.a2)
    self.a3 = self.sigmoid(self.z3)

    # Return the activation of the neuron in the output layer
    return self.a3
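
The method above only runs inside a class that already defines the weight matrices W1 and W2 and a sigmoid helper, none of which appear in this paste. The sketch below is a minimal assumed wrapper (the class name, layer sizes, weight initialisation, and the ((x1, x2), label) layout of training_example are all guesses for illustration) showing how feedForward could be exercised end to end.

import numpy as np

class TinyNetwork:
    # Hypothetical context for the pasted method: a 2-input, 3-hidden-unit,
    # 1-output network. Sizes and initialisation are assumptions, not from the paste.
    def __init__(self, hidden_size=3):
        rng = np.random.default_rng(0)
        self.W1 = rng.standard_normal((hidden_size, 2))  # input -> hidden weights
        self.W2 = rng.standard_normal((1, hidden_size))  # hidden -> output weights

    def sigmoid(self, z):
        # Element-wise logistic activation
        return 1.0 / (1.0 + np.exp(-z))

    def feedForward(self, training_example):
        # Body identical to the pasted method above
        self.a1 = np.array((training_example[0][0], training_example[0][1]))
        self.z2 = np.dot(self.W1, self.a1.reshape(-1, 1))
        self.a2 = self.sigmoid(self.z2)
        self.z3 = np.dot(self.W2, self.a2)
        self.a3 = self.sigmoid(self.z3)
        return self.a3

# Assumed training_example layout: ((x1, x2), label)
example = ((0.5, -1.0), 1)
print(TinyNetwork().feedForward(example))  # 1x1 array holding the output activation

Because a1, a2, a3 and z2, z3 are stored on self, a later backpropagation pass can reuse them without recomputing the forward pass, which is what the original comment in the method refers to.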