Just your regular densely-connected NN layer.

Dense implements the operation: output = activation(dot(input, kernel) + bias), where activation is the element-wise activation function passed as the activation argument, kernel is a weights matrix created by the layer, and bias is a bias vector created by the layer (only applicable if use_bias is True).

Note: if the input to the layer has a rank greater than 2, then it is flattened prior to the initial dot product with kernel.
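
A minimal sketch of that computation, assuming TensorFlow 2.x Keras; the variable names (x, dense, y_keras, y_manual), the shapes, and the relu activation are illustrative choices, not part of the original text:

    import numpy as np
    import tensorflow as tf

    # Batch of 4 inputs with 3 features each (illustrative shape).
    x = np.random.rand(4, 3).astype("float32")

    # Dense layer with 2 units; builds kernel of shape (3, 2) and bias of shape (2,) on first call.
    dense = tf.keras.layers.Dense(units=2, activation="relu", use_bias=True)
    y_keras = dense(x)

    # Reproduce the same result by hand: relu(dot(input, kernel) + bias).
    kernel, bias = dense.get_weights()
    y_manual = np.maximum(np.dot(x, kernel) + bias, 0.0)

    print(np.allclose(y_keras.numpy(), y_manual))  # expected: True

The manual path is only a check on the formula above; in practice the layer itself handles weight creation, dtype handling, and the rank-greater-than-2 case described in the note.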