a guest
Mar 24th, 2019
import tensorflow as tf

def build_mlp(input_placeholder, output_size, scope, n_layers, size,
              activation=tf.tanh, output_activation=None):
    """
    Builds a feedforward neural network.

    arguments:
        input_placeholder: placeholder variable for the state (batch_size, input_size)
        output_size: size of the output layer
        scope: variable scope of the network
        n_layers: number of hidden layers
        size: dimension of the hidden layers
        activation: activation of the hidden layers
        output_activation: activation of the output layer
    returns:
        output placeholder of the network (the result of a forward pass)
    Hint: use tf.layers.dense
    """
    # Alternative using tf.contrib.layers:
    # layers = tf_layers.stack(input_placeholder, tf_layers.fully_connected,
    #                          [size] * n_layers, scope='fc', activation_fn=activation)
    # output_placeholder = tf_layers.fully_connected(layers, output_size,
    #                                                activation_fn=output_activation)

    with tf.variable_scope(scope):
        out = input_placeholder
        # n_layers hidden layers of width `size`, then a linear output layer.
        for _ in range(n_layers):
            out = tf.layers.dense(inputs=out, units=size, activation=activation)
        out = tf.layers.dense(inputs=out, units=output_size, activation=output_activation)
    return out
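For reference, the forward pass that this TensorFlow graph defines can be sketched in plain NumPy. This is a minimal illustration, not the homework's code: the weight initialization, layer sizes, and helper name `mlp_forward` are all made up for the example; the structure (n_layers hidden tanh layers, then a linear output layer) mirrors the function above.

```python
import numpy as np

def mlp_forward(x, weights, biases, activation=np.tanh, output_activation=None):
    # Hidden layers: affine transform followed by the activation.
    out = x
    for W, b in zip(weights[:-1], biases[:-1]):
        out = activation(out @ W + b)
    # Output layer: affine transform with an optional output activation.
    out = out @ weights[-1] + biases[-1]
    if output_activation is not None:
        out = output_activation(out)
    return out

rng = np.random.default_rng(0)
# Illustrative shapes: input size 4, two hidden layers of size 64, output size 2.
sizes = [4, 64, 64, 2]
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

batch = rng.standard_normal((8, 4))
print(mlp_forward(batch, weights, biases).shape)  # (8, 2)
```

With `output_activation=None` the last layer stays linear, which matches the default in `build_mlp` (useful when the output feeds a loss like softmax cross-entropy that applies its own nonlinearity).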