import numpy as np
import tensorflow as tf

sess = tf.Session()

# Logits and a one-hot target vector.
in_vec = tf.constant(np.array([1, 2, 3], dtype='float32'))
one_hot = tf.constant(np.array([0, 1, 0], dtype='float32'))

# Gradient of the softmax cross-entropy loss with respect to the logits.
in_grad = tf.gradients(
    tf.nn.softmax_cross_entropy_with_logits(labels=one_hot, logits=in_vec),
    in_vec)[0]

# Both prints give the same values: the gradient equals softmax(logits) - labels.
print(sess.run(in_grad))
print(sess.run(tf.nn.softmax(in_vec) - one_hot))
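
# --- Optional, TensorFlow-free sketch (assumption: added here for illustration,
# not part of the original paste). It checks the same identity by hand:
# for L = -sum(y * log(softmax(x))), the analytic gradient dL/dx is
# softmax(x) - y, which we compare against a central finite-difference estimate.
# (numpy is already imported above as np.)

x = np.array([1.0, 2.0, 3.0], dtype=np.float32)   # logits
y = np.array([0.0, 1.0, 0.0], dtype=np.float32)   # one-hot labels

def softmax(v):
    e = np.exp(v - v.max())          # shift for numerical stability
    return e / e.sum()

analytic = softmax(x) - y            # closed-form gradient

eps = 1e-3
numeric = np.zeros_like(x)
for i in range(len(x)):
    xp, xm = x.copy(), x.copy()
    xp[i] += eps
    xm[i] -= eps
    loss_p = -np.sum(y * np.log(softmax(xp)))
    loss_m = -np.sum(y * np.log(softmax(xm)))
    numeric[i] = (loss_p - loss_m) / (2 * eps)

print(analytic)   # roughly [ 0.090 -0.755  0.665], matching the TF gradient above
print(numeric)    # finite-difference estimate, close to the analytic gradient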