
Untitled

a guest
Jun 25th, 2019
>>> import torch
>>> import numpy as np
>>> x = torch.tensor([1.0, np.NaN])
>>> y = torch.tensor([2.0, 3.0])
>>> z = torch.div(y, x)
>>> z
tensor([2., nan])
>>> z.backward()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.6/dist-packages/torch/tensor.py", line 107, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/usr/local/lib/python3.6/dist-packages/torch/autograd/__init__.py", line 93, in backward
    allow_unreachable=True)  # allow_unreachable flag
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
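The RuntimeError comes from autograd, not from the NaN: neither x nor y was created with requires_grad=True, so z has no grad_fn for backward() to differentiate through. Below is a minimal sketch of a variant where the backward call succeeds, assuming the intent is simply to backpropagate through the division; the sum() reduction is added because backward() on a non-scalar tensor otherwise needs an explicit gradient argument.

import torch

# Mark the inputs as leaf tensors that require gradients, so the division is
# recorded by autograd and z gets a grad_fn.
x = torch.tensor([1.0, float('nan')], requires_grad=True)
y = torch.tensor([2.0, 3.0], requires_grad=True)
z = torch.div(y, x)

# Reduce to a scalar before calling backward().
z.sum().backward()

print(x.grad)  # d(z.sum())/dx = -y / x**2 -> tensor([-2., nan])
print(y.grad)  # d(z.sum())/dy =  1 / x    -> tensor([1., nan])

Note that the NaN still propagates into the gradients; only the missing requires_grad flag is what triggers the error in the original transcript.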