>>> x = torch.tensor([1.0, np.nan])
>>> y = torch.tensor([2.0, 3.0])
>>> z = torch.div(y, x)
>>> z
tensor([2., nan])
>>> z.backward()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.6/dist-packages/torch/tensor.py", line 107, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/usr/local/lib/python3.6/dist-packages/torch/autograd/__init__.py", line 93, in backward
    allow_unreachable=True)  # allow_unreachable flag
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
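The RuntimeError is raised because neither input tensor was created with requires_grad=True, so z has no grad_fn to backpropagate through; in addition, backward() on a non-scalar tensor needs an explicit gradient argument. Below is a minimal sketch (not part of the original paste) of a variant where the call succeeds, illustrating that the NaN in x then propagates into the computed gradient.

import numpy as np
import torch

# Mark x as requiring gradients so z gets a grad_fn.
x = torch.tensor([1.0, np.nan], requires_grad=True)
y = torch.tensor([2.0, 3.0])
z = torch.div(y, x)

# z is non-scalar, so pass a gradient of the same shape
# (alternatively, reduce first: z.sum().backward()).
z.backward(torch.ones_like(z))

# dz/dx = -y / x**2, so the NaN propagates: tensor([-2., nan])
print(x.grad)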