Maroxtn

Untitled

Jul 12th, 2021
What is the equivalent of extending autograd functionality in MindSpore? And what is the equivalent module to autograd.Function in MindSpore? Thank you.
import math
import torch

class adder(torch.autograd.Function):
    @staticmethod
    def forward(ctx, W_col, X_col):
        # Save inputs for the backward pass.
        ctx.save_for_backward(W_col, X_col)
        # AdderNet-style output: negative L1 distance between filters and patches.
        output = -(W_col.unsqueeze(2) - X_col.unsqueeze(0)).abs().sum(1)
        return output

    @staticmethod
    def backward(ctx, grad_output):
        W_col, X_col = ctx.saved_tensors
        # Full-precision gradient for the weights, rescaled by its L2 norm.
        grad_W_col = ((X_col.unsqueeze(0) - W_col.unsqueeze(2)) * grad_output.unsqueeze(1)).sum(2)
        grad_W_col = grad_W_col / grad_W_col.norm(p=2).clamp(min=1e-12) * math.sqrt(W_col.size(1) * W_col.size(0)) / 5
        # Clipped (HardTanh-style) gradient for the inputs.
        grad_X_col = (-(X_col.unsqueeze(0) - W_col.unsqueeze(2)).clamp(-1, 1) * grad_output.unsqueeze(1)).sum(0)
        return grad_W_col, grad_X_col