Untitled — a guest, Jun 24th, 2019
# In the model's forward(): flatten the (batch, seq_len, hidden_dim)
# LSTM output into a 2-D tensor, one row per timestep.
lstm_out = lstm_out.contiguous().view(-1, self.hidden_dim)

# Worked example of what that reshape does:
lstm_out, hidden = self.lstm(embeds, hidden)
print(lstm_out)  # imagine a sample output like [1, 0, 2, 0]
                 # forward out | backward out

stacked = lstm_out.contiguous().view(-1, hidden_dim)  # hidden_dim = 2

print(stacked)  # torch.Tensor([[1, 0],
                #               [2, 0]])
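The reshape above can be checked on a small hand-built tensor. A minimal sketch, assuming `hidden_dim = 2` and a fake one-batch, two-timestep output standing in for the paste's sample values (this is not the author's actual model, just the tensor manipulation in isolation):

```python
import torch

hidden_dim = 2

# Fake "lstm_out" of shape (batch=1, seq_len=2, hidden_dim=2),
# mimicking the sample values [1, 0, 2, 0] from the paste.
lstm_out = torch.tensor([[[1., 0.],
                          [2., 0.]]])

# Collapse batch and time into one axis so each row is one timestep's
# hidden vector; .contiguous() guarantees a memory layout that .view()
# can reinterpret without copying element order.
stacked = lstm_out.contiguous().view(-1, hidden_dim)

print(stacked)
# tensor([[1., 0.],
#         [2., 0.]])
```

The same flattening is commonly written `lstm_out.reshape(-1, hidden_dim)`, which handles the non-contiguous case automatically.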