# The line in question, from the model's forward():
lstm_out = lstm_out.contiguous().view(-1, self.hidden_dim)

# Walkthrough with a toy example:
lstm_out, hidden = self.lstm(embeds, hidden)
print(lstm_out)  # imagine a sample output like [[1, 0, 2, 0]]
                 #  forward out | backward out
stacked = lstm_out.contiguous().view(-1, hidden_dim)  # hidden_dim = 2
print(stacked)   # tensor([[1, 0],
                 #         [2, 0]])
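The reshape above can be sketched as a self-contained snippet (shapes and values are assumed for illustration; `lstm_out` here is a hand-built tensor standing in for the LSTM output, where the forward and backward features sit concatenated along the last axis):

```python
import torch

hidden_dim = 2

# Toy stand-in for one timestep of bidirectional LSTM output:
# forward out = [1, 0], backward out = [2, 0], concatenated into one row.
lstm_out = torch.tensor([[1.0, 0.0, 2.0, 0.0]])

# .contiguous() guarantees a contiguous memory layout so .view() can
# reinterpret the data; the -1 lets PyTorch infer the row count.
stacked = lstm_out.contiguous().view(-1, hidden_dim)
print(stacked)
# tensor([[1., 0.],
#         [2., 0.]])
```

Note that `.view()` does not copy data: it reinterprets the same contiguous buffer, so the forward and backward halves of each row simply become separate rows.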