Untitled

Pasted by a guest on Aug 14th, 2019
def forward(self, batch):
    # batch - dict('tokens': tokens[batch_size, seq_len, embedding_dim],
    #              'head': head[batch_size, seq_len, char_pad_len],
    #              'tail': tail[batch_size, seq_len, char_pad_len]
    #             )
    tokens, head, tail = batch['tokens'], batch['head'], batch['tail']
    ...

def training_step(self, data_batch, batch_nb):
    texts, labels = data_batch
    texts['tokens'] = self.token_vectorizer.vectorize(texts['tokens'])      # [batch_size, seq_len, embedding_dim]
    texts['head'] = self.char_vectorizer.vectorize_batches(texts['head'])   # [batch_size, seq_len, char_pad_len, char_embedding_dim]
    texts['tail'] = self.char_vectorizer.vectorize_batches(texts['tail'])   # [batch_size, seq_len, char_pad_len, char_embedding_dim]
    scores = self.forward(texts)
    labels = labels.contiguous().view(-1)   # flatten to [batch_size * seq_len] for the loss
    loss = self.my_loss(scores, labels)
    return {'loss': loss}
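The `labels.contiguous().view(-1)` line suggests a per-token classification setup: the loss expects flat `[N, C]` scores against flat `[N]` targets, so both tensors are collapsed over the batch and sequence dimensions. A minimal sketch of that flattening, using random stand-in tensors in place of `self.forward(texts)` and `nn.CrossEntropyLoss` as an assumed stand-in for `self.my_loss`:

```python
import torch
import torch.nn as nn

batch_size, seq_len, num_classes = 4, 10, 5

# Stand-ins for the model output and the gold labels.
scores = torch.randn(batch_size, seq_len, num_classes)          # per-token class scores
labels = torch.randint(0, num_classes, (batch_size, seq_len))   # per-token class ids

# Flatten so CrossEntropyLoss sees [N, C] scores vs. [N] targets.
flat_scores = scores.view(-1, num_classes)   # [batch_size * seq_len, num_classes]
flat_labels = labels.contiguous().view(-1)   # [batch_size * seq_len]

loss = nn.CrossEntropyLoss()(flat_scores, flat_labels)
```

If `scores` is left as `[batch_size, seq_len, num_classes]` while the labels are flattened, the shapes no longer line up, so `forward` presumably returns already-flattened scores (hidden behind the `...`).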