Untitled

a guest
Apr 23rd, 2019
import torch


def prepare_sequence(seq, to_ix):
    # Map each token in the sequence to its integer index.
    idxs = [to_ix[w] for w in seq]
    return torch.tensor(idxs, dtype=torch.long)


# First 900 examples as the training split; `dataset` is assumed to be
# a list of (sentence, tags) pairs defined elsewhere.
training_data = dataset[0:900]

# Build the word vocabulary over the whole dataset, assigning each new
# word the next free index.
word_to_ix = {}
for sent, tags in dataset:
    for word in sent:
        if word not in word_to_ix:
            word_to_ix[word] = len(word_to_ix)

# Map each distinct tag to an index; `unique_tags` is assumed to be
# defined elsewhere.
tag_to_ix = {}
for ix, tag in enumerate(unique_tags):
    tag_to_ix[tag] = ix


EMBEDDING_DIM = 64
HIDDEN_DIM = 64
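As a quick sanity check, `prepare_sequence` can be exercised on a toy vocabulary built the same way as `word_to_ix` above. The sentence and indices here are illustrative only, not part of the original paste:

```python
import torch


def prepare_sequence(seq, to_ix):
    # Map each token in the sequence to its integer index.
    idxs = [to_ix[w] for w in seq]
    return torch.tensor(idxs, dtype=torch.long)


# Toy vocabulary: each new word gets the next free index, exactly as in
# the word_to_ix loop above.
toy_ix = {}
for word in ["the", "dog", "barks"]:
    if word not in toy_ix:
        toy_ix[word] = len(toy_ix)

print(prepare_sequence(["the", "dog", "barks"], toy_ix))
# tensor([0, 1, 2])
```

The resulting `LongTensor` is what an `nn.Embedding` layer expects as input indices.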