# Load pretrained word vectors into an nn.Embedding layer.
import torch.nn as nn

# The second argument (300) is the dimensionality of the embedding vectors.
# padding_idx=1: the <PAD> token's index is 1; its embedding's gradient is kept at zero.
# max_norm=1: if given, embeddings are renormalized so their norm never exceeds this value.
embed = nn.Embedding(len(TEXT.vocab), 300, padding_idx=1, max_norm=1)

# Copy the pretrained word vectors from the torchtext vocab into the embedding weights.
embed.weight.data.copy_(TEXT.vocab.vectors)
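A minimal self-contained sketch of the same pattern, with a small made-up "pretrained" matrix standing in for `TEXT.vocab.vectors` (the vocab size, dimension, and token indices here are illustrative, not from the original):

```python
import torch
import torch.nn as nn

vocab_size, embed_dim, pad_idx = 5, 4, 1

# Stand-in for pretrained vectors (e.g. GloVe loaded via torchtext).
pretrained = torch.randn(vocab_size, embed_dim)

embed = nn.Embedding(vocab_size, embed_dim, padding_idx=pad_idx, max_norm=1.0)
embed.weight.data.copy_(pretrained)

# Look up a batch of token ids; index 1 is the <PAD> token.
tokens = torch.tensor([[0, 2, 1, 1]])
out = embed(tokens)
print(out.shape)  # torch.Size([1, 4, 4])
```

Note that with `max_norm` set, each lookup renormalizes the accessed embedding rows in place so their norm does not exceed the limit.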