Untitled

pasted by a guest, Jun 18th, 2019
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim  # used when training, e.g. optim.SGD(model.parameters(), ...)

class CBOW(nn.Module):
    """Continuous bag-of-words model: predicts a target word from its context words."""

    def __init__(self, vocab_size: int, embedding_dim: int = 100, context_size: int = 4):
        super(CBOW, self).__init__()
        self.embeddings = nn.Embedding(vocab_size, embedding_dim)
        # Context embeddings are concatenated, hence context_size * embedding_dim inputs.
        self.linear1 = nn.Linear(context_size * embedding_dim, 128)
        self.linear2 = nn.Linear(128, vocab_size)

    def forward(self, inputs):
        # inputs: LongTensor of context_size word indices (a single example).
        # view((1, -1)) flattens the context embeddings into one row, so this
        # forward pass handles exactly one example at a time (batch size 1).
        embeds = self.embeddings(inputs).view((1, -1))
        out = F.relu(self.linear1(embeds))
        out = self.linear2(out)
        log_probs = F.log_softmax(out, dim=1)  # shape (1, vocab_size)
        return log_probs
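A minimal training sketch for the class above, pairing its log-probability output with `nn.NLLLoss`. The toy corpus, the `word_to_ix` mapping, and the SGD settings are illustrative choices, not part of the original paste; the class is repeated so the snippet runs standalone.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

class CBOW(nn.Module):
    def __init__(self, vocab_size: int, embedding_dim: int = 100, context_size: int = 4):
        super(CBOW, self).__init__()
        self.embeddings = nn.Embedding(vocab_size, embedding_dim)
        self.linear1 = nn.Linear(context_size * embedding_dim, 128)
        self.linear2 = nn.Linear(128, vocab_size)

    def forward(self, inputs):
        embeds = self.embeddings(inputs).view((1, -1))
        out = F.relu(self.linear1(embeds))
        return F.log_softmax(self.linear2(out), dim=1)

# Toy corpus; with context_size=4 we take two words on each side of the target.
raw_text = "we are about to study the idea of a computational process".split()
vocab = sorted(set(raw_text))
word_to_ix = {w: i for i, w in enumerate(vocab)}

# Build (context, target) pairs from the corpus.
data = []
for i in range(2, len(raw_text) - 2):
    context = [raw_text[i - 2], raw_text[i - 1], raw_text[i + 1], raw_text[i + 2]]
    data.append((context, raw_text[i]))

model = CBOW(len(vocab))
loss_fn = nn.NLLLoss()  # expects log-probabilities, which forward() returns
optimizer = optim.SGD(model.parameters(), lr=0.01)

for epoch in range(10):
    total_loss = 0.0
    for context, target in data:
        ctx = torch.tensor([word_to_ix[w] for w in context], dtype=torch.long)
        tgt = torch.tensor([word_to_ix[target]], dtype=torch.long)
        optimizer.zero_grad()
        log_probs = model(ctx)          # shape (1, vocab_size)
        loss = loss_fn(log_probs, tgt)
        loss.backward()
        optimizer.step()
        total_loss += loss.item()
```

Because `forward` flattens to a single row, this loop feeds one example at a time; batching would require reshaping with the batch dimension preserved.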