
Untitled

a guest Jun 18th, 2019
# Imports and setup here
# ...
import numpy as np

def forward(image, label):
  # Implementation excluded
  # ...
  pass

def train(im, label, lr=.005):
  '''
  Completes a full training step on the given image and label.
  Returns the cross-entropy loss and accuracy.
  - im is a 2d numpy array
  - label is a digit
  - lr is the learning rate
  '''
  # Forward
  out, loss, acc = forward(im, label)

  # Calculate initial gradient: loss is cross-entropy, -ln(out[label]),
  # so its derivative w.r.t. out[label] is -1 / out[label].
  gradient = np.zeros(10)
  gradient[label] = -1 / out[label]

  # Backprop
  gradient = softmax.backprop(gradient, lr)
  # TODO: backprop MaxPool2 layer
  # TODO: backprop Conv3x3 layer

  return loss, acc

print('MNIST CNN initialized!')

# Train!
loss = 0
num_correct = 0
for i, (im, label) in enumerate(zip(train_images, train_labels)):
  # Print stats every 100 steps (steps 100, 200, ...).
  if i % 100 == 99:
    print(
      '[Step %d] Past 100 steps: Average Loss %.3f | Accuracy: %d%%' %
      (i + 1, loss / 100, num_correct)
    )
    loss = 0
    num_correct = 0

  l, acc = train(im, label)
  loss += l
  num_correct += acc
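As a hedged aside (not part of the original paste): the initial gradient in `train()` comes from the cross-entropy loss L = -ln(out[label]), whose derivative with respect to out[label] is -1 / out[label]. A minimal numeric sanity check of that formula, using a made-up softmax output and label:

```python
import numpy as np

# Hypothetical softmax output and true label (made-up values for illustration).
out = np.array([0.1, 0.7, 0.2])
label = 1

# Analytic initial gradient, as in train(): d/dp [-ln(p)] = -1 / p.
analytic = -1 / out[label]

# Central finite-difference approximation of the same derivative.
eps = 1e-6
numeric = (-np.log(out[label] + eps) - -np.log(out[label] - eps)) / (2 * eps)

print(abs(analytic - numeric) < 1e-5)  # True
```

The remaining nine entries of the gradient stay zero because the loss depends only on the probability assigned to the correct class.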