Untitled (pasted by a guest, Jan 22nd, 2018)
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD
from keras.initializers import RandomNormal


# Import data
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Input image dimensions
img_rows, img_cols = 28, 28

# Flatten each 28x28 image into a 784-element vector
x_train = x_train.reshape(x_train.shape[0], img_rows * img_cols)
x_test = x_test.reshape(x_test.shape[0], img_rows * img_cols)
input_shape = (img_rows * img_cols,)

# Scale pixel values from [0, 255] to [0, 1]
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# Convert class vectors to binary class matrices (one-hot encoding)
num_classes = 10
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)
print('y_train shape:', y_train.shape)

# Construct model
# 784 * 30 * 10
# Normal distribution for weights/biases
# Stochastic Gradient Descent optimizer
# Mean squared error loss (cost function)
model = Sequential()
layer1 = Dense(30,
               input_shape=input_shape,
               kernel_initializer=RandomNormal(stddev=1),
               bias_initializer=RandomNormal(stddev=1))
model.add(layer1)
layer2 = Dense(10,
               kernel_initializer=RandomNormal(stddev=1),
               bias_initializer=RandomNormal(stddev=1))
model.add(layer2)
print('Layer 1 input shape: ', layer1.input_shape)
print('Layer 1 output shape: ', layer1.output_shape)
print('Layer 2 input shape: ', layer2.input_shape)
print('Layer 2 output shape: ', layer2.output_shape)

model.summary()
model.compile(optimizer=SGD(lr=3.0),
              loss='mean_squared_error',
              metrics=['accuracy'])

# Train
model.fit(x_train,
          y_train,
          batch_size=10,
          epochs=30,
          verbose=2)

# Run on test data and output results
result = model.evaluate(x_test,
                        y_test,
                        verbose=1)
print('Test loss: ', result[0])
print('Test accuracy: ', result[1])

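The to_categorical call above turns each integer label into a one-hot row vector, which is what the 10-unit output layer and MSE loss are compared against. A minimal NumPy sketch of that conversion, independent of Keras (the one_hot helper is hypothetical, written here for illustration):

```python
import numpy as np

def one_hot(labels, num_classes):
    """One-hot encode integer labels: label k maps to a row with a 1 in column k."""
    encoded = np.zeros((len(labels), num_classes), dtype='float32')
    encoded[np.arange(len(labels)), labels] = 1.0
    return encoded

labels = np.array([5, 0, 4])   # the first three MNIST training labels
y = one_hot(labels, 10)
print(y.shape)                 # (3, 10)
print(y[0])                    # a 1.0 in column 5, zeros elsewhere
```

Each row sums to 1, so a perfect prediction would put all of the output mass on the correct class column.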
Using TensorFlow backend.
x_train shape: (60000, 784)
60000 train samples
10000 test samples
y_train shape: (60000, 10)
Layer 1 input shape:  (None, 784)
Layer 1 output shape:  (None, 30)
Layer 2 input shape:  (None, 30)
Layer 2 output shape:  (None, 10)
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_1 (Dense)              (None, 30)                23550
_________________________________________________________________
dense_2 (Dense)              (None, 10)                310
=================================================================
Total params: 23,860
Trainable params: 23,860
Non-trainable params: 0
_________________________________________________________________
Epoch 1/30
 - 7s - loss: nan - acc: 0.0987
Epoch 2/30
 - 7s - loss: nan - acc: 0.0987
...
Epoch 30/30
 - 6s - loss: nan - acc: 0.0987
10000/10000 [==============================] - 0s 22us/step
Test loss: nan
Test accuracy: 0.098
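The nan loss is consistent with the network diverging rather than learning: both Dense layers are linear (no activation argument), the weights are drawn from a unit normal, and each hidden unit sums 784 weighted inputs, so pre-activations come out with a standard deviation around sqrt(784 * E[x^2]), far outside the [0, 1] range of the one-hot targets. With MSE and a learning rate of 3.0, the updates then overflow to inf and the loss becomes nan. A small NumPy sketch of the scale blow-up, using the stddev=1 and 784->30 figures from the script (the uniform inputs are a stand-in for the scaled pixels):

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights as in the script: RandomNormal(stddev=1), 784 inputs -> 30 units
W = rng.normal(0.0, 1.0, size=(784, 30))
# A batch of inputs with values in [0, 1], like the rescaled MNIST pixels
X = rng.uniform(0.0, 1.0, size=(1000, 784))

Z = X @ W          # pre-activations of the 30 hidden units
print(Z.std())     # typically around sqrt(784 / 3), i.e. roughly 16
```

Outputs an order of magnitude larger than the targets, fed through a second unit-normal linear layer, give gradients large enough that a step size of 3.0 destroys the weights on the first few batches.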

# Fix attempt: give the output layer a sigmoid activation so its
# outputs stay bounded in (0, 1), matching the one-hot targets
layer2 = Dense(10,
               activation='sigmoid',
               kernel_initializer=RandomNormal(stddev=1),
               bias_initializer=RandomNormal(stddev=1))
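The revised layer squashes every output into (0, 1), so the MSE against a one-hot target stays finite however large the pre-activations get. A quick NumPy check of that squashing behavior, using illustrative pre-activation values on the scale the unit-normal initialization produces (not values from the actual run):

```python
import numpy as np

def sigmoid(z):
    """Logistic activation: maps any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative pre-activations, including the ~16 scale seen with stddev=1 init
z = np.array([-16.0, -4.0, 0.0, 4.0, 16.0])
a = sigmoid(z)
print(a)   # all values lie strictly between 0 and 1

# MSE against a one-hot target is finite even for the extreme inputs
target = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
print(np.mean((a - target) ** 2))
```

Note that sigmoid saturation still slows learning (its gradient vanishes for large |z|), which is why smaller-variance initializers are the usual companion to the activation fix.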