X_train shape: (44066, 8, 1)
X_test shape: (5441, 8, 1)
y_train shape: (44066, 8)
y_test shape: (5441, 8)
X_val shape: (4897, 8, 1)
y_val shape: (4897, 8)

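The preprocessing that produced these arrays is not included in the paste. A minimal sketch of how arrays with these shapes could be built, assuming raw features of shape (n_samples, 8) and integer class labels in the range 0-7; the function name prepare and the use of tensorflow.keras are illustrative assumptions, not the original code:

# Hypothetical preprocessing sketch, not part of the original paste.
import numpy as np
from tensorflow.keras.utils import to_categorical

def prepare(features, labels):
    # Add a trailing channel axis so Conv1D sees (timesteps=8, channels=1).
    X = features.reshape((features.shape[0], 8, 1)).astype("float32")
    # One-hot encode the 8 classes to match the 8-unit softmax output layer.
    y = to_categorical(labels, num_classes=8)
    return X, y
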
# Imports are not shown in the original paste; tensorflow.keras is assumed here.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, MaxPooling1D, Dropout, Flatten, Dense

# 1D CNN classifier: input of 8 timesteps x 1 channel, 8-way softmax output.
model = Sequential()
model.add(Conv1D(filters=64, kernel_size=3, activation='relu', input_shape=(8, 1)))
model.add(Conv1D(filters=64, kernel_size=3, activation='relu'))
model.add(Dropout(0.5))
model.add(MaxPooling1D(pool_size=2))
model.add(Flatten())
model.add(Dense(100, activation='relu'))
model.add(Dense(8, activation='softmax'))
model.summary()

model.compile(
    optimizer="adam",
    loss="categorical_crossentropy",
    metrics=['accuracy'])

model.fit(X_train, y_train, epochs=100, batch_size=32,
          validation_data=(X_test, y_test),
          callbacks=[TestCallback((X_val, y_val))])
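
The TestCallback passed to model.fit is not defined anywhere in the paste. A minimal sketch of what it plausibly does, assuming it evaluates the model on the held-out (X_val, y_val) pair after each epoch and prints the "Testing loss: ..., acc: ..." lines that appear in the log below:

# Hypothetical reconstruction; the original TestCallback definition is not in the paste.
from tensorflow.keras.callbacks import Callback

class TestCallback(Callback):
    def __init__(self, test_data):
        super().__init__()
        self.test_data = test_data

    def on_epoch_end(self, epoch, logs=None):
        x, y = self.test_data
        # Evaluate on the held-out set after every epoch and print loss/accuracy.
        loss, acc = self.model.evaluate(x, y, verbose=0)
        print('Testing loss: {}, acc: {}'.format(loss, acc))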

Epoch 1/100
44066/44066 [==============================] - 8s 193us/step - loss: 2.0616 - acc: 0.1394 - val_loss: 2.0586 - val_acc: 0.1378
Testing loss: 2.0589641586770715, acc: 0.1378394935674903

Epoch 2/100
44066/44066 [==============================] - 5s 113us/step - loss: 2.0556 - acc: 0.1486 - val_loss: 2.0523 - val_acc: 0.1617
Testing loss: 2.0537997951842533, acc: 0.15090871962425975

Epoch 3/100
44066/44066 [==============================] - 6s 140us/step - loss: 2.0502 - acc: 0.1595 - val_loss: 2.0506 - val_acc: 0.1608
Testing loss: 2.051545942513729, acc: 0.15805595262405556

Epoch 4/100
44066/44066 [==============================] - 5s 122us/step - loss: 2.0484 - acc: 0.1617 - val_loss: 2.0483 - val_acc: 0.1638
Testing loss: 2.049225082562022, acc: 0.15683071268123341

Epoch 5/100
44066/44066 [==============================] - 5s 109us/step - loss: 2.0471 - acc: 0.1607 - val_loss: 2.0468 - val_acc: 0.1638
Testing loss: 2.048356912879523, acc: 0.16173167245252196

Epoch 6/100
44066/44066 [==============================] - 5s 109us/step - loss: 2.0463 - acc: 0.1623 - val_loss: 2.0452 - val_acc: 0.1619
Testing loss: 2.04520957813668, acc: 0.16071063916683684

Epoch 7/100
44066/44066 [==============================] - 5s 109us/step - loss: 2.0452 - acc: 0.1656 - val_loss: 2.0473 - val_acc: 0.1616
Testing loss: 2.0473607019037283, acc: 0.16009801919542577

Epoch 8/100
44066/44066 [==============================] - 5s 108us/step - loss: 2.0453 - acc: 0.1631 - val_loss: 2.0451 - val_acc: 0.1632
Testing loss: 2.0451171192218265, acc: 0.15989381253828874

Epoch 9/100
44066/44066 [==============================] - 5s 108us/step - loss: 2.0435 - acc: 0.1681 - val_loss: 2.0433 - val_acc: 0.1671
Testing loss: 2.0452189264137792, acc: 0.15907698590974065

Epoch 10/100
44066/44066 [==============================] - 5s 108us/step - loss: 2.0433 - acc: 0.1656 - val_loss: 2.0422 - val_acc: 0.1660
Testing loss: 2.042981141339474, acc: 0.1613232591382479