Untitled

a guest
Nov 22nd, 2019
# Imports (added; the paste omits them — these names come from the Keras functional API)
from keras.models import Model
from keras.layers import (Input, Conv1D, BatchNormalization, Activation,
                          Dropout, GRU, Dense, TimeDistributed)

# GRADED FUNCTION: model

def model(input_shape):
    """
    Function creating the model's graph in Keras.

    Argument:
    input_shape -- shape of the model's input data (using Keras conventions)

    Returns:
    model -- Keras model instance
    """

    X_input = Input(shape=input_shape)

    ### START CODE HERE ###

    # Step 1: CONV layer (≈4 lines)
    X = Conv1D(196, 15, strides=4)(X_input)                       # CONV1D
    X = BatchNormalization(axis=-1, momentum=0.99, epsilon=0.001,
                           name="Batch1")(X)                      # batch normalization
    X = Activation('relu')(X)                                     # ReLU activation
    X = Dropout(0.8)(X)                                           # dropout (use 0.8)

    # Step 2: first GRU layer (≈4 lines)
    X = GRU(units=128, return_sequences=True)(X)  # GRU (use 128 units and return the sequences)
    X = Dropout(0.8)(X)                           # dropout (use 0.8)
    X = BatchNormalization()(X)                   # batch normalization

    # Step 3: second GRU layer (≈4 lines)
    X = GRU(units=128, return_sequences=True)(X)  # GRU (use 128 units and return the sequences)
    X = Dropout(0.8)(X)                           # dropout (use 0.8)
    X = BatchNormalization()(X)                   # batch normalization
    X = Dropout(0.8)(X)                           # dropout (use 0.8)

    # Step 4: time-distributed dense layer (≈1 line)
    X = TimeDistributed(Dense(1, activation="sigmoid"))(X)  # time-distributed (sigmoid)

    ### END CODE HERE ###

    model = Model(inputs=X_input, outputs=X)

    return model
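As a quick sanity check, the model can be built and its output shape inspected. The sketch below is a minimal, self-contained version of the same layer stack; it assumes the `tensorflow.keras` API and an input shape of `(5511, 101)` spectrogram frames — neither appears in the paste, so both are assumptions.

```python
# Sanity-check sketch (assumptions: tf.keras API, (5511, 101) input shape).
from tensorflow.keras.models import Model
from tensorflow.keras.layers import (Input, Conv1D, BatchNormalization, Activation,
                                     Dropout, GRU, Dense, TimeDistributed)

def model(input_shape):
    X_input = Input(shape=input_shape)
    X = Conv1D(196, 15, strides=4)(X_input)        # 1-D conv over the time axis
    X = BatchNormalization(name="Batch1")(X)
    X = Activation('relu')(X)
    X = Dropout(0.8)(X)
    X = GRU(units=128, return_sequences=True)(X)   # first GRU, keeps the time axis
    X = Dropout(0.8)(X)
    X = BatchNormalization()(X)
    X = GRU(units=128, return_sequences=True)(X)   # second GRU
    X = Dropout(0.8)(X)
    X = BatchNormalization()(X)
    X = Dropout(0.8)(X)
    X = TimeDistributed(Dense(1, activation="sigmoid"))(X)  # per-timestep probability
    return Model(inputs=X_input, outputs=X)

m = model(input_shape=(5511, 101))
# Conv1D with kernel 15, stride 4, 'valid' padding shrinks the time axis:
# (5511 - 15) // 4 + 1 = 1375 timesteps, one sigmoid unit each.
print(m.output_shape)  # (None, 1375, 1)
```

Note that `Conv1D` is the only layer that changes the sequence length here; both GRUs use `return_sequences=True`, so the 1375-step axis is preserved all the way to the time-distributed sigmoid output.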