- no slurm id
- INFO:tensorflow:Mixed precision compatibility check (mixed_float16): OK
- Your GPU will likely run quickly with dtype policy mixed_float16 as it has compute capability of at least 7.0. Your GPU: NVIDIA GeForce RTX 4070 Laptop GPU, compute capability 8.9
- number of covariates: 0
- Covariate columns found: []
- mode is classification
- weight_positive_class tf.Tensor(4.285, shape=(), dtype=float16)
- weight_negative_class tf.Tensor(0.566, shape=(), dtype=float16)
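The two class weights above are consistent with balanced inverse-frequency weighting, w_c = n / (2 * n_c), for a dataset with roughly 11.7% positives. This is an assumption; the log only shows the resulting float16 tensors, not the formula. A minimal sketch:

```python
import numpy as np

def balanced_class_weights(y):
    """Inverse-frequency class weights: w_c = n / (2 * n_c).

    Assumed formula: with ~11.7% positive labels it reproduces the
    logged values (w_pos ~= 4.285, w_neg ~= 0.566).
    """
    y = np.asarray(y, dtype=np.float32)
    n = y.size
    n_pos = float(y.sum())
    n_neg = n - n_pos
    return n / (2.0 * n_pos), n / (2.0 * n_neg)

# Hypothetical label vector with the positive fraction implied by the log.
rng = np.random.default_rng(0)
y = (rng.random(100_000) < 0.1167).astype(np.float32)
w_pos, w_neg = balanced_class_weights(y)
```

Note the weights are later cast to float16 (`tf.Tensor(4.285, ..., dtype=float16)`), which already costs some precision before any loss is computed.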
- jobid = 6
- folder = GenNet_experiment_6
- batchsize = 16
- lr = 0.001
- Creating networks from npz masks
- regression False
- Model: "model"
- __________________________________________________________________________________________________
-  Layer (type)                               Output Shape          Param #    Connected to
- ==================================================================================================
-  input_layer (InputLayer)                   [(None, 12746306)]    0          []
-  reshape (Reshape)                          (None, 12746306, 1)   0          ['input_layer[0][0]']
-  LocallyDirected_0 (LocallyDirected1D)      (None, 6973717, 1)    19720023   ['reshape[0][0]']
-  activation (Activation)                    (None, 6973717, 1)    0          ['LocallyDirected_0[0][0]']
-  batch_normalization (BatchNormalization)   (None, 6973717, 1)    2          ['activation[0][0]']
-  flatten (Flatten)                          (None, 6973717)       0          ['batch_normalization[0][0]']
-  output_layer (Dense)                       (None, 1)             6973718    ['flatten[0][0]']
-  inputs_cov (InputLayer)                    [(None, 0)]           0          []
-  activation_1 (Activation)                  (None, 1)             0          ['output_layer[0][0]']
- ==================================================================================================
- Total params: 26693743 (101.83 MB)
- Trainable params: 26693741 (101.83 MB)
- Non-trainable params: 2 (8.00 Byte)
- __________________________________________________________________________________________________
- None
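The parameter counts in the summary decompose cleanly. Assuming the npz mask connects each SNP to exactly one gene node (an assumption consistent with the numbers, not stated in the log), a quick arithmetic check:

```python
# Parameter-count check for the model summary above.
n_snps = 12_746_306   # input_layer width
n_genes = 6_973_717   # LocallyDirected_0 output width

# LocallyDirected1D with an npz mask: one weight per masked connection
# plus one bias per output node. If each SNP feeds exactly one gene
# node, nnz(mask) == n_snps (assumption).
locally_directed = n_snps + n_genes          # 19,720,023

# Dense output layer: one weight per gene node plus a single bias.
output_layer = n_genes + 1                   # 6,973,718

# BatchNormalization contributes the 2 non-trainable params.
total = locally_directed + output_layer + 2  # 26,693,743
```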
- Start training from scratch
- Epoch 1/5
- Y_TRUE: [1. 1. 0. 0. 1. 0. 0. 1. 0. 0. 0. 0. 0. 0. 1. 0.]
- Y_PRED AFTER CLIPPING: [0.505 0.5083 0.506 0.508 0.508 0.509 0.508 0.5103 0.5117 0.509
- 0.508 0.5107 0.5073 0.5073 0.5093 0.505 ]
- ****************************************************************************************************
- BCE: [0.6836 0.677 0.705 0.709 0.6777 0.711 0.709 0.673 0.717 0.711
- 0.709 0.715 0.708 0.708 0.675 0.703 ]
- ****************************************************************************************************
- WEIGHTED_BCE: 1.20898438
- 1/209 [..............................] - ETA: 9:02:32 - loss: inf - accuracy: 0.3125 - sensitivity: 1.0000 - specificity: 0.0000e+00
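The printed WEIGHTED_BCE values are numerically consistent with scaling the mean per-sample BCE by the mean per-sample class weight (e.g. 0.69946 × 1.72819 ≈ 1.209 for this batch; an all-negative batch gives 15.945 × 0.566 ≈ 9.02). This formula is inferred from the logged numbers, not taken from GenNet's source, so treat it as a reconstruction:

```python
import numpy as np

w_pos, w_neg = 4.285, 0.566  # class weights from the log (float16)

# Batch 1 as printed above.
y_true = np.array([1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0],
                  dtype=np.float32)
bce = np.array([0.6836, 0.677, 0.705, 0.709, 0.6777, 0.711, 0.709, 0.673,
                0.717, 0.711, 0.709, 0.715, 0.708, 0.708, 0.675, 0.703])

# Per-sample weights, then scale mean BCE by the mean weight.
weights = np.where(y_true == 1, w_pos, w_neg)
weighted_bce = bce.mean() * weights.mean()   # ~1.209, matching the log
```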
- Y_TRUE: [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
- Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
- ****************************************************************************************************
- BCE: [15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945
- 15.945 15.945 15.945 15.945 15.945 15.945]
- ****************************************************************************************************
- WEIGHTED_BCE: 9.0234375
- 2/209 [..............................] - ETA: 33:27 - loss: inf - accuracy: 0.1562 - sensitivity: 0.5000 - specificity: 0.0000e+00
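From this batch onward every prediction clips to 1.0 and each mislabeled negative contributes the same constant, 15.945. That constant matches -log of Keras' default epsilon (1e-7) after rounding to float16, where it lands on the subnormal 2**-23 ≈ 1.192e-07 (the clipping code itself is not shown, so this is an inference from the value):

```python
import numpy as np

# 1e-7 is not representable in float16; it rounds up to 2**-23.
eps16 = np.float16(1e-7)

# With the prediction clipped to 1.0, a negative label's loss term
# log(1 - p + eps) reduces to log(eps16): the saturated BCE.
saturated_bce = -np.log(np.float32(eps16))   # ~15.94, as logged
```

Under the default float32 policy the clip would keep predictions at 1 - 1e-7 and the same term would be ≈16.12; the float16 rounding is what makes the value here exactly reproducible.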
- Y_TRUE: [0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
- Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
- ****************************************************************************************************
- BCE: [15.945 15.945 15.945 -0. 15.945 15.945 15.945 15.945 15.945 15.945
- 15.945 15.945 15.945 15.945 15.945 15.945]
- ****************************************************************************************************
- WEIGHTED_BCE: 11.9296875
- 3/209 [..............................] - ETA: 24:31 - loss: inf - accuracy: 0.1250 - sensitivity: 0.6667 - specificity: 0.0000e+00
- Y_TRUE: [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
- Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
- ****************************************************************************************************
- BCE: [15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945
- 15.945 15.945 15.945 15.945 15.945 15.945]
- ****************************************************************************************************
- WEIGHTED_BCE: 9.0234375
- 4/209 [..............................] - ETA: 21:21 - loss: inf - accuracy: 0.0938 - sensitivity: 0.5000 - specificity: 0.0000e+00
- Y_TRUE: [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
- Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
- ****************************************************************************************************
- BCE: [15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945
- 15.945 15.945 15.945 15.945 15.945 15.945]
- ****************************************************************************************************
- WEIGHTED_BCE: 9.0234375
- 5/209 [..............................] - ETA: 20:03 - loss: inf - accuracy: 0.0750 - sensitivity: 0.4000 - specificity: 0.0000e+00
- Y_TRUE: [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 1.]
- Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
- ****************************************************************************************************
- BCE: [15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945
- 15.945 15.945 15.945 -0. 15.945 -0. ]
- ****************************************************************************************************
- WEIGHTED_BCE: 14.3828125
- 6/209 [..............................] - ETA: 19:12 - loss: inf - accuracy: 0.0833 - sensitivity: 0.5000 - specificity: 0.0000e+00
- Y_TRUE: [0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
- Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
- ****************************************************************************************************
- BCE: [15.945 15.945 15.945 -0. 15.945 15.945 15.945 15.945 15.945 15.945
- 15.945 15.945 15.945 15.945 15.945 15.945]
- ****************************************************************************************************
- WEIGHTED_BCE: 11.9296875
- 7/209 [>.............................] - ETA: 18:25 - loss: inf - accuracy: 0.0804 - sensitivity: 0.5714 - specificity: 0.0000e+00
- Y_TRUE: [0. 0. 0. 0. 0. 1. 0. 1. 1. 0. 0. 0. 0. 0. 0. 0.]
- Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
- ****************************************************************************************************
- BCE: [15.945 15.945 15.945 15.945 15.945 -0. 15.945 -0. -0. 15.945
- 15.945 15.945 15.945 15.945 15.945 15.945]
- ****************************************************************************************************
- WEIGHTED_BCE: 16.359375
- 8/209 [>.............................] - ETA: 19:12 - loss: inf - accuracy: 0.0938 - sensitivity: 0.6250 - specificity: 0.0000e+00
- Y_TRUE: [0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
- Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
- ****************************************************************************************************
- BCE: [15.945 15.945 -0. 15.945 15.945 15.945 15.945 15.945 15.945 15.945
- 15.945 15.945 15.945 15.945 15.945 15.945]
- ****************************************************************************************************
- WEIGHTED_BCE: 11.9296875
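Throughout the run Keras reports `loss: inf` even though every printed WEIGHTED_BCE is finite, so the inf must arise elsewhere in the (float16) loss graph; the log does not show where. One plausible float16 failure mode, for illustration only: a probability that rounds to exactly 1.0 feeds an unclipped log term, which overflows to inf in a single step:

```python
import numpy as np

# float16 spacing near 1.0 is 2**-11, so 0.9999 rounds to exactly 1.0.
p = np.float16(0.9999)

# An unclipped negative-label term log(1 - p) then hits log(0) = -inf.
with np.errstate(divide="ignore"):
    term = -np.log(np.float16(1.0) - p)   # +inf
```

Whatever the exact origin, an inf produced once poisons Keras' running loss average for the rest of the epoch, which is consistent with `loss: inf` appearing from step 1/209 onward.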