no slurm id
INFO:tensorflow:Mixed precision compatibility check (mixed_float16): OK
Your GPU will likely run quickly with dtype policy mixed_float16 as it has compute capability of at least 7.0. Your GPU: NVIDIA GeForce RTX 4070 Laptop GPU, compute capability 8.9
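The compatibility check above is the standard message TensorFlow emits when the mixed_float16 policy is enabled. A minimal sketch of the call that produces it (assuming the policy is set globally; the exact call site is not shown in this log):

    import tensorflow as tf

    # Run computations in float16 while keeping variables in float32.
    # On GPUs with compute capability >= 7.0 this logs the
    # "Mixed precision compatibility check (mixed_float16): OK" message.
    tf.keras.mixed_precision.set_global_policy("mixed_float16")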
number of covariates: 0
Covariate columns found: []
mode is classification
weight_positive_class tf.Tensor(4.285, shape=(), dtype=float16)
weight_negative_class tf.Tensor(0.566, shape=(), dtype=float16)
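The class weights printed above are consistent with the common balanced weighting scheme w_c = n_samples / (2 * n_c). A hypothetical reconstruction, using sample counts inferred from the progress bar further down (209 steps of batch size 16) rather than anything printed in the log:

    # Hypothetical counts: ~3344 training samples, ~390 of them cases.
    n_samples = 209 * 16              # implied by 209 steps at batchsize 16
    n_pos = 390                       # assumed case count, not shown in the log
    n_neg = n_samples - n_pos
    w_pos = n_samples / (2 * n_pos)   # ~4.29, close to the printed float16 4.285
    w_neg = n_samples / (2 * n_neg)   # ~0.566, matching the printed 0.566

Under this scheme a minority class of roughly 12% cases yields exactly this kind of >4x up-weighting of positives.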
jobid = 6
folder = GenNet_experiment_6
batchsize = 16
lr = 0.001
Creating networks from npz masks
regression False
Model: "model"
____________________________________________________________________________________________________________
 Layer (type)                                Output Shape          Param #     Connected to
============================================================================================================
 input_layer (InputLayer)                    [(None, 12746306)]    0           []
 reshape (Reshape)                           (None, 12746306, 1)   0           ['input_layer[0][0]']
 LocallyDirected_0 (LocallyDirected1D)       (None, 6973717, 1)    19720023    ['reshape[0][0]']
 activation (Activation)                     (None, 6973717, 1)    0           ['LocallyDirected_0[0][0]']
 batch_normalization (BatchNormalization)    (None, 6973717, 1)    2           ['activation[0][0]']
 flatten (Flatten)                           (None, 6973717)       0           ['batch_normalization[0][0]']
 output_layer (Dense)                        (None, 1)             6973718     ['flatten[0][0]']
 inputs_cov (InputLayer)                     [(None, 0)]           0           []
 activation_1 (Activation)                   (None, 1)             0           ['output_layer[0][0]']
============================================================================================================
Total params: 26693743 (101.83 MB)
Trainable params: 26693741 (101.83 MB)
Non-trainable params: 2 (8.00 Byte)
____________________________________________________________________________________________________________
None
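The totals are internally consistent. Reading the 12,746,306-wide input as variants and the 6,973,717 first-layer units as gene nodes (GenNet's usual SNP-to-gene masking), the LocallyDirected_0 count equals one weight per input plus one bias per output, and the only non-trainable parameters are batch normalization's moving mean and variance. Plain arithmetic, not GenNet code:

    # Sanity check of the parameter totals in the summary above.
    n_inputs, n_outputs = 12_746_306, 6_973_717
    locally_directed = n_inputs + n_outputs   # 19720023: one weight per input, one bias per output
    batch_norm = 2                            # moving mean/variance (the 2 non-trainable params)
    dense_output = n_outputs + 1              # one weight per flattened unit, plus one bias
    assert locally_directed + batch_norm + dense_output == 26_693_743   # "Total params"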
Start training from scratch
Epoch 1/5

Y_TRUE: [1. 1. 0. 0. 1. 0. 0. 1. 0. 0. 0. 0. 0. 0. 1. 0.]

Y_PRED AFTER CLIPPING: [0.505  0.5083 0.506  0.508  0.508  0.509  0.508  0.5103 0.5117 0.509
 0.508  0.5107 0.5073 0.5073 0.5093 0.505 ]

****************************************************************************************************

BCE: [0.6836 0.677  0.705  0.709  0.6777 0.711  0.709  0.673  0.717  0.711
 0.709  0.715  0.708  0.708  0.675  0.703 ]
****************************************************************************************************

WEIGHTED_BCE: 1.20898438

 1/209 [..............................] - ETA: 9:02:32 - loss: inf - accuracy: 0.3125 - sensitivity: 1.0000 - specificity: 0.0000e+00
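At this first step every prediction sits just above 0.5, so at the default 0.5 decision threshold every sample is classified as positive, which is exactly what the progress line reports. A quick check (plain arithmetic, not GenNet code):

    y_true = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0]
    # Every prediction is above 0.5, so all 16 samples are called positive.
    tp, fn = sum(y_true), 0                 # all 5 cases caught
    fp, tn = len(y_true) - sum(y_true), 0   # all 11 controls misclassified
    print(tp / (tp + fn))                   # sensitivity: 1.0
    print(tn / (tn + fp))                   # specificity: 0.0
    print((tp + tn) / len(y_true))          # accuracy: 5/16 = 0.3125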
Y_TRUE: [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]

Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]

****************************************************************************************************

BCE: [15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945
 15.945 15.945 15.945 15.945 15.945 15.945]
****************************************************************************************************

WEIGHTED_BCE: 9.0234375
 2/209 [..............................] - ETA: 33:27 - loss: inf - accuracy: 0.1562 - sensitivity: 0.5000 - specificity: 0.0000e+00
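From the second step onward the network saturates: every clipped prediction prints as 1.0 and every mismatched sample's BCE pins at 15.945. Both numbers point at the clipping epsilon degrading in float16: 1e-7 is not representable in half precision and rounds up to 2**-23, whose negative log is ~15.94, while 1 - epsilon rounds back to exactly 1.0, so a log(1 - y_pred) term can become log(0) = -inf, matching the "loss: inf" in the progress bar. A minimal demonstration of the rounding (an assumed diagnosis; GenNet's actual clipping code is not shown here):

    import numpy as np

    eps = np.float16(1e-7)             # Keras' default epsilon, cast to float16
    print(float(eps))                  # 1.1920928955078125e-07, i.e. 2**-23
    print(-np.log(np.float64(eps)))    # 15.9424..., the plateau printed above as 15.945
    print(np.float16(1.0) - eps)       # 1.0: 1 - eps rounds back to exactly 1.0 in
                                       # float16, so log(1 - y_pred) can hit log(0) = -inf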
Y_TRUE: [0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]

Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]

****************************************************************************************************

BCE: [15.945 15.945 15.945 -0.    15.945 15.945 15.945 15.945 15.945 15.945
 15.945 15.945 15.945 15.945 15.945 15.945]
****************************************************************************************************

WEIGHTED_BCE: 11.9296875
 3/209 [..............................] - ETA: 24:31 - loss: inf - accuracy: 0.1250 - sensitivity: 0.6667 - specificity: 0.0000e+00
Y_TRUE: [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]

Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]

****************************************************************************************************

BCE: [15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945
 15.945 15.945 15.945 15.945 15.945 15.945]
****************************************************************************************************

WEIGHTED_BCE: 9.0234375
 4/209 [..............................] - ETA: 21:21 - loss: inf - accuracy: 0.0938 - sensitivity: 0.5000 - specificity: 0.0000e+00
Y_TRUE: [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]

Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]

****************************************************************************************************

BCE: [15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945
 15.945 15.945 15.945 15.945 15.945 15.945]
****************************************************************************************************

WEIGHTED_BCE: 9.0234375
 5/209 [..............................] - ETA: 20:03 - loss: inf - accuracy: 0.0750 - sensitivity: 0.4000 - specificity: 0.0000e+00
Y_TRUE: [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 1.]

Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]

****************************************************************************************************

BCE: [15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945 15.945
 15.945 15.945 15.945 -0.    15.945 -0.   ]
****************************************************************************************************

WEIGHTED_BCE: 14.3828125
 6/209 [..............................] - ETA: 19:12 - loss: inf - accuracy: 0.0833 - sensitivity: 0.5000 - specificity: 0.0000e+00
Y_TRUE: [0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]

Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]

****************************************************************************************************

BCE: [15.945 15.945 15.945 -0.    15.945 15.945 15.945 15.945 15.945 15.945
 15.945 15.945 15.945 15.945 15.945 15.945]
****************************************************************************************************

WEIGHTED_BCE: 11.9296875
 7/209 [>.............................] - ETA: 18:25 - loss: inf - accuracy: 0.0804 - sensitivity: 0.5714 - specificity: 0.0000e+00
Y_TRUE: [0. 0. 0. 0. 0. 1. 0. 1. 1. 0. 0. 0. 0. 0. 0. 0.]

Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]

****************************************************************************************************

BCE: [15.945 15.945 15.945 15.945 15.945 -0.    15.945 -0.    -0.    15.945
 15.945 15.945 15.945 15.945 15.945 15.945]
****************************************************************************************************

WEIGHTED_BCE: 16.359375
 8/209 [>.............................] - ETA: 19:12 - loss: inf - accuracy: 0.0938 - sensitivity: 0.6250 - specificity: 0.0000e+00
Y_TRUE: [0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]

Y_PRED AFTER CLIPPING: [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]

****************************************************************************************************

BCE: [15.945 15.945 -0.    15.945 15.945 15.945 15.945 15.945 15.945 15.945
 15.945 15.945 15.945 15.945 15.945 15.945]
****************************************************************************************************

WEIGHTED_BCE: 11.9296875