iCaRL - one epoch
lorenzo_gatto
Nov 8th, 2017

  1. Mixing the classes and putting them in batches of classes...
  2. Creating a validation set ...
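
The two setup messages above correspond to shuffling the class order, grouping it into batches of classes, and holding out a validation split before incremental training starts. A minimal numpy sketch of that preparation follows; the total class count (100), the validation fraction and the helper name are illustrative assumptions, and only the number of increments (10, per the "out of 10 batches" lines below) is actually visible in the log.

    import numpy as np

    # Assumed setup: 100 classes split into 10 batches of classes; only the
    # "10 batches" figure is visible in the log, the rest is illustrative.
    nb_classes = 100
    nb_batches = 10

    rng = np.random.RandomState(1993)                   # fixed seed for a reproducible class order (value arbitrary)
    class_order = rng.permutation(nb_classes)           # "Mixing the classes ..."
    class_batches = np.split(class_order, nb_batches)   # class ids assigned to each increment

    def make_validation_split(images, labels, val_fraction=0.1, seed=0):
        # Hold out a random fraction of the training data ("Creating a validation set ...").
        idx = np.random.RandomState(seed).permutation(len(labels))
        n_val = int(val_fraction * len(labels))
        val, train = idx[:n_val], idx[n_val:]
        return (images[train], labels[train]), (images[val], labels[val])
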
  3. Batch of classes number 1 arrives ...
  4. Batch of classes 1 out of 10 batches
  5. Epoch 0
  6. 0.287272
  7. 0.00871001
  8. 0.00892173
  9. 0.00863384
  10. 0.00751852
  11. 0.00657171
  12. 0.00608974
  13. 0.00585746
  14. Training accuracy 0.062500
  15. 0.0057788
  16. 0.00560296
  17. 0.00555174
  18. 0.00548918
  19. 0.0055367
  20. 0.00539182
  21. 0.00533972
  22. 0.00537566
  23. Training accuracy 0.070312
  24. 0.00534246
  25. 0.00530634
  26. 0.00524906
  27. 0.00524764
  28. 0.00522196
  29. 0.00521024
  30. 0.00523701
  31. 0.00517737
  32. Training accuracy 0.078125
  33. 0.005154
  34. 0.00515257
  35. 0.00507729
  36. 0.0051164
  37. 0.0050586
  38. 0.00501388
  39. 0.00512017
  40. 0.00504586
  41. Training accuracy 0.085938
  42. 0.00494739
  43. 0.00499304
  44. 0.00497555
  45. 0.00495446
  46. 0.00496844
  47. 0.00494343
  48. 0.00493757
  49. 0.00485029
  50. Training accuracy 0.140625
  51. 0.00482917
  52. 0.00483164
  53. 0.00488351
  54. 0.00479883
  55. 0.00483863
  56. 0.00484456
  57. 0.0048152
  58. 0.00476418
  59. Training accuracy 0.078125
  60. 0.00475501
  61. 0.00481238
  62. 0.00475885
  63. 0.00478915
  64. 0.00470401
  65. 0.00471081
  66. 0.00468985
  67. 0.00466995
  68. Training accuracy 0.132812
  69. 0.00463978
  70. 0.00475154
  71. 0.00460605
  72. 0.0046251
  73. 0.00460757
  74. 0.00460384
  75. 0.00455722
  76. 0.00461336
  77. Training accuracy 0.156250
  78. 0.00457644
  79. 0.00455298
  80. 0.00454285
  81. 0.0045895
  82. 0.00455414
  83. 0.00451121
  84. 0.00456292
  85. 0.0045069
  86. Training accuracy 0.187500
  87. 0.00452433
  88. 0.00448638
  89. 0.00447541
  90. 0.00442919
  91. 0.00447134
  92. 0.0044896
  93. 0.0044824
  94. 0.00450531
  95. Training accuracy 0.195312
  96. 0.00448563
  97. 0.00442603
  98. 0.00439628
  99. 0.00439986
  100. 0.00439687
  101. 0.00440562
  102. 0.00435975
  103. 0.00443868
  104. Training accuracy 0.187500
  105. 0.00435689
  106. 0.00437406
  107. 0.00437823
  108. 0.00437819
  109. 0.00429873
  110. 0.00438045
  111. 0.00433074
  112. 0.0043012
  113. Training accuracy 0.242188
  114. 0.00433212
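
Within each increment the network is trained for a single epoch (hence the paste title): each bare number above is the loss reported for one training iteration, and a "Training accuracy" line appears every 8 iterations with the accuracy of the current minibatch (the values are all multiples of 1/128, consistent with a minibatch size of 128). iCaRL trains sigmoid outputs with binary cross-entropy, using one-hot targets for the classes of the current batch and the previous network's outputs as distillation targets for the old classes. The PyTorch-style sketch below shows only that loss; the framework and the function name are assumptions, not the code that produced this log.

    import torch
    import torch.nn.functional as F

    def icarl_loss(logits, labels, old_probs, nb_old_classes):
        # logits:    (B, nb_classes_seen) raw outputs of the current network
        # labels:    (B,) integer labels in [0, nb_classes_seen)
        # old_probs: (B, nb_old_classes) sigmoid outputs of the frozen previous network
        targets = F.one_hot(labels, num_classes=logits.shape[1]).float()
        if nb_old_classes > 0:
            # Distillation: keep the old network's responses on the previously seen classes.
            targets[:, :nb_old_classes] = old_probs
        return F.binary_cross_entropy_with_logits(logits, targets)
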
  115. Exemplars selection starting ...
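
"Exemplars selection" is iCaRL's herding step: once the representation has been updated, a fixed budget of images per class is chosen greedily so that the mean of the selected feature vectors stays as close as possible to the full class mean. A numpy sketch under that reading; the feature extractor, the normalisation and the budget m do not appear in the log and are assumed here.

    import numpy as np

    def select_exemplars(features, m):
        # features: (n, d) feature vectors of one class (assumed L2-normalised); m: exemplar budget.
        class_mean = features.mean(axis=0)
        selected, running_sum = [], np.zeros_like(class_mean)
        candidates = list(range(len(features)))
        for k in range(1, m + 1):
            # Pick the image that keeps the mean of the selected set closest to the class mean.
            gaps = [np.linalg.norm(class_mean - (running_sum + features[i]) / k)
                    for i in candidates]
            best = candidates.pop(int(np.argmin(gaps)))
            selected.append(best)
            running_sum += features[best]
        return selected  # indices of the chosen exemplars, in herding order
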
  116. Computing theoretical class means for NCM and mean-of-exemplars for iCaRL ...
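
The "theoretical class means" are the average feature vectors computed from all training images of a class (used by the NCM baseline), while iCaRL itself uses the mean of the stored exemplars; both classify a test image by the nearest class mean in feature space. The stored exemplars are then carried over and mixed into the training data when the next batch of classes arrives below. A sketch of the two means and the nearest-mean rule, assuming unit-normalised feature vectors:

    import numpy as np

    def class_mean(features):
        # Mean feature vector of one class, renormalised to unit length.
        mu = features.mean(axis=0)
        return mu / (np.linalg.norm(mu) + 1e-8)

    def nearest_mean_classify(query_features, class_means):
        # query_features: (n, d); class_means: (nb_classes_seen, d), theoretical or exemplar means.
        dists = np.linalg.norm(query_features[:, None, :] - class_means[None, :, :], axis=2)
        return dists.argmin(axis=1)  # predicted class index for each query
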
  117. Batch of classes number 2 arrives ...
  118. Batch of classes 2 out of 10 batches
  119. Epoch 0
  120. 0.011735
  121. 0.0121078
  122. 0.0111618
  123. 0.010403
  124. 0.00974831
  125. 0.00949552
  126. 0.00924931
  127. 0.00905704
  128. Training accuracy 0.062500
  129. 0.00905689
  130. 0.00911064
  131. 0.00898644
  132. 0.00896149
  133. 0.00897088
  134. 0.00886607
  135. 0.00886964
  136. 0.00882033
  137. Training accuracy 0.046875
  138. 0.0088617
  139. 0.00889832
  140. 0.00885294
  141. 0.0086567
  142. 0.00872421
  143. 0.0087801
  144. 0.00876649
  145. 0.00870548
  146. Training accuracy 0.085938
  147. 0.00875134
  148. 0.00867434
  149. 0.00867335
  150. 0.00865952
  151. 0.0087811
  152. 0.00874177
  153. 0.00869078
  154. 0.00861675
  155. Training accuracy 0.101562
  156. 0.00877293
  157. 0.00871357
  158. 0.00860829
  159. 0.00870467
  160. 0.00875316
  161. 0.00861962
  162. 0.00868724
  163. 0.00861359
  164. Training accuracy 0.078125
  165. 0.00863145
  166. 0.00872596
  167. 0.0085802
  168. 0.00867081
  169. 0.00861635
  170. 0.00861645
  171. 0.00867638
  172. 0.00864936
  173. Training accuracy 0.132812
  174. 0.00862112
  175. 0.00855318
  176. 0.00865119
  177. 0.00867549
  178. 0.00866487
  179. 0.00859771
  180. 0.00850684
  181. 0.00858965
  182. Training accuracy 0.140625
  183. 0.00859381
  184. 0.00856225
  185. 0.00861783
  186. 0.00853969
  187. 0.00860939
  188. 0.00851869
  189. 0.00861566
  190. 0.00856602
  191. Training accuracy 0.078125
  192. 0.00856246
  193. 0.00847558
  194. 0.0086258
  195. 0.00857576
  196. 0.00860878
  197. 0.00861267
  198. 0.00855802
  199. 0.00849134
  200. Training accuracy 0.148438
  201. 0.00858278
  202. 0.00856503
  203. 0.00848093
  204. 0.00856737
  205. 0.00860464
  206. 0.0084817
  207. 0.0085678
  208. 0.00853012
  209. Training accuracy 0.078125
  210. 0.00852588
  211. 0.00850274
  212. 0.00844964
  213. 0.00855854
  214. 0.00853108
  215. 0.00853306
  216. 0.00862732
  217. 0.00848812
  218. Training accuracy 0.179688
  219. 0.00852059
  220. 0.00847373
  221. 0.00846274
  222. 0.00853646
  223. 0.00850241
  224. 0.00846972
  225. 0.00858296
  226. 0.00844549
  227. Training accuracy 0.203125
  228. 0.00850263
  229. 0.00846575
  230. 0.00849627
  231. 0.00847889
  232. 0.0084808
  233. 0.00847772
  234. 0.00846598
  235. 0.00856872
  236. Training accuracy 0.156250
  237. 0.00856192
  238. 0.00842343
  239. 0.00854151
  240. 0.00842551
  241. 0.00848143
  242. 0.00842469
  243. 0.00835394
  244. Exemplars selection starting ...
  245. Computing theoretical class means for NCM and mean-of-exemplars for iCaRL ...
  246. Batch of classes number 3 arrives ...
  247. Batch of classes 3 out of 10 batches
  248. Epoch 0
  249. 0.0133068
  250. 0.0160411
  251. 0.0152802
  252. 0.0145554
  253. 0.0138945
  254. 0.0134506
  255. 0.013309
  256. 0.0132363
  257. Training accuracy 0.054688
  258. 0.0131409
  259. 0.0129834
  260. 0.0129979
  261. 0.0130395
  262. 0.0129674
  263. 0.012992
  264. 0.0129019
  265. 0.012907
  266. Training accuracy 0.046875
  267. 0.0129227
  268. 0.0129192
  269. 0.0128864
  270. 0.0128227
  271. 0.0128627
  272. 0.0127671
  273. 0.0127762
  274. 0.012841
  275. Training accuracy 0.085938
  276. 0.0127621
  277. 0.0127453
  278. 0.0127394
  279. 0.0127804
  280. 0.0127349
  281. 0.0127909
  282. 0.0126875
  283. 0.012652
  284. Training accuracy 0.125000
  285. 0.0127161
  286. 0.0127041
  287. 0.0126931
  288. 0.0127085
  289. 0.0126899
  290. 0.0126107
  291. 0.0127092
  292. 0.0126483
  293. Training accuracy 0.093750
  294. 0.0127544
  295. 0.0126553
  296. 0.0127003
  297. 0.0125799
  298. 0.0126261
  299. 0.0125648
  300. 0.0126154
  301. 0.0126282
  302. Training accuracy 0.132812
  303. 0.0126254
  304. 0.0127322
  305. 0.0126617
  306. 0.0126598
  307. 0.0125445
  308. 0.0126811
  309. 0.0125878
  310. 0.0126139
  311. Training accuracy 0.132812
  312. 0.0126788
  313. 0.0126547
  314. 0.0126342
  315. 0.0125765
  316. 0.0125175
  317. 0.0125667
  318. 0.0124526
  319. 0.0125487
  320. Training accuracy 0.195312
  321. 0.0125424
  322. 0.0126633
  323. 0.0125563
  324. 0.0125391
  325. 0.0126417
  326. 0.0125652
  327. 0.0125713
  328. 0.0125582
  329. Training accuracy 0.140625
  330. 0.0125737
  331. 0.012591
  332. 0.0125367
  333. 0.0125514
  334. 0.0125518
  335. 0.012547
  336. 0.0125682
  337. 0.0125115
  338. Training accuracy 0.179688
  339. 0.0125117
  340. 0.0126273
  341. 0.0126121
  342. 0.0125133
  343. 0.0125625
  344. 0.0125111
  345. 0.0124418
  346. 0.0125072
  347. Training accuracy 0.140625
  348. 0.0124712
  349. 0.0125143
  350. 0.0125299
  351. 0.0124889
  352. 0.0125016
  353. 0.0126104
  354. 0.0125008
  355. 0.0125031
  356. Training accuracy 0.140625
  357. 0.0124458
  358. 0.0124321
  359. 0.0125703
  360. 0.0124474
  361. 0.0125129
  362. 0.0124093
  363. 0.0125206
  364. 0.0124371
  365. Training accuracy 0.156250
  366. 0.0123995
  367. 0.012514
  368. 0.0124937
  369. 0.0124761
  370. 0.0124713
  371. 0.0124752
  372. 0.0125337
  373. 0.0125326
  374. Training accuracy 0.226562
  375. Exemplars selection starting ...
  376. Computing theoretical class means for NCM and mean-of-exemplars for iCaRL ...
  377. Batch of classes number 4 arrives ...
  378. Batch of classes 4 out of 10 batches
  379. Epoch 0
  380. 0.0150059
  381. 0.0206414
  382. 0.0200337
  383. 0.0192729
  384. 0.0186985
  385. 0.0182909
  386. 0.017981
  387. 0.0178339
  388. Training accuracy 0.015625
  389. 0.0176549
  390. 0.017499
  391. 0.0173934
  392. 0.0175482
  393. 0.017556
  394. 0.01739
  395. 0.0173688
  396. 0.0174655
  397. Training accuracy 0.015625
  398. 0.0174182
  399. 0.0174006
  400. 0.0173555
  401. 0.0173084
  402. 0.0172947
  403. 0.0173323
  404. 0.0173499
  405. 0.0172396
  406. Training accuracy 0.015625
  407. 0.017452
  408. 0.0173198
  409. 0.0172841
  410. 0.0172596
  411. 0.0172748
  412. 0.0173
  413. 0.0172711
  414. 0.0173558
  415. Training accuracy 0.062500
  416. 0.0172768
  417. 0.0172309
  418. 0.0172989
  419. 0.0173261
  420. 0.0172818
  421. 0.0172231
  422. 0.0172613
  423. 0.0173507
  424. Training accuracy 0.117188
  425. 0.017095
  426. 0.0172337
  427. 0.0172449
  428. 0.0171851
  429. 0.0171699
  430. 0.0172261
  431. 0.0171296
  432. 0.0171692
  433. Training accuracy 0.132812
  434. 0.0172217
  435. 0.0172001
  436. 0.0172492
  437. 0.0171694
  438. 0.0170979
  439. 0.0172475
  440. 0.01708
  441. 0.0172318
  442. Training accuracy 0.062500
  443. 0.0171684
  444. 0.0170606
  445. 0.0171323
  446. 0.017094
  447. 0.0170702
  448. 0.0171387
  449. 0.0171015
  450. 0.0172059
  451. Training accuracy 0.117188
  452. 0.0170693
  453. 0.0171517
  454. 0.0172013
  455. 0.0172022
  456. 0.0172451
  457. 0.0171009