lamiastella

loocv results 2 epochs, 2 classes, 10 images

Nov 21st, 2018
$ python exp_loocv.py
Using sample 0 as test data
Batch 0
Epoch 0/1
----------
train Loss: 0.0911 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.0593 Acc: 0.1000

Training complete in 0m 0s
Batch 1
Epoch 0/1
----------
train Loss: 0.1364 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1398 Acc: 0.0000

Training complete in 0m 0s
Batch 2
Epoch 0/1
----------
train Loss: 0.1051 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.0457 Acc: 0.1000

Training complete in 0m 0s
Batch 3
Epoch 0/1
----------
train Loss: 0.1923 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.2199 Acc: 0.0000

Training complete in 0m 0s
Batch 4
Epoch 0/1
----------
train Loss: 0.0115 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0118 Acc: 0.1000

Training complete in 0m 0s
Batch 5
Epoch 0/1
----------
train Loss: 0.0115 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0116 Acc: 0.1000

Training complete in 0m 0s
Batch 6
Epoch 0/1
----------
train Loss: 0.2212 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.2131 Acc: 0.0000

Training complete in 0m 0s
Batch 7
Epoch 0/1
----------
train Loss: 0.0138 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0139 Acc: 0.1000

Training complete in 0m 0s
Batch 8
Epoch 0/1
----------
train Loss: 0.2005 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1990 Acc: 0.0000

Training complete in 0m 0s
Using sample 1 as test data
Batch 0
Epoch 0/1
----------
train Loss: 0.1984 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1958 Acc: 0.0000

Training complete in 0m 0s
Batch 1
Epoch 0/1
----------
train Loss: 0.0163 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0168 Acc: 0.1000

Training complete in 0m 0s
Batch 2
Epoch 0/1
----------
train Loss: 0.0169 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0170 Acc: 0.1000

Training complete in 0m 0s
Batch 3
Epoch 0/1
----------
train Loss: 0.1908 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1905 Acc: 0.0000

Training complete in 0m 0s
Batch 4
Epoch 0/1
----------
train Loss: 0.0164 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0164 Acc: 0.1000

Training complete in 0m 0s
Batch 5
Epoch 0/1
----------
train Loss: 0.0169 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0169 Acc: 0.1000

Training complete in 0m 0s
Batch 6
Epoch 0/1
----------
train Loss: 0.0166 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0166 Acc: 0.1000

Training complete in 0m 0s
Batch 7
Epoch 0/1
----------
train Loss: 0.1880 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1880 Acc: 0.0000

Training complete in 0m 0s
Batch 8
Epoch 0/1
----------
train Loss: 0.1839 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1838 Acc: 0.0000

Training complete in 0m 0s
Using sample 2 as test data
Batch 0
Epoch 0/1
----------
train Loss: 0.0167 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0167 Acc: 0.1000

Training complete in 0m 0s
Batch 1
Epoch 0/1
----------
train Loss: 0.1875 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1875 Acc: 0.0000

Training complete in 0m 0s
Batch 2
Epoch 0/1
----------
train Loss: 0.0164 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0164 Acc: 0.1000

Training complete in 0m 0s
Batch 3
Epoch 0/1
----------
train Loss: 0.1868 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1868 Acc: 0.0000

Training complete in 0m 0s
Batch 4
Epoch 0/1
----------
train Loss: 0.0167 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0167 Acc: 0.1000

Training complete in 0m 0s
Batch 5
Epoch 0/1
----------
train Loss: 0.0167 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0167 Acc: 0.1000

Training complete in 0m 0s
Batch 6
Epoch 0/1
----------
train Loss: 0.1832 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1832 Acc: 0.0000

Training complete in 0m 0s
Batch 7
Epoch 0/1
----------
train Loss: 0.1855 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1855 Acc: 0.0000

Training complete in 0m 0s
Batch 8
Epoch 0/1
----------
train Loss: 0.0168 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0168 Acc: 0.1000

Training complete in 0m 0s
Using sample 3 as test data
Batch 0
Epoch 0/1
----------
train Loss: 0.0172 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0172 Acc: 0.1000

Training complete in 0m 0s
Batch 1
Epoch 0/1
----------
train Loss: 0.1867 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1867 Acc: 0.0000

Training complete in 0m 0s
Batch 2
Epoch 0/1
----------
train Loss: 0.1879 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1879 Acc: 0.0000

Training complete in 0m 0s
Batch 3
Epoch 0/1
----------
train Loss: 0.0168 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0168 Acc: 0.1000

Training complete in 0m 0s
Batch 4
Epoch 0/1
----------
train Loss: 0.0162 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0162 Acc: 0.1000

Training complete in 0m 0s
Batch 5
Epoch 0/1
----------
train Loss: 0.0164 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0164 Acc: 0.1000

Training complete in 0m 0s
Batch 6
Epoch 0/1
----------
train Loss: 0.0170 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0170 Acc: 0.1000

Training complete in 0m 0s
Batch 7
Epoch 0/1
----------
train Loss: 0.1885 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1885 Acc: 0.0000

Training complete in 0m 0s
Batch 8
Epoch 0/1
----------
train Loss: 0.1891 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1891 Acc: 0.0000

Training complete in 0m 0s
Using sample 4 as test data
Batch 0
Epoch 0/1
----------
train Loss: 0.0165 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0165 Acc: 0.1000

Training complete in 0m 0s
Batch 1
Epoch 0/1
----------
train Loss: 0.0166 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0166 Acc: 0.1000

Training complete in 0m 0s
Batch 2
Epoch 0/1
----------
train Loss: 0.0182 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0182 Acc: 0.1000

Training complete in 0m 0s
Batch 3
Epoch 0/1
----------
train Loss: 0.1843 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1843 Acc: 0.0000

Training complete in 0m 0s
Batch 4
Epoch 0/1
----------
train Loss: 0.1839 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1839 Acc: 0.0000

Training complete in 0m 0s
Batch 5
Epoch 0/1
----------
train Loss: 0.1898 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1898 Acc: 0.0000

Training complete in 0m 0s
Batch 6
Epoch 0/1
----------
train Loss: 0.1913 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1913 Acc: 0.0000

Training complete in 0m 0s
Batch 7
Epoch 0/1
----------
train Loss: 0.0171 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0171 Acc: 0.1000

Training complete in 0m 0s
Batch 8
Epoch 0/1
----------
train Loss: 0.0172 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0172 Acc: 0.1000

Training complete in 0m 0s
Using sample 5 as test data
Batch 0
Epoch 0/1
----------
train Loss: 0.0170 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0170 Acc: 0.1000

Training complete in 0m 0s
Batch 1
Epoch 0/1
----------
train Loss: 0.1863 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1863 Acc: 0.0000

Training complete in 0m 0s
Batch 2
Epoch 0/1
----------
train Loss: 0.1878 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1878 Acc: 0.0000

Training complete in 0m 0s
Batch 3
Epoch 0/1
----------
train Loss: 0.1880 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1880 Acc: 0.0000

Training complete in 0m 0s
Batch 4
Epoch 0/1
----------
train Loss: 0.0161 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0161 Acc: 0.1000

Training complete in 0m 0s
Batch 5
Epoch 0/1
----------
train Loss: 0.1886 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1886 Acc: 0.0000

Training complete in 0m 0s
Batch 6
Epoch 0/1
----------
train Loss: 0.1824 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1824 Acc: 0.0000

Training complete in 0m 0s
Batch 7
Epoch 0/1
----------
train Loss: 0.0166 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0166 Acc: 0.1000

Training complete in 0m 0s
Batch 8
Epoch 0/1
----------
train Loss: 0.0157 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0157 Acc: 0.1000

Training complete in 0m 0s
Using sample 6 as test data
Batch 0
Epoch 0/1
----------
train Loss: 0.1886 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1886 Acc: 0.0000

Training complete in 0m 0s
Batch 1
Epoch 0/1
----------
train Loss: 0.0158 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0158 Acc: 0.1000

Training complete in 0m 0s
Batch 2
Epoch 0/1
----------
train Loss: 0.1848 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1848 Acc: 0.0000

Training complete in 0m 0s
Batch 3
Epoch 0/1
----------
train Loss: 0.0161 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0161 Acc: 0.1000

Training complete in 0m 0s
Batch 4
Epoch 0/1
----------
train Loss: 0.1900 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1900 Acc: 0.0000

Training complete in 0m 0s
Batch 5
Epoch 0/1
----------
train Loss: 0.0162 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0162 Acc: 0.1000

Training complete in 0m 0s
Batch 6
Epoch 0/1
----------
train Loss: 0.1885 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1885 Acc: 0.0000

Training complete in 0m 0s
Batch 7
Epoch 0/1
----------
train Loss: 0.1867 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1867 Acc: 0.0000

Training complete in 0m 0s
Batch 8
Epoch 0/1
----------
train Loss: 0.0175 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0175 Acc: 0.1000

Training complete in 0m 0s
Using sample 7 as test data
Batch 0
Epoch 0/1
----------
train Loss: 0.0164 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0164 Acc: 0.1000

Training complete in 0m 0s
Batch 1
Epoch 0/1
----------
train Loss: 0.1836 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1836 Acc: 0.0000

Training complete in 0m 0s
Batch 2
Epoch 0/1
----------
train Loss: 0.1861 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1861 Acc: 0.0000

Training complete in 0m 0s
Batch 3
Epoch 0/1
----------
train Loss: 0.1874 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1874 Acc: 0.0000

Training complete in 0m 0s
Batch 4
Epoch 0/1
----------
train Loss: 0.1908 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1908 Acc: 0.0000

Training complete in 0m 0s
Batch 5
Epoch 0/1
----------
train Loss: 0.1873 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1873 Acc: 0.0000

Training complete in 0m 0s
Batch 6
Epoch 0/1
----------
train Loss: 0.0166 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0166 Acc: 0.1000

Training complete in 0m 0s
Batch 7
Epoch 0/1
----------
train Loss: 0.0167 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0167 Acc: 0.1000

Training complete in 0m 0s
Batch 8
Epoch 0/1
----------
train Loss: 0.0166 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0166 Acc: 0.1000

Training complete in 0m 0s
Using sample 8 as test data
Batch 0
Epoch 0/1
----------
train Loss: 0.0175 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0175 Acc: 0.1000

Training complete in 0m 0s
Batch 1
Epoch 0/1
----------
train Loss: 0.1879 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1879 Acc: 0.0000

Training complete in 0m 0s
Batch 2
Epoch 0/1
----------
train Loss: 0.1873 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1873 Acc: 0.0000

Training complete in 0m 0s
Batch 3
Epoch 0/1
----------
train Loss: 0.0177 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0177 Acc: 0.1000

Training complete in 0m 0s
Batch 4
Epoch 0/1
----------
train Loss: 0.0162 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0162 Acc: 0.1000

Training complete in 0m 0s
Batch 5
Epoch 0/1
----------
train Loss: 0.1875 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1875 Acc: 0.0000

Training complete in 0m 0s
Batch 6
Epoch 0/1
----------
train Loss: 0.0159 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0159 Acc: 0.1000

Training complete in 0m 0s
Batch 7
Epoch 0/1
----------
train Loss: 0.1873 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1873 Acc: 0.0000

Training complete in 0m 0s
Batch 8
Epoch 0/1
----------
train Loss: 0.1861 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1861 Acc: 0.0000

Training complete in 0m 0s
Using sample 9 as test data
Batch 0
Epoch 0/1
----------
train Loss: 0.0167 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0167 Acc: 0.1000

Training complete in 0m 0s
Batch 1
Epoch 0/1
----------
train Loss: 0.1913 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1913 Acc: 0.0000

Training complete in 0m 0s
Batch 2
Epoch 0/1
----------
train Loss: 0.0166 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0166 Acc: 0.1000

Training complete in 0m 0s
Batch 3
Epoch 0/1
----------
train Loss: 0.1877 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1877 Acc: 0.0000

Training complete in 0m 0s
Batch 4
Epoch 0/1
----------
train Loss: 0.1869 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1869 Acc: 0.0000

Training complete in 0m 0s
Batch 5
Epoch 0/1
----------
train Loss: 0.1896 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1896 Acc: 0.0000

Training complete in 0m 0s
Batch 6
Epoch 0/1
----------
train Loss: 0.0163 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0163 Acc: 0.1000

Training complete in 0m 0s
Batch 7
Epoch 0/1
----------
train Loss: 0.1866 Acc: 0.0000

Epoch 1/1
----------
train Loss: 0.1866 Acc: 0.0000

Training complete in 0m 0s
Batch 8
Epoch 0/1
----------
train Loss: 0.0163 Acc: 0.1000

Epoch 1/1
----------
train Loss: 0.0163 Acc: 0.1000

Training complete in 0m 0s
loocv preds:  [tensor([1], device='cuda:0'), tensor([1], device='cuda:0'), tensor([1], device='cuda:0'), tensor([1], device='cuda:0'), tensor([1], device='cuda:0'), tensor([1], device='cuda:0'), tensor([1], device='cuda:0'), tensor([1], device='cuda:0'), tensor([1], device='cuda:0'), tensor([1], device='cuda:0')]
loocv targets:  [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
0.5
[[0 5]
 [0 5]]