Pasted by a guest on Oct 23rd, 2019
Training the HypE model...
Number of training data points: 61911
Starting training...
iteration#: 1, loss: 1128.5969721078873
iteration#: 2, loss: 589.3206909894943
iteration#: 3, loss: 352.0712777376175
iteration#: 4, loss: 246.19432146847248
iteration#: 5, loss: 202.48553057014942
iteration#: 6, loss: 165.68770626187325
iteration#: 7, loss: 145.24890618771315
iteration#: 8, loss: 132.29567590355873
iteration#: 9, loss: 122.3706817701459
iteration#: 10, loss: 110.28342608362436
iteration#: 11, loss: 103.30189225077629
iteration#: 12, loss: 98.75038635358214
iteration#: 13, loss: 93.63530520349741
iteration#: 14, loss: 92.06604382395744
iteration#: 15, loss: 83.64566880837083
iteration#: 16, loss: 79.78967126086354
iteration#: 17, loss: 77.55824789777398
iteration#: 18, loss: 78.40112666413188
iteration#: 19, loss: 76.36364080756903
iteration#: 20, loss: 73.5182331353426
iteration#: 21, loss: 72.82811096310616
iteration#: 22, loss: 68.37139315903187
iteration#: 23, loss: 67.71940031647682
iteration#: 24, loss: 66.86391191929579
iteration#: 25, loss: 66.00029823556542
iteration#: 26, loss: 63.77029738947749
iteration#: 27, loss: 62.39784202352166
iteration#: 28, loss: 60.96891175955534
iteration#: 29, loss: 61.461283415555954
iteration#: 30, loss: 57.59357615932822
iteration#: 31, loss: 56.80059587955475
iteration#: 32, loss: 55.279016226530075
iteration#: 33, loss: 53.79471708461642
iteration#: 34, loss: 52.90129294246435
iteration#: 35, loss: 55.00731261074543
iteration#: 36, loss: 53.071109656244516
iteration#: 37, loss: 53.67683732137084
iteration#: 38, loss: 54.17810820043087
iteration#: 39, loss: 49.99514580518007
iteration#: 40, loss: 49.96600095927715
iteration#: 41, loss: 49.71589883044362
iteration#: 42, loss: 50.49418694153428
iteration#: 43, loss: 47.33111986704171
iteration#: 44, loss: 47.55682394653559
iteration#: 45, loss: 47.433820594102144
iteration#: 46, loss: 45.79797434806824
iteration#: 47, loss: 45.78519377857447
iteration#: 48, loss: 46.44497795775533
iteration#: 49, loss: 43.37494556233287
iteration#: 50, loss: 44.099109657108784
iteration#: 51, loss: 45.47817849740386
iteration#: 52, loss: 44.225241988897324
iteration#: 53, loss: 42.102620895951986
iteration#: 54, loss: 44.138226084411144
iteration#: 55, loss: 41.6612194776535
iteration#: 56, loss: 42.043484246358275
iteration#: 57, loss: 41.97770299948752
iteration#: 58, loss: 42.43217799440026
iteration#: 59, loss: 40.212553314864635
iteration#: 60, loss: 41.666214637458324
iteration#: 61, loss: 40.84178464487195
iteration#: 62, loss: 38.91881050914526
iteration#: 63, loss: 39.607658829540014
iteration#: 64, loss: 40.6425687558949
iteration#: 65, loss: 39.766071401536465
iteration#: 66, loss: 40.13560397922993
iteration#: 67, loss: 38.51617764309049
iteration#: 68, loss: 39.09016981907189
iteration#: 69, loss: 37.40770496428013
iteration#: 70, loss: 39.52282164245844
iteration#: 71, loss: 37.689951818436384
iteration#: 72, loss: 38.63494444359094
iteration#: 73, loss: 35.92030730098486
iteration#: 74, loss: 38.22083969414234
iteration#: 75, loss: 36.69814841076732
iteration#: 76, loss: 34.85340274870396
iteration#: 77, loss: 37.02121592871845
iteration#: 78, loss: 37.17175584100187
iteration#: 79, loss: 34.77896904479712
iteration#: 80, loss: 35.08971842750907
iteration#: 81, loss: 36.15607845596969
iteration#: 82, loss: 35.563974909484386
iteration#: 83, loss: 37.93343322724104
iteration#: 84, loss: 35.19352715462446
iteration#: 85, loss: 34.01389496028423
iteration#: 86, loss: 34.73526332899928
iteration#: 87, loss: 35.384004667401314
iteration#: 88, loss: 35.128996040672064
iteration#: 89, loss: 35.14112612232566
iteration#: 90, loss: 33.744155475869775
iteration#: 91, loss: 34.359025252982974
iteration#: 92, loss: 34.5803159289062
iteration#: 93, loss: 34.566614117473364
iteration#: 94, loss: 33.158173859119415
iteration#: 95, loss: 34.567622024565935
iteration#: 96, loss: 33.56074149161577
iteration#: 97, loss: 33.440386816859245
iteration#: 98, loss: 33.15666122082621
iteration#: 99, loss: 33.35540076158941
iteration#: 100, loss: 32.31053277011961
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.2291604272736348
Hit@10 = 0.405884995507637
MR = 937.4299440950384
MRR = 0.14917899870352636

Fil setting:
Hit@1 = 0.27979934112009586
Hit@3 = 0.39061096136567836
Hit@10 = 0.5278027353499052
MR = 913.5135769192373
MRR = 0.36328344097470155

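The validation numbers above are the standard link-prediction aggregates over the rank of each true tuple among its corruptions. A minimal sketch of how such figures are computed from a list of ranks (illustrative only; `ranking_metrics` is my own helper, not HypE's evaluation code):

```python
def ranking_metrics(ranks):
    """Aggregate 1-based ranks of true tuples into MR, MRR, and Hit@k."""
    n = len(ranks)
    return {
        "MR": sum(ranks) / n,                       # mean rank (lower is better)
        "MRR": sum(1.0 / r for r in ranks) / n,     # mean reciprocal rank
        "Hit@1": sum(r <= 1 for r in ranks) / n,    # fraction ranked first
        "Hit@3": sum(r <= 3 for r in ranks) / n,
        "Hit@10": sum(r <= 10 for r in ranks) / n,
    }
```

For example, ranks `[1, 2, 10, 100]` give MR = 28.25, MRR = 0.4025, and Hit@10 = 0.75.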
iteration#: 101, loss: 33.26325755007565
iteration#: 102, loss: 34.22918685525656
iteration#: 103, loss: 31.72119122557342
iteration#: 104, loss: 34.25393953919411
iteration#: 105, loss: 35.19353667087853
iteration#: 106, loss: 31.300839794799685
iteration#: 107, loss: 32.417203683406115
iteration#: 108, loss: 32.92384012788534
iteration#: 109, loss: 31.86550393141806
iteration#: 110, loss: 31.662774812430143
iteration#: 111, loss: 31.615627698600292
iteration#: 112, loss: 30.160429138690233
iteration#: 113, loss: 30.863088745623827
iteration#: 114, loss: 31.117492131888866
iteration#: 115, loss: 30.772704727947712
iteration#: 116, loss: 30.377160154283047
iteration#: 117, loss: 30.72225643042475
iteration#: 118, loss: 31.225141864269972
iteration#: 119, loss: 31.050309631973505
iteration#: 120, loss: 31.19096688553691
iteration#: 121, loss: 29.92217805981636
iteration#: 122, loss: 30.261887602508068
iteration#: 123, loss: 31.232261735014617
iteration#: 124, loss: 29.986971170641482
iteration#: 125, loss: 28.92918374761939
iteration#: 126, loss: 29.307744782418013
iteration#: 127, loss: 29.513292593881488
iteration#: 128, loss: 30.21643714979291
iteration#: 129, loss: 30.159386394545436
iteration#: 130, loss: 29.226475175470114
iteration#: 131, loss: 30.06484153866768
iteration#: 132, loss: 30.340365203097463
iteration#: 133, loss: 29.6525942645967
iteration#: 134, loss: 28.415171306580305
iteration#: 135, loss: 28.65402167290449
iteration#: 136, loss: 30.443494034931064
iteration#: 137, loss: 28.39379720017314
iteration#: 138, loss: 31.505824368912727
iteration#: 139, loss: 31.484795162454247
iteration#: 140, loss: 28.738601291552186
iteration#: 141, loss: 30.208784237504005
iteration#: 142, loss: 28.228824887424707
iteration#: 143, loss: 28.55437532067299
iteration#: 144, loss: 28.28537051472813
iteration#: 145, loss: 27.25414576753974
iteration#: 146, loss: 29.058412563055754
iteration#: 147, loss: 28.984738670289516
iteration#: 148, loss: 28.05106611084193
iteration#: 149, loss: 28.65531910955906
iteration#: 150, loss: 27.327295269817114
iteration#: 151, loss: 28.951372979208827
iteration#: 152, loss: 28.768770076334476
iteration#: 153, loss: 26.743937350809574
iteration#: 154, loss: 27.564233478158712
iteration#: 155, loss: 27.864890094846487
iteration#: 156, loss: 30.39787882193923
iteration#: 157, loss: 28.431658174842596
iteration#: 158, loss: 27.678705608472228
iteration#: 159, loss: 27.66184150055051
iteration#: 160, loss: 26.19655105425045
iteration#: 161, loss: 28.142703719437122
iteration#: 162, loss: 27.079211939126253
iteration#: 163, loss: 30.366659447550774
iteration#: 164, loss: 27.64105688035488
iteration#: 165, loss: 27.564809288363904
iteration#: 166, loss: 26.746821008622646
iteration#: 167, loss: 28.730659483000636
iteration#: 168, loss: 27.27083331719041
iteration#: 169, loss: 26.759906120598316
iteration#: 170, loss: 26.359628465026617
iteration#: 171, loss: 26.729565834626555
iteration#: 172, loss: 27.446469645947218
iteration#: 173, loss: 27.38310758769512
iteration#: 174, loss: 27.526637017726898
iteration#: 175, loss: 26.98622157610953
iteration#: 176, loss: 26.761791734956205
iteration#: 177, loss: 27.418844245374203
iteration#: 178, loss: 26.305560232140124
iteration#: 179, loss: 27.326188180595636
iteration#: 180, loss: 26.088891372084618
iteration#: 181, loss: 26.502641008235514
iteration#: 182, loss: 26.568427070975304
iteration#: 183, loss: 26.10986440628767
iteration#: 184, loss: 27.35618031769991
iteration#: 185, loss: 26.71378854289651
iteration#: 186, loss: 26.792690694332123
iteration#: 187, loss: 26.765573816373944
iteration#: 188, loss: 27.189598760567605
iteration#: 189, loss: 27.012643849477172
iteration#: 190, loss: 25.817803516983986
iteration#: 191, loss: 25.85440818965435
iteration#: 192, loss: 25.07075311988592
iteration#: 193, loss: 23.886949023231864
iteration#: 194, loss: 26.634819438681006
iteration#: 195, loss: 25.313688285648823
iteration#: 196, loss: 26.744353350251913
iteration#: 197, loss: 26.14979605935514
iteration#: 198, loss: 26.016125928144902
iteration#: 199, loss: 25.724481565877795
iteration#: 200, loss: 25.14638690277934
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.23649795347908556
Hit@10 = 0.42041030248577416
MR = 1000.9781371668164
MRR = 0.1539652263499597

Fil setting:
Hit@1 = 0.3131676150544075
Hit@3 = 0.4312418887890586
Hit@10 = 0.5641160027952481
MR = 976.2180043925327
MRR = 0.3979868198955892

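The gap between the two settings comes from filtering: in the filtered ("Fil") setting, corrupted candidates that are themselves known true tuples (in train, validation, or test) are removed before the true tuple is ranked, so its rank can only improve. A hedged sketch of that rank computation (`filtered_rank` is my own illustration, not the project's code):

```python
def filtered_rank(scores, true_idx, known_true):
    """1-based rank of the true candidate, skipping other known-true candidates.

    scores: candidate scores, higher = better.
    true_idx: index of the true candidate.
    known_true: indices of other candidates that are true tuples elsewhere
    in the data; pass an empty set to get the raw rank instead.
    """
    s_true = scores[true_idx]
    rank = 1
    for i, s in enumerate(scores):
        if i == true_idx or i in known_true:
            continue  # filtered out: does not compete with the true candidate
        if s > s_true:
            rank += 1
    return rank
```

With scores `[0.9, 0.8, 0.7, 0.6]` and the true candidate at index 2, the raw rank is 3, but filtering out candidate 0 (a known true tuple) improves it to 2.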
iteration#: 201, loss: 27.24712151940912
iteration#: 202, loss: 24.7819806933403
iteration#: 203, loss: 26.716864474117756
iteration#: 204, loss: 24.220598552376032
iteration#: 205, loss: 25.058776330202818
iteration#: 206, loss: 24.889723427593708
iteration#: 207, loss: 25.093551199883223
iteration#: 208, loss: 25.470895286649466
iteration#: 209, loss: 25.849243696779013
iteration#: 210, loss: 24.411983223631978
iteration#: 211, loss: 25.2604296291247
iteration#: 212, loss: 25.062909573316574
iteration#: 213, loss: 25.689380656927824
iteration#: 214, loss: 24.776960257440805
iteration#: 215, loss: 24.7105468865484
iteration#: 216, loss: 25.115280451253057
iteration#: 217, loss: 25.003646360710263
iteration#: 218, loss: 24.386205823160708
iteration#: 219, loss: 25.139376003295183
iteration#: 220, loss: 26.026214078068733
iteration#: 221, loss: 24.418052868917584
iteration#: 222, loss: 25.533744409680367
iteration#: 223, loss: 26.67623369395733
iteration#: 224, loss: 25.56923023238778
iteration#: 225, loss: 24.79012242332101
iteration#: 226, loss: 24.617675449699163
iteration#: 227, loss: 24.43231729604304
iteration#: 228, loss: 24.549676839262247
iteration#: 229, loss: 26.06411285791546
iteration#: 230, loss: 24.40025589382276
iteration#: 231, loss: 25.245934923179448
iteration#: 232, loss: 25.283985276240855
iteration#: 233, loss: 24.336508235894144
iteration#: 234, loss: 24.490268025081605
iteration#: 235, loss: 26.47953055659309
iteration#: 236, loss: 25.421675144694746
iteration#: 237, loss: 24.01611863076687
iteration#: 238, loss: 24.482236433774233
iteration#: 239, loss: 24.822535067796707
iteration#: 240, loss: 24.617661265656352
iteration#: 241, loss: 25.402254071086645
iteration#: 242, loss: 25.182959680445492
iteration#: 243, loss: 24.926196499727666
iteration#: 244, loss: 24.1664194855839
iteration#: 245, loss: 25.3267030864954
iteration#: 246, loss: 25.069063547067344
iteration#: 247, loss: 23.23500164039433
iteration#: 248, loss: 24.93844872713089
iteration#: 249, loss: 24.657083835452795
iteration#: 250, loss: 24.02684534341097
iteration#: 251, loss: 24.74254036694765
iteration#: 252, loss: 23.778330402448773
iteration#: 253, loss: 24.912817834876478
iteration#: 254, loss: 25.329655833542347
iteration#: 255, loss: 25.94052093103528
iteration#: 256, loss: 24.734104819595814
iteration#: 257, loss: 26.226534105837345
iteration#: 258, loss: 23.75828493386507
iteration#: 259, loss: 24.486495088785887
iteration#: 260, loss: 24.302721105515957
iteration#: 261, loss: 25.560199317522347
iteration#: 262, loss: 25.861542949220166
iteration#: 263, loss: 25.19694292731583
iteration#: 264, loss: 24.488210739567876
iteration#: 265, loss: 25.264429111033678
iteration#: 266, loss: 24.15079783136025
iteration#: 267, loss: 24.848060972988605
iteration#: 268, loss: 24.85860359482467
iteration#: 269, loss: 25.26806028187275
iteration#: 270, loss: 24.723592843860388
iteration#: 271, loss: 24.9519777931273
iteration#: 272, loss: 24.12641540542245
iteration#: 273, loss: 25.345917662605643
iteration#: 274, loss: 24.698667232878506
iteration#: 275, loss: 24.103543657809496
iteration#: 276, loss: 24.678639722056687
iteration#: 277, loss: 23.85308739542961
iteration#: 278, loss: 25.21232662163675
iteration#: 279, loss: 23.68872652295977
iteration#: 280, loss: 24.280228855088353
iteration#: 281, loss: 23.895063892006874
iteration#: 282, loss: 24.12961891386658
iteration#: 283, loss: 24.267005558591336
iteration#: 284, loss: 24.854911517351866
iteration#: 285, loss: 24.724048418924212
iteration#: 286, loss: 24.230520529672503
iteration#: 287, loss: 24.04946636594832
iteration#: 288, loss: 24.340305379591882
iteration#: 289, loss: 22.771409523207694
iteration#: 290, loss: 22.645989569835365
iteration#: 291, loss: 24.190760102123022
iteration#: 292, loss: 24.118768325075507
iteration#: 293, loss: 22.914921359624714
iteration#: 294, loss: 25.218876730650663
iteration#: 295, loss: 23.376327358186245
iteration#: 296, loss: 23.65202672034502
iteration#: 297, loss: 23.588180601596832
iteration#: 298, loss: 23.46082061715424
iteration#: 299, loss: 22.512943250127137
iteration#: 300, loss: 24.99040351808071
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.23851951682140363
Hit@10 = 0.4250773684735949
MR = 999.751547369472
MRR = 0.15537151787075468

Fil setting:
Hit@1 = 0.3315613457122891
Hit@3 = 0.45130777677947487
Hit@10 = 0.5833333333333334
MR = 973.5657132874114
MRR = 0.416801066038361

iteration#: 301, loss: 23.46126427873969
iteration#: 302, loss: 23.080572795122862
iteration#: 303, loss: 23.627124927937984
iteration#: 304, loss: 22.848930288106203
iteration#: 305, loss: 22.69973354972899
iteration#: 306, loss: 23.34152278956026
iteration#: 307, loss: 22.907369073480368
iteration#: 308, loss: 22.784663654863834
iteration#: 309, loss: 24.20392457768321
iteration#: 310, loss: 24.275926711969078
iteration#: 311, loss: 23.126015905290842
iteration#: 312, loss: 24.156770821660757
iteration#: 313, loss: 22.9060525521636
iteration#: 314, loss: 23.94069342687726
iteration#: 315, loss: 22.368323352187872
iteration#: 316, loss: 22.767602367326617
iteration#: 317, loss: 23.37242553755641
iteration#: 318, loss: 24.176541523076594
iteration#: 319, loss: 23.689086553640664
iteration#: 320, loss: 23.148982455953956
iteration#: 321, loss: 22.40683679538779
iteration#: 322, loss: 23.597058003302664
iteration#: 323, loss: 23.193390090949833
iteration#: 324, loss: 23.11877152696252
iteration#: 325, loss: 21.720063833519816
iteration#: 326, loss: 23.821586064994335
iteration#: 327, loss: 22.570988729130477
iteration#: 328, loss: 23.12326236255467
iteration#: 329, loss: 23.54500874504447
iteration#: 330, loss: 22.800207087770104
iteration#: 331, loss: 22.296832972206175
iteration#: 332, loss: 22.781476109288633
iteration#: 333, loss: 22.800559541210532
iteration#: 334, loss: 22.658214198425412
iteration#: 335, loss: 23.009371127933264
iteration#: 336, loss: 22.986757159233093
iteration#: 337, loss: 23.880425991490483
iteration#: 338, loss: 22.787995271384716
iteration#: 339, loss: 23.881447231397033
iteration#: 340, loss: 22.433480247855186
iteration#: 341, loss: 22.915081072831526
iteration#: 342, loss: 22.919296022038907
iteration#: 343, loss: 23.584480277262628
iteration#: 344, loss: 23.08296275511384
iteration#: 345, loss: 22.462437914684415
iteration#: 346, loss: 21.69634547457099
iteration#: 347, loss: 23.85763270407915
iteration#: 348, loss: 22.032596422359347
iteration#: 349, loss: 22.971270080655813
iteration#: 350, loss: 20.970440840814263
iteration#: 351, loss: 23.434675257652998
iteration#: 352, loss: 22.587820943444967
iteration#: 353, loss: 23.733249640092254
iteration#: 354, loss: 22.34412457048893
iteration#: 355, loss: 23.633487179875374
iteration#: 356, loss: 22.89092537946999
iteration#: 357, loss: 22.90736585855484
iteration#: 358, loss: 21.578827131539583
iteration#: 359, loss: 22.56746038980782
iteration#: 360, loss: 22.82948959618807
iteration#: 361, loss: 22.76260907854885
iteration#: 362, loss: 21.764938471838832
iteration#: 363, loss: 22.301545649766922
iteration#: 364, loss: 20.379208324477077
iteration#: 365, loss: 21.50777213834226
iteration#: 366, loss: 22.299156483262777
iteration#: 367, loss: 23.41532531939447
iteration#: 368, loss: 21.520611131563783
iteration#: 369, loss: 22.176325000007637
iteration#: 370, loss: 22.944751404225826
iteration#: 371, loss: 23.069116791710258
iteration#: 372, loss: 22.878415377810597
iteration#: 373, loss: 22.71334956213832
iteration#: 374, loss: 21.910293377470225
iteration#: 375, loss: 22.1777365738526
iteration#: 376, loss: 23.59987726714462
iteration#: 377, loss: 22.85042697377503
iteration#: 378, loss: 22.21125878766179
iteration#: 379, loss: 21.230747547000647
iteration#: 380, loss: 21.74002715945244
iteration#: 381, loss: 21.82493080943823
iteration#: 382, loss: 23.168578254058957
iteration#: 383, loss: 22.883175514638424
iteration#: 384, loss: 21.251771594397724
iteration#: 385, loss: 21.9423017911613
iteration#: 386, loss: 21.918760740489233
iteration#: 387, loss: 22.617916559334844
iteration#: 388, loss: 23.02409655181691
iteration#: 389, loss: 22.426793751539662
iteration#: 390, loss: 22.259402617812157
iteration#: 391, loss: 21.917677057906985
iteration#: 392, loss: 22.443641448393464
iteration#: 393, loss: 21.6094628428109
iteration#: 394, loss: 21.239283117000014
iteration#: 395, loss: 21.214492505649105
iteration#: 396, loss: 22.3387345764786
iteration#: 397, loss: 22.484913914930075
iteration#: 398, loss: 20.789034124463797
iteration#: 399, loss: 20.889675866812468
iteration#: 400, loss: 21.43484972976148
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.24011680143755615
Hit@10 = 0.4281221922731357
MR = 1020.8772337027054
MRR = 0.1556742550537393

Fil setting:
Hit@1 = 0.3391484476390137
Hit@3 = 0.45874513327343514
Hit@10 = 0.5902465808126185
MR = 994.0474942597584
MRR = 0.42401144467347296

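By this stage the per-iteration loss oscillates in the low twenties, so individual values are noisy and the trend is easier to judge from a smoothed curve. A minimal trailing-moving-average helper (my own sketch, not part of the training script):

```python
def moving_average(values, window=10):
    """Trailing moving average over a list of per-iteration losses.

    Early entries average over however many values are available,
    so the output has the same length as the input.
    """
    out = []
    for i in range(len(values)):
        lo = max(0, i - window + 1)
        chunk = values[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

For example, `moving_average([1, 2, 3, 4], window=2)` yields `[1.0, 1.5, 2.5, 3.5]`.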
iteration#: 401, loss: 23.249696318991482
iteration#: 402, loss: 21.19429254811257
iteration#: 403, loss: 21.800255993381143
iteration#: 404, loss: 23.13605154119432
iteration#: 405, loss: 21.8390542101115
iteration#: 406, loss: 23.006552616134286
iteration#: 407, loss: 21.65937490016222
iteration#: 408, loss: 20.897103879600763
iteration#: 409, loss: 21.163311316631734
iteration#: 410, loss: 21.761389765888453
iteration#: 411, loss: 21.744938909076154
iteration#: 412, loss: 21.104854705045
iteration#: 413, loss: 22.478530399501324
iteration#: 414, loss: 22.180987920612097
iteration#: 415, loss: 22.460816520266235
iteration#: 416, loss: 22.103346420452
iteration#: 417, loss: 20.592110466212034
iteration#: 418, loss: 21.77649979107082
iteration#: 419, loss: 21.847989186644554
iteration#: 420, loss: 22.98958497494459
iteration#: 421, loss: 21.422638792544603
iteration#: 422, loss: 22.31363247986883
iteration#: 423, loss: 22.286662636324763
iteration#: 424, loss: 21.947178239002824
iteration#: 425, loss: 22.80321466177702
iteration#: 426, loss: 22.49919576756656
iteration#: 427, loss: 21.44191326573491
iteration#: 428, loss: 21.49029839783907
iteration#: 429, loss: 22.7541650403291
iteration#: 430, loss: 20.59069173783064
iteration#: 431, loss: 21.9158752579242
iteration#: 432, loss: 21.397590935230255
iteration#: 433, loss: 20.64666796848178
iteration#: 434, loss: 21.702178256586194
iteration#: 435, loss: 21.354624032974243
iteration#: 436, loss: 21.242575827986002
iteration#: 437, loss: 22.129924602806568
iteration#: 438, loss: 21.949727043393068
iteration#: 439, loss: 22.98925157636404
iteration#: 440, loss: 21.90987526997924
iteration#: 441, loss: 20.93463759869337
iteration#: 442, loss: 21.083083886653185
iteration#: 443, loss: 21.40187248773873
iteration#: 444, loss: 20.83390999957919
iteration#: 445, loss: 20.821795403957367
iteration#: 446, loss: 21.842540256679058
iteration#: 447, loss: 21.12655450310558
iteration#: 448, loss: 20.486725889146328
iteration#: 449, loss: 21.199680435471237
iteration#: 450, loss: 21.91143242502585
iteration#: 451, loss: 21.094152233563364
iteration#: 452, loss: 21.61913809971884
iteration#: 453, loss: 20.73567051347345
iteration#: 454, loss: 21.349566344171762
iteration#: 455, loss: 21.358126300387084
iteration#: 456, loss: 21.32631280273199
iteration#: 457, loss: 21.162315514869988
iteration#: 458, loss: 20.479331642389297
iteration#: 459, loss: 20.64435400441289
iteration#: 460, loss: 20.058593570254743
iteration#: 461, loss: 20.16186575172469
iteration#: 462, loss: 20.42991684190929
iteration#: 463, loss: 21.380917229689658
iteration#: 464, loss: 21.952415805775672
iteration#: 465, loss: 21.204546571709216
iteration#: 466, loss: 21.117643024772406
iteration#: 467, loss: 20.96833833679557
iteration#: 468, loss: 21.598797030746937
iteration#: 469, loss: 21.566693725064397
iteration#: 470, loss: 21.01600386854261
iteration#: 471, loss: 21.755610020831227
iteration#: 472, loss: 20.88640418369323
iteration#: 473, loss: 21.967098671011627
iteration#: 474, loss: 22.052843501791358
iteration#: 475, loss: 21.59802544489503
iteration#: 476, loss: 21.497184867272153
iteration#: 477, loss: 20.564411368221045
iteration#: 478, loss: 21.266166856512427
iteration#: 479, loss: 20.8535547144711
iteration#: 480, loss: 21.277395579963923
iteration#: 481, loss: 20.365421837195754
iteration#: 482, loss: 21.97977838292718
iteration#: 483, loss: 22.33593601733446
iteration#: 484, loss: 19.664362984593026
iteration#: 485, loss: 20.15722167585045
iteration#: 486, loss: 19.909053412266076
iteration#: 487, loss: 20.92108030617237
iteration#: 488, loss: 21.175014791078866
iteration#: 489, loss: 21.17112911399454
iteration#: 490, loss: 20.74111422151327
iteration#: 491, loss: 21.348958572372794
iteration#: 492, loss: 20.988203558139503
iteration#: 493, loss: 21.218574207276106
iteration#: 494, loss: 20.588658027350903
iteration#: 495, loss: 21.39802703820169
iteration#: 496, loss: 21.00297266198322
iteration#: 497, loss: 21.58903961442411
iteration#: 498, loss: 20.839866237714887
iteration#: 499, loss: 22.32119382917881
iteration#: 500, loss: 21.06327011436224
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.23891883797544175
Hit@10 = 0.42904562244184885
MR = 1026.381476489967
MRR = 0.1556658803334297

Fil setting:
Hit@1 = 0.3412698412698413
Hit@3 = 0.46508435659379055
Hit@10 = 0.5954127982429869
MR = 999.2251422581611
MRR = 0.42800969227445673

test in iteration 500:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.3019202320097257
Hit@10 = 0.5003442063480439
MR = 843.954696585766
MRR = 0.18622815834027878

Fil setting:
Hit@1 = 0.3958519473291052
Hit@3 = 0.5254346520586469
Hit@10 = 0.6423329866858055
MR = 818.7834722364624
MRR = 0.4821113873262497
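A log in this format is easy to post-process. A small parser that pulls the loss curve out of the raw text (the regex assumes exactly the `iteration#: N, loss: X` line format shown above):

```python
import re

# Matches lines like "iteration#: 1, loss: 1128.5969721078873"
ITER_RE = re.compile(r"iteration#:\s*(\d+),\s*loss:\s*([\d.]+)")

def parse_losses(log_text):
    """Extract (iteration, loss) pairs from the training log text."""
    return [(int(m.group(1)), float(m.group(2)))
            for m in ITER_RE.finditer(log_text)]
```

The resulting list of pairs can then be plotted or smoothed to inspect convergence.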