- Training the HypE model...
- Number of training data points: 61911
- Starting training...
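(The log that follows comes from training HypE, a knowledge-hypergraph embedding model, on 61911 training tuples. As a rough orientation only, the sketch below shows the general shape of a HypE-style scorer: the relation and each entity in a tuple get an embedding, each entity embedding is passed through a transform that depends on its position in the tuple (HypE uses position-specific convolutional filters; a plain linear map stands in for them here), and the tuple score is the sum over dimensions of the element-wise product of all transformed embeddings. TupleScorer and every hyperparameter below are illustrative assumptions, not the released implementation.

import torch
import torch.nn as nn

class TupleScorer(nn.Module):
    # Hypothetical sketch of a HypE-style multilinear scorer, not the authors' code.
    def __init__(self, n_entities, n_relations, dim=200, max_arity=6):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)
        # One learned transform per tuple position; a stand-in for HypE's
        # position-specific convolutional filters.
        self.pos = nn.ModuleList(nn.Linear(dim, dim, bias=False)
                                 for _ in range(max_arity))

    def forward(self, r_idx, ent_idx):
        # r_idx: (batch,) relation ids; ent_idx: (batch, arity) entity ids.
        prod = self.rel(r_idx)                                   # (batch, dim)
        for p in range(ent_idx.size(1)):
            prod = prod * self.pos[p](self.ent(ent_idx[:, p]))   # element-wise product
        return prod.sum(dim=-1)                                  # (batch,) tuple scores

Training would then minimize some loss over true tuples and sampled corruptions of them; the per-iteration loss values printed below are presumably such a quantity accumulated over the training batches.)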
- iteration#: 1, loss: 1128.5969721078873
- iteration#: 2, loss: 589.3206909894943
- iteration#: 3, loss: 352.0712777376175
- iteration#: 4, loss: 246.19432146847248
- iteration#: 5, loss: 202.48553057014942
- iteration#: 6, loss: 165.68770626187325
- iteration#: 7, loss: 145.24890618771315
- iteration#: 8, loss: 132.29567590355873
- iteration#: 9, loss: 122.3706817701459
- iteration#: 10, loss: 110.28342608362436
- iteration#: 11, loss: 103.30189225077629
- iteration#: 12, loss: 98.75038635358214
- iteration#: 13, loss: 93.63530520349741
- iteration#: 14, loss: 92.06604382395744
- iteration#: 15, loss: 83.64566880837083
- iteration#: 16, loss: 79.78967126086354
- iteration#: 17, loss: 77.55824789777398
- iteration#: 18, loss: 78.40112666413188
- iteration#: 19, loss: 76.36364080756903
- iteration#: 20, loss: 73.5182331353426
- iteration#: 21, loss: 72.82811096310616
- iteration#: 22, loss: 68.37139315903187
- iteration#: 23, loss: 67.71940031647682
- iteration#: 24, loss: 66.86391191929579
- iteration#: 25, loss: 66.00029823556542
- iteration#: 26, loss: 63.77029738947749
- iteration#: 27, loss: 62.39784202352166
- iteration#: 28, loss: 60.96891175955534
- iteration#: 29, loss: 61.461283415555954
- iteration#: 30, loss: 57.59357615932822
- iteration#: 31, loss: 56.80059587955475
- iteration#: 32, loss: 55.279016226530075
- iteration#: 33, loss: 53.79471708461642
- iteration#: 34, loss: 52.90129294246435
- iteration#: 35, loss: 55.00731261074543
- iteration#: 36, loss: 53.071109656244516
- iteration#: 37, loss: 53.67683732137084
- iteration#: 38, loss: 54.17810820043087
- iteration#: 39, loss: 49.99514580518007
- iteration#: 40, loss: 49.96600095927715
- iteration#: 41, loss: 49.71589883044362
- iteration#: 42, loss: 50.49418694153428
- iteration#: 43, loss: 47.33111986704171
- iteration#: 44, loss: 47.55682394653559
- iteration#: 45, loss: 47.433820594102144
- iteration#: 46, loss: 45.79797434806824
- iteration#: 47, loss: 45.78519377857447
- iteration#: 48, loss: 46.44497795775533
- iteration#: 49, loss: 43.37494556233287
- iteration#: 50, loss: 44.099109657108784
- iteration#: 51, loss: 45.47817849740386
- iteration#: 52, loss: 44.225241988897324
- iteration#: 53, loss: 42.102620895951986
- iteration#: 54, loss: 44.138226084411144
- iteration#: 55, loss: 41.6612194776535
- iteration#: 56, loss: 42.043484246358275
- iteration#: 57, loss: 41.97770299948752
- iteration#: 58, loss: 42.43217799440026
- iteration#: 59, loss: 40.212553314864635
- iteration#: 60, loss: 41.666214637458324
- iteration#: 61, loss: 40.84178464487195
- iteration#: 62, loss: 38.91881050914526
- iteration#: 63, loss: 39.607658829540014
- iteration#: 64, loss: 40.6425687558949
- iteration#: 65, loss: 39.766071401536465
- iteration#: 66, loss: 40.13560397922993
- iteration#: 67, loss: 38.51617764309049
- iteration#: 68, loss: 39.09016981907189
- iteration#: 69, loss: 37.40770496428013
- iteration#: 70, loss: 39.52282164245844
- iteration#: 71, loss: 37.689951818436384
- iteration#: 72, loss: 38.63494444359094
- iteration#: 73, loss: 35.92030730098486
- iteration#: 74, loss: 38.22083969414234
- iteration#: 75, loss: 36.69814841076732
- iteration#: 76, loss: 34.85340274870396
- iteration#: 77, loss: 37.02121592871845
- iteration#: 78, loss: 37.17175584100187
- iteration#: 79, loss: 34.77896904479712
- iteration#: 80, loss: 35.08971842750907
- iteration#: 81, loss: 36.15607845596969
- iteration#: 82, loss: 35.563974909484386
- iteration#: 83, loss: 37.93343322724104
- iteration#: 84, loss: 35.19352715462446
- iteration#: 85, loss: 34.01389496028423
- iteration#: 86, loss: 34.73526332899928
- iteration#: 87, loss: 35.384004667401314
- iteration#: 88, loss: 35.128996040672064
- iteration#: 89, loss: 35.14112612232566
- iteration#: 90, loss: 33.744155475869775
- iteration#: 91, loss: 34.359025252982974
- iteration#: 92, loss: 34.5803159289062
- iteration#: 93, loss: 34.566614117473364
- iteration#: 94, loss: 33.158173859119415
- iteration#: 95, loss: 34.567622024565935
- iteration#: 96, loss: 33.56074149161577
- iteration#: 97, loss: 33.440386816859245
- iteration#: 98, loss: 33.15666122082621
- iteration#: 99, loss: 33.35540076158941
- iteration#: 100, loss: 32.31053277011961
- validation:
- Raw setting:
- Hit@1 = 0.0
- Hit@3 = 0.2291604272736348
- Hit@10 = 0.405884995507637
- MR = 937.4299440950384
- MRR = 0.14917899870352636
- Fil setting:
- Hit@1 = 0.27979934112009586
- Hit@3 = 0.39061096136567836
- Hit@10 = 0.5278027353499052
- MR = 913.5135769192373
- MRR = 0.36328344097470155
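(For reference, the validation block above reports standard link-prediction metrics: Hit@k is the fraction of queries whose correct entity ranks in the top k, MR is the mean rank, and MRR the mean reciprocal rank. The "Raw setting" ranks the correct entity against all candidate corruptions, while the "Fil" (filtered) setting first discards corruptions that happen to form other known true tuples. Below is a minimal sketch of that computation; score_fn, entities, and known_true are hypothetical stand-ins for the model's scorer, the entity vocabulary, and the set of tuples seen in train/valid/test, and this is not the evaluation code that produced the numbers above.

import numpy as np

def rank_of_correct(scores, correct_idx):
    # 1-based rank of the correct entity under descending scores.
    order = np.argsort(-scores)
    return int(np.where(order == correct_idx)[0][0]) + 1

def evaluate(test_tuples, score_fn, entities, known_true, filtered):
    ranks = []
    for tup in test_tuples:                  # tup = (relation, e1, ..., en)
        for pos in range(1, len(tup)):       # corrupt each entity position in turn
            scores = np.array(
                [score_fn(tup[:pos] + (e,) + tup[pos + 1:]) for e in entities],
                dtype=float)
            if filtered:
                # Drop candidates that form another known true tuple,
                # keeping the target tuple itself.
                for i, e in enumerate(entities):
                    cand = tup[:pos] + (e,) + tup[pos + 1:]
                    if cand != tup and cand in known_true:
                        scores[i] = -np.inf
            ranks.append(rank_of_correct(scores, entities.index(tup[pos])))
    ranks = np.array(ranks, dtype=float)
    return {"Hit@1": (ranks <= 1).mean(), "Hit@3": (ranks <= 3).mean(),
            "Hit@10": (ranks <= 10).mean(),
            "MR": ranks.mean(), "MRR": (1.0 / ranks).mean()}

Because the filtered setting only removes competing true tuples, its Hit@k and MRR are always at least as high as the raw ones, which matches the pattern in every validation block of this log.)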
- iteration#: 101, loss: 33.26325755007565
- iteration#: 102, loss: 34.22918685525656
- iteration#: 103, loss: 31.72119122557342
- iteration#: 104, loss: 34.25393953919411
- iteration#: 105, loss: 35.19353667087853
- iteration#: 106, loss: 31.300839794799685
- iteration#: 107, loss: 32.417203683406115
- iteration#: 108, loss: 32.92384012788534
- iteration#: 109, loss: 31.86550393141806
- iteration#: 110, loss: 31.662774812430143
- iteration#: 111, loss: 31.615627698600292
- iteration#: 112, loss: 30.160429138690233
- iteration#: 113, loss: 30.863088745623827
- iteration#: 114, loss: 31.117492131888866
- iteration#: 115, loss: 30.772704727947712
- iteration#: 116, loss: 30.377160154283047
- iteration#: 117, loss: 30.72225643042475
- iteration#: 118, loss: 31.225141864269972
- iteration#: 119, loss: 31.050309631973505
- iteration#: 120, loss: 31.19096688553691
- iteration#: 121, loss: 29.92217805981636
- iteration#: 122, loss: 30.261887602508068
- iteration#: 123, loss: 31.232261735014617
- iteration#: 124, loss: 29.986971170641482
- iteration#: 125, loss: 28.92918374761939
- iteration#: 126, loss: 29.307744782418013
- iteration#: 127, loss: 29.513292593881488
- iteration#: 128, loss: 30.21643714979291
- iteration#: 129, loss: 30.159386394545436
- iteration#: 130, loss: 29.226475175470114
- iteration#: 131, loss: 30.06484153866768
- iteration#: 132, loss: 30.340365203097463
- iteration#: 133, loss: 29.6525942645967
- iteration#: 134, loss: 28.415171306580305
- iteration#: 135, loss: 28.65402167290449
- iteration#: 136, loss: 30.443494034931064
- iteration#: 137, loss: 28.39379720017314
- iteration#: 138, loss: 31.505824368912727
- iteration#: 139, loss: 31.484795162454247
- iteration#: 140, loss: 28.738601291552186
- iteration#: 141, loss: 30.208784237504005
- iteration#: 142, loss: 28.228824887424707
- iteration#: 143, loss: 28.55437532067299
- iteration#: 144, loss: 28.28537051472813
- iteration#: 145, loss: 27.25414576753974
- iteration#: 146, loss: 29.058412563055754
- iteration#: 147, loss: 28.984738670289516
- iteration#: 148, loss: 28.05106611084193
- iteration#: 149, loss: 28.65531910955906
- iteration#: 150, loss: 27.327295269817114
- iteration#: 151, loss: 28.951372979208827
- iteration#: 152, loss: 28.768770076334476
- iteration#: 153, loss: 26.743937350809574
- iteration#: 154, loss: 27.564233478158712
- iteration#: 155, loss: 27.864890094846487
- iteration#: 156, loss: 30.39787882193923
- iteration#: 157, loss: 28.431658174842596
- iteration#: 158, loss: 27.678705608472228
- iteration#: 159, loss: 27.66184150055051
- iteration#: 160, loss: 26.19655105425045
- iteration#: 161, loss: 28.142703719437122
- iteration#: 162, loss: 27.079211939126253
- iteration#: 163, loss: 30.366659447550774
- iteration#: 164, loss: 27.64105688035488
- iteration#: 165, loss: 27.564809288363904
- iteration#: 166, loss: 26.746821008622646
- iteration#: 167, loss: 28.730659483000636
- iteration#: 168, loss: 27.27083331719041
- iteration#: 169, loss: 26.759906120598316
- iteration#: 170, loss: 26.359628465026617
- iteration#: 171, loss: 26.729565834626555
- iteration#: 172, loss: 27.446469645947218
- iteration#: 173, loss: 27.38310758769512
- iteration#: 174, loss: 27.526637017726898
- iteration#: 175, loss: 26.98622157610953
- iteration#: 176, loss: 26.761791734956205
- iteration#: 177, loss: 27.418844245374203
- iteration#: 178, loss: 26.305560232140124
- iteration#: 179, loss: 27.326188180595636
- iteration#: 180, loss: 26.088891372084618
- iteration#: 181, loss: 26.502641008235514
- iteration#: 182, loss: 26.568427070975304
- iteration#: 183, loss: 26.10986440628767
- iteration#: 184, loss: 27.35618031769991
- iteration#: 185, loss: 26.71378854289651
- iteration#: 186, loss: 26.792690694332123
- iteration#: 187, loss: 26.765573816373944
- iteration#: 188, loss: 27.189598760567605
- iteration#: 189, loss: 27.012643849477172
- iteration#: 190, loss: 25.817803516983986
- iteration#: 191, loss: 25.85440818965435
- iteration#: 192, loss: 25.07075311988592
- iteration#: 193, loss: 23.886949023231864
- iteration#: 194, loss: 26.634819438681006
- iteration#: 195, loss: 25.313688285648823
- iteration#: 196, loss: 26.744353350251913
- iteration#: 197, loss: 26.14979605935514
- iteration#: 198, loss: 26.016125928144902
- iteration#: 199, loss: 25.724481565877795
- iteration#: 200, loss: 25.14638690277934
- validation:
- Raw setting:
- Hit@1 = 0.0
- Hit@3 = 0.23649795347908556
- Hit@10 = 0.42041030248577416
- MR = 1000.9781371668164
- MRR = 0.1539652263499597
- Fil setting:
- Hit@1 = 0.3131676150544075
- Hit@3 = 0.4312418887890586
- Hit@10 = 0.5641160027952481
- MR = 976.2180043925327
- MRR = 0.3979868198955892
- iteration#: 201, loss: 27.24712151940912
- iteration#: 202, loss: 24.7819806933403
- iteration#: 203, loss: 26.716864474117756
- iteration#: 204, loss: 24.220598552376032
- iteration#: 205, loss: 25.058776330202818
- iteration#: 206, loss: 24.889723427593708
- iteration#: 207, loss: 25.093551199883223
- iteration#: 208, loss: 25.470895286649466
- iteration#: 209, loss: 25.849243696779013
- iteration#: 210, loss: 24.411983223631978
- iteration#: 211, loss: 25.2604296291247
- iteration#: 212, loss: 25.062909573316574
- iteration#: 213, loss: 25.689380656927824
- iteration#: 214, loss: 24.776960257440805
- iteration#: 215, loss: 24.7105468865484
- iteration#: 216, loss: 25.115280451253057
- iteration#: 217, loss: 25.003646360710263
- iteration#: 218, loss: 24.386205823160708
- iteration#: 219, loss: 25.139376003295183
- iteration#: 220, loss: 26.026214078068733
- iteration#: 221, loss: 24.418052868917584
- iteration#: 222, loss: 25.533744409680367
- iteration#: 223, loss: 26.67623369395733
- iteration#: 224, loss: 25.56923023238778
- iteration#: 225, loss: 24.79012242332101
- iteration#: 226, loss: 24.617675449699163
- iteration#: 227, loss: 24.43231729604304
- iteration#: 228, loss: 24.549676839262247
- iteration#: 229, loss: 26.06411285791546
- iteration#: 230, loss: 24.40025589382276
- iteration#: 231, loss: 25.245934923179448
- iteration#: 232, loss: 25.283985276240855
- iteration#: 233, loss: 24.336508235894144
- iteration#: 234, loss: 24.490268025081605
- iteration#: 235, loss: 26.47953055659309
- iteration#: 236, loss: 25.421675144694746
- iteration#: 237, loss: 24.01611863076687
- iteration#: 238, loss: 24.482236433774233
- iteration#: 239, loss: 24.822535067796707
- iteration#: 240, loss: 24.617661265656352
- iteration#: 241, loss: 25.402254071086645
- iteration#: 242, loss: 25.182959680445492
- iteration#: 243, loss: 24.926196499727666
- iteration#: 244, loss: 24.1664194855839
- iteration#: 245, loss: 25.3267030864954
- iteration#: 246, loss: 25.069063547067344
- iteration#: 247, loss: 23.23500164039433
- iteration#: 248, loss: 24.93844872713089
- iteration#: 249, loss: 24.657083835452795
- iteration#: 250, loss: 24.02684534341097
- iteration#: 251, loss: 24.74254036694765
- iteration#: 252, loss: 23.778330402448773
- iteration#: 253, loss: 24.912817834876478
- iteration#: 254, loss: 25.329655833542347
- iteration#: 255, loss: 25.94052093103528
- iteration#: 256, loss: 24.734104819595814
- iteration#: 257, loss: 26.226534105837345
- iteration#: 258, loss: 23.75828493386507
- iteration#: 259, loss: 24.486495088785887
- iteration#: 260, loss: 24.302721105515957
- iteration#: 261, loss: 25.560199317522347
- iteration#: 262, loss: 25.861542949220166
- iteration#: 263, loss: 25.19694292731583
- iteration#: 264, loss: 24.488210739567876
- iteration#: 265, loss: 25.264429111033678
- iteration#: 266, loss: 24.15079783136025
- iteration#: 267, loss: 24.848060972988605
- iteration#: 268, loss: 24.85860359482467
- iteration#: 269, loss: 25.26806028187275
- iteration#: 270, loss: 24.723592843860388
- iteration#: 271, loss: 24.9519777931273
- iteration#: 272, loss: 24.12641540542245
- iteration#: 273, loss: 25.345917662605643
- iteration#: 274, loss: 24.698667232878506
- iteration#: 275, loss: 24.103543657809496
- iteration#: 276, loss: 24.678639722056687
- iteration#: 277, loss: 23.85308739542961
- iteration#: 278, loss: 25.21232662163675
- iteration#: 279, loss: 23.68872652295977
- iteration#: 280, loss: 24.280228855088353
- iteration#: 281, loss: 23.895063892006874
- iteration#: 282, loss: 24.12961891386658
- iteration#: 283, loss: 24.267005558591336
- iteration#: 284, loss: 24.854911517351866
- iteration#: 285, loss: 24.724048418924212
- iteration#: 286, loss: 24.230520529672503
- iteration#: 287, loss: 24.04946636594832
- iteration#: 288, loss: 24.340305379591882
- iteration#: 289, loss: 22.771409523207694
- iteration#: 290, loss: 22.645989569835365
- iteration#: 291, loss: 24.190760102123022
- iteration#: 292, loss: 24.118768325075507
- iteration#: 293, loss: 22.914921359624714
- iteration#: 294, loss: 25.218876730650663
- iteration#: 295, loss: 23.376327358186245
- iteration#: 296, loss: 23.65202672034502
- iteration#: 297, loss: 23.588180601596832
- iteration#: 298, loss: 23.46082061715424
- iteration#: 299, loss: 22.512943250127137
- iteration#: 300, loss: 24.99040351808071
- validation:
- Raw setting:
- Hit@1 = 0.0
- Hit@3 = 0.23851951682140363
- Hit@10 = 0.4250773684735949
- MR = 999.751547369472
- MRR = 0.15537151787075468
- Fil setting:
- Hit@1 = 0.3315613457122891
- Hit@3 = 0.45130777677947487
- Hit@10 = 0.5833333333333334
- MR = 973.5657132874114
- MRR = 0.416801066038361
- iteration#: 301, loss: 23.46126427873969
- iteration#: 302, loss: 23.080572795122862
- iteration#: 303, loss: 23.627124927937984
- iteration#: 304, loss: 22.848930288106203
- iteration#: 305, loss: 22.69973354972899
- iteration#: 306, loss: 23.34152278956026
- iteration#: 307, loss: 22.907369073480368
- iteration#: 308, loss: 22.784663654863834
- iteration#: 309, loss: 24.20392457768321
- iteration#: 310, loss: 24.275926711969078
- iteration#: 311, loss: 23.126015905290842
- iteration#: 312, loss: 24.156770821660757
- iteration#: 313, loss: 22.9060525521636
- iteration#: 314, loss: 23.94069342687726
- iteration#: 315, loss: 22.368323352187872
- iteration#: 316, loss: 22.767602367326617
- iteration#: 317, loss: 23.37242553755641
- iteration#: 318, loss: 24.176541523076594
- iteration#: 319, loss: 23.689086553640664
- iteration#: 320, loss: 23.148982455953956
- iteration#: 321, loss: 22.40683679538779
- iteration#: 322, loss: 23.597058003302664
- iteration#: 323, loss: 23.193390090949833
- iteration#: 324, loss: 23.11877152696252
- iteration#: 325, loss: 21.720063833519816
- iteration#: 326, loss: 23.821586064994335
- iteration#: 327, loss: 22.570988729130477
- iteration#: 328, loss: 23.12326236255467
- iteration#: 329, loss: 23.54500874504447
- iteration#: 330, loss: 22.800207087770104
- iteration#: 331, loss: 22.296832972206175
- iteration#: 332, loss: 22.781476109288633
- iteration#: 333, loss: 22.800559541210532
- iteration#: 334, loss: 22.658214198425412
- iteration#: 335, loss: 23.009371127933264
- iteration#: 336, loss: 22.986757159233093
- iteration#: 337, loss: 23.880425991490483
- iteration#: 338, loss: 22.787995271384716
- iteration#: 339, loss: 23.881447231397033
- iteration#: 340, loss: 22.433480247855186
- iteration#: 341, loss: 22.915081072831526
- iteration#: 342, loss: 22.919296022038907
- iteration#: 343, loss: 23.584480277262628
- iteration#: 344, loss: 23.08296275511384
- iteration#: 345, loss: 22.462437914684415
- iteration#: 346, loss: 21.69634547457099
- iteration#: 347, loss: 23.85763270407915
- iteration#: 348, loss: 22.032596422359347
- iteration#: 349, loss: 22.971270080655813
- iteration#: 350, loss: 20.970440840814263
- iteration#: 351, loss: 23.434675257652998
- iteration#: 352, loss: 22.587820943444967
- iteration#: 353, loss: 23.733249640092254
- iteration#: 354, loss: 22.34412457048893
- iteration#: 355, loss: 23.633487179875374
- iteration#: 356, loss: 22.89092537946999
- iteration#: 357, loss: 22.90736585855484
- iteration#: 358, loss: 21.578827131539583
- iteration#: 359, loss: 22.56746038980782
- iteration#: 360, loss: 22.82948959618807
- iteration#: 361, loss: 22.76260907854885
- iteration#: 362, loss: 21.764938471838832
- iteration#: 363, loss: 22.301545649766922
- iteration#: 364, loss: 20.379208324477077
- iteration#: 365, loss: 21.50777213834226
- iteration#: 366, loss: 22.299156483262777
- iteration#: 367, loss: 23.41532531939447
- iteration#: 368, loss: 21.520611131563783
- iteration#: 369, loss: 22.176325000007637
- iteration#: 370, loss: 22.944751404225826
- iteration#: 371, loss: 23.069116791710258
- iteration#: 372, loss: 22.878415377810597
- iteration#: 373, loss: 22.71334956213832
- iteration#: 374, loss: 21.910293377470225
- iteration#: 375, loss: 22.1777365738526
- iteration#: 376, loss: 23.59987726714462
- iteration#: 377, loss: 22.85042697377503
- iteration#: 378, loss: 22.21125878766179
- iteration#: 379, loss: 21.230747547000647
- iteration#: 380, loss: 21.74002715945244
- iteration#: 381, loss: 21.82493080943823
- iteration#: 382, loss: 23.168578254058957
- iteration#: 383, loss: 22.883175514638424
- iteration#: 384, loss: 21.251771594397724
- iteration#: 385, loss: 21.9423017911613
- iteration#: 386, loss: 21.918760740489233
- iteration#: 387, loss: 22.617916559334844
- iteration#: 388, loss: 23.02409655181691
- iteration#: 389, loss: 22.426793751539662
- iteration#: 390, loss: 22.259402617812157
- iteration#: 391, loss: 21.917677057906985
- iteration#: 392, loss: 22.443641448393464
- iteration#: 393, loss: 21.6094628428109
- iteration#: 394, loss: 21.239283117000014
- iteration#: 395, loss: 21.214492505649105
- iteration#: 396, loss: 22.3387345764786
- iteration#: 397, loss: 22.484913914930075
- iteration#: 398, loss: 20.789034124463797
- iteration#: 399, loss: 20.889675866812468
- iteration#: 400, loss: 21.43484972976148
- validation:
- Raw setting:
- Hit@1 = 0.0
- Hit@3 = 0.24011680143755615
- Hit@10 = 0.4281221922731357
- MR = 1020.8772337027054
- MRR = 0.1556742550537393
- Fil setting:
- Hit@1 = 0.3391484476390137
- Hit@3 = 0.45874513327343514
- Hit@10 = 0.5902465808126185
- MR = 994.0474942597584
- MRR = 0.42401144467347296
- iteration#: 401, loss: 23.249696318991482
- iteration#: 402, loss: 21.19429254811257
- iteration#: 403, loss: 21.800255993381143
- iteration#: 404, loss: 23.13605154119432
- iteration#: 405, loss: 21.8390542101115
- iteration#: 406, loss: 23.006552616134286
- iteration#: 407, loss: 21.65937490016222
- iteration#: 408, loss: 20.897103879600763
- iteration#: 409, loss: 21.163311316631734
- iteration#: 410, loss: 21.761389765888453
- iteration#: 411, loss: 21.744938909076154
- iteration#: 412, loss: 21.104854705045
- iteration#: 413, loss: 22.478530399501324
- iteration#: 414, loss: 22.180987920612097
- iteration#: 415, loss: 22.460816520266235
- iteration#: 416, loss: 22.103346420452
- iteration#: 417, loss: 20.592110466212034
- iteration#: 418, loss: 21.77649979107082
- iteration#: 419, loss: 21.847989186644554
- iteration#: 420, loss: 22.98958497494459
- iteration#: 421, loss: 21.422638792544603
- iteration#: 422, loss: 22.31363247986883
- iteration#: 423, loss: 22.286662636324763
- iteration#: 424, loss: 21.947178239002824
- iteration#: 425, loss: 22.80321466177702
- iteration#: 426, loss: 22.49919576756656
- iteration#: 427, loss: 21.44191326573491
- iteration#: 428, loss: 21.49029839783907
- iteration#: 429, loss: 22.7541650403291
- iteration#: 430, loss: 20.59069173783064
- iteration#: 431, loss: 21.9158752579242
- iteration#: 432, loss: 21.397590935230255
- iteration#: 433, loss: 20.64666796848178
- iteration#: 434, loss: 21.702178256586194
- iteration#: 435, loss: 21.354624032974243
- iteration#: 436, loss: 21.242575827986002
- iteration#: 437, loss: 22.129924602806568
- iteration#: 438, loss: 21.949727043393068
- iteration#: 439, loss: 22.98925157636404
- iteration#: 440, loss: 21.90987526997924
- iteration#: 441, loss: 20.93463759869337
- iteration#: 442, loss: 21.083083886653185
- iteration#: 443, loss: 21.40187248773873
- iteration#: 444, loss: 20.83390999957919
- iteration#: 445, loss: 20.821795403957367
- iteration#: 446, loss: 21.842540256679058
- iteration#: 447, loss: 21.12655450310558
- iteration#: 448, loss: 20.486725889146328
- iteration#: 449, loss: 21.199680435471237
- iteration#: 450, loss: 21.91143242502585
- iteration#: 451, loss: 21.094152233563364
- iteration#: 452, loss: 21.61913809971884
- iteration#: 453, loss: 20.73567051347345
- iteration#: 454, loss: 21.349566344171762
- iteration#: 455, loss: 21.358126300387084
- iteration#: 456, loss: 21.32631280273199
- iteration#: 457, loss: 21.162315514869988
- iteration#: 458, loss: 20.479331642389297
- iteration#: 459, loss: 20.64435400441289
- iteration#: 460, loss: 20.058593570254743
- iteration#: 461, loss: 20.16186575172469
- iteration#: 462, loss: 20.42991684190929
- iteration#: 463, loss: 21.380917229689658
- iteration#: 464, loss: 21.952415805775672
- iteration#: 465, loss: 21.204546571709216
- iteration#: 466, loss: 21.117643024772406
- iteration#: 467, loss: 20.96833833679557
- iteration#: 468, loss: 21.598797030746937
- iteration#: 469, loss: 21.566693725064397
- iteration#: 470, loss: 21.01600386854261
- iteration#: 471, loss: 21.755610020831227
- iteration#: 472, loss: 20.88640418369323
- iteration#: 473, loss: 21.967098671011627
- iteration#: 474, loss: 22.052843501791358
- iteration#: 475, loss: 21.59802544489503
- iteration#: 476, loss: 21.497184867272153
- iteration#: 477, loss: 20.564411368221045
- iteration#: 478, loss: 21.266166856512427
- iteration#: 479, loss: 20.8535547144711
- iteration#: 480, loss: 21.277395579963923
- iteration#: 481, loss: 20.365421837195754
- iteration#: 482, loss: 21.97977838292718
- iteration#: 483, loss: 22.33593601733446
- iteration#: 484, loss: 19.664362984593026
- iteration#: 485, loss: 20.15722167585045
- iteration#: 486, loss: 19.909053412266076
- iteration#: 487, loss: 20.92108030617237
- iteration#: 488, loss: 21.175014791078866
- iteration#: 489, loss: 21.17112911399454
- iteration#: 490, loss: 20.74111422151327
- iteration#: 491, loss: 21.348958572372794
- iteration#: 492, loss: 20.988203558139503
- iteration#: 493, loss: 21.218574207276106
- iteration#: 494, loss: 20.588658027350903
- iteration#: 495, loss: 21.39802703820169
- iteration#: 496, loss: 21.00297266198322
- iteration#: 497, loss: 21.58903961442411
- iteration#: 498, loss: 20.839866237714887
- iteration#: 499, loss: 22.32119382917881
- iteration#: 500, loss: 21.06327011436224
- validation:
- Raw setting:
- Hit@1 = 0.0
- Hit@3 = 0.23891883797544175
- Hit@10 = 0.42904562244184885
- MR = 1026.381476489967
- MRR = 0.1556658803334297
- Fil setting:
- Hit@1 = 0.3412698412698413
- Hit@3 = 0.46508435659379055
- Hit@10 = 0.5954127982429869
- MR = 999.2251422581611
- MRR = 0.42800969227445673
- test in iteration 500:
- Raw setting:
- Hit@1 = 0.0
- Hit@3 = 0.3019202320097257
- Hit@10 = 0.5003442063480439
- MR = 843.954696585766
- MRR = 0.18622815834027878
- Fil setting:
- Hit@1 = 0.3958519473291052
- Hit@3 = 0.5254346520586469
- Hit@10 = 0.6423329866858055
- MR = 818.7834722364624
- MRR = 0.4821113873262497
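(Taken together, the log shows 500 training iterations with one accumulated loss value printed per iteration, a validation pass every 100 iterations in both raw and filtered settings, and a final test evaluation at iteration 500. A minimal driver-loop sketch that would produce output of this shape is given below; model.loss(), eval_fn, and the Adagrad learning rate are assumptions for illustration, not the authors' actual training script, and eval_fn is meant to be a metric routine like the one sketched after the first validation block.

import torch

def report(eval_fn, model, tuples):
    # Print Raw and Filtered metrics in the same layout as the log above.
    for setting, filtered in (("Raw", False), ("Fil", True)):
        print(f"{setting} setting:")
        for name, value in eval_fn(model, tuples, filtered).items():
            print(f"{name} = {value}")

def train(model, train_batches, valid_tuples, test_tuples, eval_fn,
          num_iterations=500, valid_every=100, lr=0.1):
    opt = torch.optim.Adagrad(model.parameters(), lr=lr)
    for it in range(1, num_iterations + 1):
        total_loss = 0.0
        for batch in train_batches:            # positive tuples plus negative samples
            opt.zero_grad()
            loss = model.loss(batch)           # hypothetical: model exposes a loss() method
            loss.backward()
            opt.step()
            total_loss += loss.item()
        print(f"iteration#: {it}, loss: {total_loss}")
        if it % valid_every == 0:
            print("validation:")
            report(eval_fn, model, valid_tuples)
    print(f"test in iteration {num_iterations}:")
    report(eval_fn, model, test_tuples)

Accumulating the loss as a sum over batches would explain the magnitude of the printed values (around 1128 at iteration 1, drifting down toward the low 20s by iteration 500), though the real script may normalize or schedule things differently.)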