
Released code with default parameters reproduces results lower than the published ones #21

Closed
geyutang opened this issue Sep 5, 2019 · 4 comments


@geyutang

geyutang commented Sep 5, 2019

Thanks for your excellent work and for kindly releasing the code. It is elegant and inspires my future study.
However, when I run your released code with the default parameters on the Market dataset, the Rank-1 and MAP are slightly lower than the published ones: Rank-1 is 65.2 (67.7 in the paper) and MAP is 38.8 (40.0 in the paper) when the model converges.

  • Can you reproduce the results in the paper using this released code?
  • Do the parameters need to be fine-tuned slightly based on this released code?

Any suggestions for this mismatch? Thanks for your kind reply.

@KovenYu
Owner

KovenYu commented Sep 5, 2019

Yes, I tried a few times before releasing the code. My results were quite close to the published ones, with roughly a 0.2% standard deviation. There might be some subtle difference between our running environments that leads to the gap. Below is a log you can compare against yours, especially the loss values:

python version : 3.6.4 |Anaconda, Inc.| (default, Jan 16 2018, 18:10:19) [GCC 7.2.0]
torch version : 1.0.0

------------------------------------------------------- options --------------------------------------------------------
batch_size: 368 beta: 0.2 crop_size: (384, 128)
epochs: 20 gpu: 0,1,2,3 img_size: (384, 128)
lamb_1: 0.0002 lamb_2: 50.0 lr: 0.0002
margin: 1.0 mining_ratio: 0.005 ml_path: data/ml_Market.dat
padding: 7 pretrain_path: data/pretrained.pth print_freq: 100
resume: save_path: runs/debug scala_ce: 30.0
source: MSMT17 target: Market wd: 0.025

loaded pre-trained model from data/pretrained.pth

==>>[2019-03-22 16:14:14] [Epoch=000/020] Stage 1, [Need: 00:00:00]
initializing centers/threshold ...
not found data/ml_Market.dat. computing ml...
saving computed ml to data/ml_Market.dat
initializing centers done.
initializing threshold done.
Iter: [000/674] Freq 45.0 loss_source 0.069 loss_st 0.403 loss_ml 9594.455 loss_target 0.000 loss_total 9.394 [2019-03-22 16:43:09]
Iter: [100/674] Freq 402.4 loss_source 0.048 loss_st 0.457 loss_ml 3221.372 loss_target 0.000 loss_total 7.613 [2019-03-22 16:44:33]
Iter: [200/674] Freq 423.9 loss_source 0.043 loss_st 0.471 loss_ml 2578.133 loss_target 0.000 loss_total 7.374 [2019-03-22 16:45:55]
Iter: [300/674] Freq 428.1 loss_source 0.041 loss_st 0.477 loss_ml 2297.286 loss_target 0.000 loss_total 7.289 [2019-03-22 16:47:19]
Iter: [400/674] Freq 434.3 loss_source 0.041 loss_st 0.483 loss_ml 2138.504 loss_target 0.000 loss_total 7.293 [2019-03-22 16:48:40]
Iter: [500/674] Freq 435.4 loss_source 0.040 loss_st 0.486 loss_ml 2031.008 loss_target 0.000 loss_total 7.272 [2019-03-22 16:50:04]
Iter: [600/674] Freq 438.3 loss_source 0.040 loss_st 0.488 loss_ml 1950.938 loss_target 0.000 loss_total 7.279 [2019-03-22 16:51:25]
Train loss_source 0.040 loss_st 0.490 loss_ml 1903.796 loss_target 0.000 loss_total 7.271

==>>[2019-03-22 16:52:41] [Epoch=001/020] Stage 1, [Need: 12:10:30]
Iter: [000/674] Freq 95.4 loss_source 0.044 loss_st 0.506 loss_ml 1460.446 loss_target 0.679 loss_total 8.239 [2019-03-22 16:52:45]
Iter: [100/674] Freq 254.8 loss_source 0.025 loss_st 0.492 loss_ml 1475.508 loss_target 0.625 loss_total 7.091 [2019-03-22 16:55:07]
Iter: [200/674] Freq 256.1 loss_source 0.026 loss_st 0.491 loss_ml 1473.360 loss_target 0.626 loss_total 7.125 [2019-03-22 16:57:30]
Iter: [300/674] Freq 255.0 loss_source 0.027 loss_st 0.492 loss_ml 1466.532 loss_target 0.625 loss_total 7.201 [2019-03-22 16:59:56]
Iter: [400/674] Freq 256.1 loss_source 0.029 loss_st 0.491 loss_ml 1458.591 loss_target 0.626 loss_total 7.259 [2019-03-22 17:02:17]
Iter: [500/674] Freq 256.0 loss_source 0.030 loss_st 0.491 loss_ml 1453.664 loss_target 0.626 loss_total 7.310 [2019-03-22 17:04:42]
Iter: [600/674] Freq 256.2 loss_source 0.030 loss_st 0.491 loss_ml 1447.688 loss_target 0.625 loss_total 7.335 [2019-03-22 17:07:04]
Train loss_source 0.030 loss_st 0.491 loss_ml 1440.638 loss_target 0.625 loss_total 7.337

==>>[2019-03-22 17:09:08] [Epoch=002/020] Stage 1, [Need: 08:14:03]
Iter: [000/674] Freq 104.2 loss_source 0.049 loss_st 0.521 loss_ml 1479.117 loss_target 0.609 loss_total 8.557 [2019-03-22 17:09:12]
Iter: [100/674] Freq 253.8 loss_source 0.025 loss_st 0.487 loss_ml 1392.113 loss_target 0.623 loss_total 7.034 [2019-03-22 17:11:34]
Iter: [200/674] Freq 256.1 loss_source 0.024 loss_st 0.486 loss_ml 1374.913 loss_target 0.623 loss_total 6.985 [2019-03-22 17:13:57]
Iter: [300/674] Freq 255.3 loss_source 0.026 loss_st 0.486 loss_ml 1372.223 loss_target 0.624 loss_total 7.057 [2019-03-22 17:16:22]
Iter: [400/674] Freq 256.1 loss_source 0.027 loss_st 0.484 loss_ml 1371.808 loss_target 0.624 loss_total 7.098 [2019-03-22 17:18:44]
Iter: [500/674] Freq 255.8 loss_source 0.028 loss_st 0.483 loss_ml 1371.981 loss_target 0.624 loss_total 7.153 [2019-03-22 17:21:09]
Iter: [600/674] Freq 256.2 loss_source 0.029 loss_st 0.483 loss_ml 1367.401 loss_target 0.623 loss_total 7.154 [2019-03-22 17:23:31]
Train loss_source 0.029 loss_st 0.483 loss_ml 1362.591 loss_target 0.623 loss_total 7.150

==>>[2019-03-22 17:25:34] [Epoch=003/020] Stage 1, [Need: 06:44:10]
Iter: [000/674] Freq 97.9 loss_source 0.008 loss_st 0.461 loss_ml 1777.439 loss_target 0.628 loss_total 5.978 [2019-03-22 17:25:38]
Iter: [100/674] Freq 254.7 loss_source 0.023 loss_st 0.474 loss_ml 1344.012 loss_target 0.621 loss_total 6.802 [2019-03-22 17:28:00]
Iter: [200/674] Freq 256.4 loss_source 0.023 loss_st 0.473 loss_ml 1339.830 loss_target 0.622 loss_total 6.752 [2019-03-22 17:30:22]
Iter: [300/674] Freq 256.0 loss_source 0.023 loss_st 0.474 loss_ml 1329.876 loss_target 0.622 loss_total 6.785 [2019-03-22 17:32:47]
Iter: [400/674] Freq 256.6 loss_source 0.024 loss_st 0.474 loss_ml 1328.106 loss_target 0.622 loss_total 6.845 [2019-03-22 17:35:09]
Iter: [500/674] Freq 256.4 loss_source 0.026 loss_st 0.473 loss_ml 1325.383 loss_target 0.622 loss_total 6.900 [2019-03-22 17:37:33]
Iter: [600/674] Freq 256.9 loss_source 0.027 loss_st 0.474 loss_ml 1322.033 loss_target 0.622 loss_total 6.955 [2019-03-22 17:39:55]
Train loss_source 0.027 loss_st 0.474 loss_ml 1320.304 loss_target 0.622 loss_total 6.975

==>>[2019-03-22 17:42:00] [Epoch=004/020] Stage 1, [Need: 05:51:04]
Iter: [000/674] Freq 97.6 loss_source 0.013 loss_st 0.490 loss_ml 1116.747 loss_target 0.623 loss_total 6.375 [2019-03-22 17:42:04]
Iter: [100/674] Freq 255.6 loss_source 0.021 loss_st 0.470 loss_ml 1299.623 loss_target 0.621 loss_total 6.615 [2019-03-22 17:44:26]
Iter: [200/674] Freq 257.2 loss_source 0.023 loss_st 0.468 loss_ml 1299.776 loss_target 0.620 loss_total 6.696 [2019-03-22 17:46:48]
Iter: [300/674] Freq 256.5 loss_source 0.024 loss_st 0.467 loss_ml 1299.387 loss_target 0.619 loss_total 6.746 [2019-03-22 17:49:12]
Iter: [400/674] Freq 257.2 loss_source 0.025 loss_st 0.467 loss_ml 1292.654 loss_target 0.619 loss_total 6.788 [2019-03-22 17:51:34]
Iter: [500/674] Freq 257.0 loss_source 0.025 loss_st 0.467 loss_ml 1293.955 loss_target 0.619 loss_total 6.802 [2019-03-22 17:53:58]
Iter: [600/674] Freq 257.3 loss_source 0.025 loss_st 0.467 loss_ml 1295.423 loss_target 0.619 loss_total 6.813 [2019-03-22 17:56:20]
Train loss_source 0.026 loss_st 0.467 loss_ml 1292.915 loss_target 0.619 loss_total 6.845

==>>[2019-03-22 17:58:18] [Epoch=005/020] Stage 1, [Need: 05:12:10]
Iter: [000/674] Freq 97.4 loss_source 0.004 loss_st 0.464 loss_ml 1305.863 loss_target 0.624 loss_total 5.737 [2019-03-22 17:58:21]
Iter: [100/674] Freq 254.1 loss_source 0.021 loss_st 0.460 loss_ml 1292.725 loss_target 0.618 loss_total 6.530 [2019-03-22 18:00:44]
Iter: [200/674] Freq 256.2 loss_source 0.023 loss_st 0.461 loss_ml 1276.015 loss_target 0.618 loss_total 6.626 [2019-03-22 18:03:06]
Iter: [300/674] Freq 255.2 loss_source 0.022 loss_st 0.460 loss_ml 1273.454 loss_target 0.618 loss_total 6.597 [2019-03-22 18:05:32]
Iter: [400/674] Freq 256.2 loss_source 0.023 loss_st 0.460 loss_ml 1275.027 loss_target 0.617 loss_total 6.640 [2019-03-22 18:07:54]
Iter: [500/674] Freq 255.8 loss_source 0.025 loss_st 0.461 loss_ml 1270.431 loss_target 0.618 loss_total 6.735 [2019-03-22 18:10:18]
Iter: [600/674] Freq 256.2 loss_source 0.026 loss_st 0.461 loss_ml 1264.849 loss_target 0.618 loss_total 6.785 [2019-03-22 18:12:41]
Train loss_source 0.027 loss_st 0.461 loss_ml 1263.870 loss_target 0.617 loss_total 6.814

==>>[2019-03-22 18:14:47] [Epoch=006/020] Stage 1, [Need: 04:41:16]
Iter: [000/674] Freq 107.0 loss_source 0.005 loss_st 0.440 loss_ml 1427.579 loss_target 0.602 loss_total 5.523 [2019-03-22 18:14:50]
Iter: [100/674] Freq 256.7 loss_source 0.022 loss_st 0.459 loss_ml 1248.004 loss_target 0.617 loss_total 6.548 [2019-03-22 18:17:12]
Iter: [200/674] Freq 257.7 loss_source 0.021 loss_st 0.457 loss_ml 1255.881 loss_target 0.616 loss_total 6.506 [2019-03-22 18:19:34]
Iter: [300/674] Freq 256.9 loss_source 0.022 loss_st 0.456 loss_ml 1247.881 loss_target 0.615 loss_total 6.528 [2019-03-22 18:21:58]
Iter: [400/674] Freq 258.5 loss_source 0.023 loss_st 0.457 loss_ml 1241.385 loss_target 0.615 loss_total 6.576 [2019-03-22 18:24:18]
Iter: [500/674] Freq 257.3 loss_source 0.025 loss_st 0.456 loss_ml 1243.680 loss_target 0.615 loss_total 6.657 [2019-03-22 18:26:44]
Iter: [600/674] Freq 257.7 loss_source 0.025 loss_st 0.456 loss_ml 1244.820 loss_target 0.614 loss_total 6.693 [2019-03-22 18:29:05]
Train loss_source 0.026 loss_st 0.456 loss_ml 1246.823 loss_target 0.614 loss_total 6.738

==>>[2019-03-22 18:31:23] [Epoch=007/020] Stage 1, [Need: 04:14:42]
Iter: [000/674] Freq 101.6 loss_source 0.058 loss_st 0.437 loss_ml 1284.939 loss_target 0.622 loss_total 8.176 [2019-03-22 18:31:27]
Iter: [100/674] Freq 254.1 loss_source 0.023 loss_st 0.450 loss_ml 1248.253 loss_target 0.616 loss_total 6.511 [2019-03-22 18:33:49]
Iter: [200/674] Freq 257.0 loss_source 0.023 loss_st 0.450 loss_ml 1238.636 loss_target 0.614 loss_total 6.493 [2019-03-22 18:36:11]
Iter: [300/674] Freq 256.3 loss_source 0.022 loss_st 0.450 loss_ml 1241.050 loss_target 0.613 loss_total 6.461 [2019-03-22 18:38:35]
Iter: [400/674] Freq 256.9 loss_source 0.023 loss_st 0.450 loss_ml 1240.262 loss_target 0.612 loss_total 6.483 [2019-03-22 18:40:58]
Iter: [500/674] Freq 256.3 loss_source 0.024 loss_st 0.450 loss_ml 1244.173 loss_target 0.612 loss_total 6.540 [2019-03-22 18:43:23]
Iter: [600/674] Freq 257.0 loss_source 0.025 loss_st 0.450 loss_ml 1244.939 loss_target 0.612 loss_total 6.600 [2019-03-22 18:45:44]
Train loss_source 0.025 loss_st 0.451 loss_ml 1244.053 loss_target 0.612 loss_total 6.614

==>>[2019-03-22 18:47:59] [Epoch=008/020] Stage 1, [Need: 03:50:36]
Iter: [000/674] Freq 107.9 loss_source 0.062 loss_st 0.450 loss_ml 1375.609 loss_target 0.604 loss_total 8.467 [2019-03-22 18:48:02]
Iter: [100/674] Freq 257.0 loss_source 0.021 loss_st 0.447 loss_ml 1227.169 loss_target 0.609 loss_total 6.399 [2019-03-22 18:50:24]
Iter: [200/674] Freq 257.9 loss_source 0.023 loss_st 0.447 loss_ml 1220.507 loss_target 0.609 loss_total 6.457 [2019-03-22 18:52:46]
Iter: [300/674] Freq 256.7 loss_source 0.023 loss_st 0.445 loss_ml 1220.899 loss_target 0.607 loss_total 6.474 [2019-03-22 18:55:10]
Iter: [400/674] Freq 256.9 loss_source 0.024 loss_st 0.445 loss_ml 1219.383 loss_target 0.607 loss_total 6.489 [2019-03-22 18:57:33]
Iter: [500/674] Freq 256.3 loss_source 0.024 loss_st 0.445 loss_ml 1217.888 loss_target 0.607 loss_total 6.516 [2019-03-22 18:59:58]
Iter: [600/674] Freq 256.6 loss_source 0.025 loss_st 0.445 loss_ml 1217.411 loss_target 0.606 loss_total 6.562 [2019-03-22 19:02:21]
Train loss_source 0.026 loss_st 0.445 loss_ml 1217.675 loss_target 0.606 loss_total 6.596

==>>[2019-03-22 19:04:22] [Epoch=009/020] Stage 1, [Need: 03:27:56]
Iter: [000/674] Freq 100.2 loss_source 0.020 loss_st 0.451 loss_ml 1134.829 loss_target 0.623 loss_total 6.340 [2019-03-22 19:04:26]
Iter: [100/674] Freq 254.8 loss_source 0.017 loss_st 0.444 loss_ml 1207.486 loss_target 0.605 loss_total 6.162 [2019-03-22 19:06:48]
Iter: [200/674] Freq 255.7 loss_source 0.021 loss_st 0.440 loss_ml 1221.733 loss_target 0.603 loss_total 6.293 [2019-03-22 19:09:11]
Iter: [300/674] Freq 255.0 loss_source 0.023 loss_st 0.440 loss_ml 1220.646 loss_target 0.603 loss_total 6.384 [2019-03-22 19:11:36]
Iter: [400/674] Freq 255.8 loss_source 0.023 loss_st 0.440 loss_ml 1222.276 loss_target 0.602 loss_total 6.404 [2019-03-22 19:13:59]
Iter: [500/674] Freq 255.6 loss_source 0.024 loss_st 0.441 loss_ml 1219.443 loss_target 0.602 loss_total 6.455 [2019-03-22 19:16:24]
Iter: [600/674] Freq 256.1 loss_source 0.024 loss_st 0.441 loss_ml 1216.231 loss_target 0.602 loss_total 6.474 [2019-03-22 19:18:46]
Train loss_source 0.025 loss_st 0.441 loss_ml 1215.436 loss_target 0.601 loss_total 6.506

==>>[2019-03-22 19:20:56] [Epoch=010/020] Stage 1, [Need: 03:06:41]
Iter: [000/674] Freq 95.5 loss_source 0.009 loss_st 0.444 loss_ml 1241.182 loss_target 0.581 loss_total 5.698 [2019-03-22 19:21:00]
Iter: [100/674] Freq 255.6 loss_source 0.020 loss_st 0.437 loss_ml 1212.417 loss_target 0.598 loss_total 6.192 [2019-03-22 19:23:21]
Iter: [200/674] Freq 256.8 loss_source 0.021 loss_st 0.437 loss_ml 1203.433 loss_target 0.598 loss_total 6.258 [2019-03-22 19:25:44]
Iter: [300/674] Freq 256.1 loss_source 0.022 loss_st 0.436 loss_ml 1202.463 loss_target 0.597 loss_total 6.280 [2019-03-22 19:28:08]
Iter: [400/674] Freq 256.6 loss_source 0.023 loss_st 0.436 loss_ml 1206.262 loss_target 0.596 loss_total 6.335 [2019-03-22 19:30:31]
Iter: [500/674] Freq 255.9 loss_source 0.024 loss_st 0.436 loss_ml 1208.317 loss_target 0.595 loss_total 6.380 [2019-03-22 19:32:56]
Iter: [600/674] Freq 256.4 loss_source 0.024 loss_st 0.437 loss_ml 1207.040 loss_target 0.594 loss_total 6.408 [2019-03-22 19:35:18]
Train loss_source 0.025 loss_st 0.437 loss_ml 1205.969 loss_target 0.593 loss_total 6.431

==>>[2019-03-22 19:37:16] [Epoch=011/020] Stage 1, [Need: 02:46:07]
Iter: [000/674] Freq 96.7 loss_source 0.005 loss_st 0.443 loss_ml 1034.583 loss_target 0.585 loss_total 5.487 [2019-03-22 19:37:20]
Iter: [100/674] Freq 253.5 loss_source 0.018 loss_st 0.434 loss_ml 1207.993 loss_target 0.585 loss_total 6.059 [2019-03-22 19:39:43]
Iter: [200/674] Freq 255.7 loss_source 0.020 loss_st 0.433 loss_ml 1191.691 loss_target 0.586 loss_total 6.141 [2019-03-22 19:42:06]
Iter: [300/674] Freq 255.3 loss_source 0.020 loss_st 0.433 loss_ml 1195.387 loss_target 0.584 loss_total 6.163 [2019-03-22 19:44:30]
Iter: [400/674] Freq 256.1 loss_source 0.022 loss_st 0.433 loss_ml 1194.847 loss_target 0.582 loss_total 6.231 [2019-03-22 19:46:52]
Iter: [500/674] Freq 256.1 loss_source 0.022 loss_st 0.433 loss_ml 1197.727 loss_target 0.580 loss_total 6.273 [2019-03-22 19:49:16]
Iter: [600/674] Freq 256.5 loss_source 0.024 loss_st 0.433 loss_ml 1196.566 loss_target 0.580 loss_total 6.332 [2019-03-22 19:51:38]
Train loss_source 0.024 loss_st 0.434 loss_ml 1198.824 loss_target 0.578 loss_total 6.361

==>>[2019-03-22 19:53:45] [Epoch=012/020] Stage 1, [Need: 02:26:20]
Iter: [000/674] Freq 101.8 loss_source 0.022 loss_st 0.438 loss_ml 939.244 loss_target 0.556 loss_total 6.194 [2019-03-22 19:53:48]
Iter: [100/674] Freq 254.0 loss_source 0.020 loss_st 0.432 loss_ml 1179.875 loss_target 0.575 loss_total 6.143 [2019-03-22 19:56:11]
Iter: [200/674] Freq 255.7 loss_source 0.020 loss_st 0.430 loss_ml 1177.946 loss_target 0.572 loss_total 6.093 [2019-03-22 19:58:34]
Iter: [300/674] Freq 255.2 loss_source 0.019 loss_st 0.429 loss_ml 1180.694 loss_target 0.572 loss_total 6.025 [2019-03-22 20:00:59]
Iter: [400/674] Freq 256.4 loss_source 0.019 loss_st 0.427 loss_ml 1181.602 loss_target 0.571 loss_total 6.005 [2019-03-22 20:03:20]
Iter: [500/674] Freq 256.5 loss_source 0.018 loss_st 0.426 loss_ml 1177.473 loss_target 0.570 loss_total 5.979 [2019-03-22 20:05:43]
Iter: [600/674] Freq 256.8 loss_source 0.018 loss_st 0.425 loss_ml 1177.267 loss_target 0.569 loss_total 5.970 [2019-03-22 20:08:06]
Train loss_source 0.018 loss_st 0.424 loss_ml 1175.595 loss_target 0.569 loss_total 5.961

==>>[2019-03-22 20:10:14] [Epoch=013/020] Stage 1, [Need: 02:07:04]
Iter: [000/674] Freq 94.6 loss_source 0.004 loss_st 0.412 loss_ml 1294.173 loss_target 0.561 loss_total 5.153 [2019-03-22 20:10:18]
Iter: [100/674] Freq 253.2 loss_source 0.013 loss_st 0.418 loss_ml 1169.699 loss_target 0.564 loss_total 5.608 [2019-03-22 20:12:41]
Iter: [200/674] Freq 255.8 loss_source 0.015 loss_st 0.418 loss_ml 1170.560 loss_target 0.564 loss_total 5.728 [2019-03-22 20:15:03]
Iter: [300/674] Freq 254.8 loss_source 0.014 loss_st 0.418 loss_ml 1174.406 loss_target 0.565 loss_total 5.697 [2019-03-22 20:17:29]
Iter: [400/674] Freq 255.4 loss_source 0.015 loss_st 0.417 loss_ml 1177.647 loss_target 0.564 loss_total 5.702 [2019-03-22 20:19:52]
Iter: [500/674] Freq 254.8 loss_source 0.015 loss_st 0.417 loss_ml 1179.175 loss_target 0.562 loss_total 5.704 [2019-03-22 20:22:18]
Iter: [600/674] Freq 255.3 loss_source 0.014 loss_st 0.416 loss_ml 1177.439 loss_target 0.562 loss_total 5.685 [2019-03-22 20:24:41]
Train loss_source 0.015 loss_st 0.416 loss_ml 1176.609 loss_target 0.563 loss_total 5.690

==>>[2019-03-22 20:26:48] [Epoch=014/020] Stage 1, [Need: 01:48:14]
Iter: [000/674] Freq 109.1 loss_source 0.026 loss_st 0.406 loss_ml 1144.369 loss_target 0.562 loss_total 6.168 [2019-03-22 20:26:51]
Iter: [100/674] Freq 255.9 loss_source 0.013 loss_st 0.414 loss_ml 1185.118 loss_target 0.558 loss_total 5.560 [2019-03-22 20:29:13]
Iter: [200/674] Freq 258.3 loss_source 0.013 loss_st 0.414 loss_ml 1179.298 loss_target 0.558 loss_total 5.574 [2019-03-22 20:31:34]
Iter: [300/674] Freq 257.7 loss_source 0.013 loss_st 0.414 loss_ml 1172.386 loss_target 0.561 loss_total 5.575 [2019-03-22 20:33:58]
Iter: [400/674] Freq 258.7 loss_source 0.013 loss_st 0.414 loss_ml 1176.415 loss_target 0.561 loss_total 5.585 [2019-03-22 20:36:18]
Iter: [500/674] Freq 257.9 loss_source 0.013 loss_st 0.414 loss_ml 1178.101 loss_target 0.560 loss_total 5.586 [2019-03-22 20:38:43]
Iter: [600/674] Freq 257.9 loss_source 0.013 loss_st 0.414 loss_ml 1177.484 loss_target 0.559 loss_total 5.587 [2019-03-22 20:41:05]
Train loss_source 0.013 loss_st 0.413 loss_ml 1176.816 loss_target 0.559 loss_total 5.576

==>>[2019-03-22 20:43:13] [Epoch=015/020] Stage 1, [Need: 01:29:39]
Iter: [000/674] Freq 99.6 loss_source 0.016 loss_st 0.417 loss_ml 1056.368 loss_target 0.570 loss_total 5.779 [2019-03-22 20:43:16]
Iter: [100/674] Freq 255.1 loss_source 0.011 loss_st 0.413 loss_ml 1159.682 loss_target 0.557 loss_total 5.495 [2019-03-22 20:45:38]
Iter: [200/674] Freq 257.2 loss_source 0.011 loss_st 0.412 loss_ml 1159.484 loss_target 0.556 loss_total 5.468 [2019-03-22 20:48:00]
Iter: [300/674] Freq 255.9 loss_source 0.011 loss_st 0.412 loss_ml 1165.138 loss_target 0.556 loss_total 5.469 [2019-03-22 20:50:26]
Iter: [400/674] Freq 256.6 loss_source 0.012 loss_st 0.412 loss_ml 1165.366 loss_target 0.555 loss_total 5.490 [2019-03-22 20:52:48]
Iter: [500/674] Freq 255.8 loss_source 0.012 loss_st 0.412 loss_ml 1163.311 loss_target 0.554 loss_total 5.496 [2019-03-22 20:55:13]
Iter: [600/674] Freq 256.6 loss_source 0.012 loss_st 0.412 loss_ml 1164.106 loss_target 0.554 loss_total 5.503 [2019-03-22 20:57:35]
Train loss_source 0.012 loss_st 0.412 loss_ml 1164.072 loss_target 0.553 loss_total 5.511

==>>[2019-03-22 20:59:41] [Epoch=016/020] Stage 1, [Need: 01:11:21]
Iter: [000/674] Freq 97.1 loss_source 0.002 loss_st 0.406 loss_ml 1106.742 loss_target 0.531 loss_total 4.905 [2019-03-22 20:59:45]
Iter: [100/674] Freq 254.9 loss_source 0.009 loss_st 0.409 loss_ml 1171.639 loss_target 0.552 loss_total 5.326 [2019-03-22 21:02:07]
Iter: [200/674] Freq 257.1 loss_source 0.010 loss_st 0.409 loss_ml 1158.995 loss_target 0.550 loss_total 5.372 [2019-03-22 21:04:29]
Iter: [300/674] Freq 256.3 loss_source 0.010 loss_st 0.410 loss_ml 1156.030 loss_target 0.549 loss_total 5.390 [2019-03-22 21:06:54]
Iter: [400/674] Freq 258.6 loss_source 0.011 loss_st 0.410 loss_ml 1158.592 loss_target 0.550 loss_total 5.417 [2019-03-22 21:09:12]
Iter: [500/674] Freq 260.1 loss_source 0.011 loss_st 0.410 loss_ml 1158.954 loss_target 0.549 loss_total 5.441 [2019-03-22 21:11:30]
Iter: [600/674] Freq 260.0 loss_source 0.011 loss_st 0.410 loss_ml 1159.122 loss_target 0.549 loss_total 5.426 [2019-03-22 21:13:52]
Train loss_source 0.011 loss_st 0.410 loss_ml 1163.398 loss_target 0.549 loss_total 5.437

==>>[2019-03-22 21:15:57] [Epoch=017/020] Stage 1, [Need: 00:53:14]
Iter: [000/674] Freq 100.9 loss_source 0.006 loss_st 0.400 loss_ml 1040.252 loss_target 0.565 loss_total 5.063 [2019-03-22 21:16:00]
Iter: [100/674] Freq 253.6 loss_source 0.011 loss_st 0.408 loss_ml 1169.662 loss_target 0.543 loss_total 5.407 [2019-03-22 21:18:23]
Iter: [200/674] Freq 255.6 loss_source 0.011 loss_st 0.408 loss_ml 1159.324 loss_target 0.545 loss_total 5.384 [2019-03-22 21:20:46]
Iter: [300/674] Freq 255.0 loss_source 0.011 loss_st 0.408 loss_ml 1158.836 loss_target 0.546 loss_total 5.394 [2019-03-22 21:23:11]
Iter: [400/674] Freq 255.9 loss_source 0.011 loss_st 0.409 loss_ml 1158.793 loss_target 0.546 loss_total 5.399 [2019-03-22 21:25:33]
Iter: [500/674] Freq 255.4 loss_source 0.011 loss_st 0.409 loss_ml 1157.586 loss_target 0.547 loss_total 5.395 [2019-03-22 21:27:58]
Iter: [600/674] Freq 255.7 loss_source 0.010 loss_st 0.409 loss_ml 1156.044 loss_target 0.546 loss_total 5.386 [2019-03-22 21:30:22]
Train loss_source 0.010 loss_st 0.409 loss_ml 1156.436 loss_target 0.546 loss_total 5.386

==>>[2019-03-22 21:32:35] [Epoch=018/020] Stage 1, [Need: 00:35:22]
Iter: [000/674] Freq 99.1 loss_source 0.018 loss_st 0.403 loss_ml 1119.897 loss_target 0.563 loss_total 5.710 [2019-03-22 21:32:39]
Iter: [100/674] Freq 254.1 loss_source 0.012 loss_st 0.409 loss_ml 1141.581 loss_target 0.540 loss_total 5.445 [2019-03-22 21:35:02]
Iter: [200/674] Freq 256.7 loss_source 0.011 loss_st 0.410 loss_ml 1155.941 loss_target 0.544 loss_total 5.444 [2019-03-22 21:37:23]
Iter: [300/674] Freq 256.1 loss_source 0.011 loss_st 0.409 loss_ml 1156.437 loss_target 0.545 loss_total 5.418 [2019-03-22 21:39:48]
Iter: [400/674] Freq 256.6 loss_source 0.011 loss_st 0.410 loss_ml 1156.753 loss_target 0.544 loss_total 5.424 [2019-03-22 21:42:10]
Iter: [500/674] Freq 255.9 loss_source 0.011 loss_st 0.409 loss_ml 1157.934 loss_target 0.545 loss_total 5.415 [2019-03-22 21:44:36]
Iter: [600/674] Freq 256.4 loss_source 0.011 loss_st 0.409 loss_ml 1160.129 loss_target 0.545 loss_total 5.403 [2019-03-22 21:46:58]
Train loss_source 0.010 loss_st 0.409 loss_ml 1160.056 loss_target 0.544 loss_total 5.391

==>>[2019-03-22 21:49:02] [Epoch=019/020] Stage 1, [Need: 00:17:37]
Iter: [000/674] Freq 100.2 loss_source 0.016 loss_st 0.392 loss_ml 1400.582 loss_target 0.528 loss_total 5.544 [2019-03-22 21:49:06]
Iter: [100/674] Freq 257.4 loss_source 0.009 loss_st 0.409 loss_ml 1170.794 loss_target 0.544 loss_total 5.321 [2019-03-22 21:51:27]
Iter: [200/674] Freq 258.2 loss_source 0.009 loss_st 0.408 loss_ml 1163.122 loss_target 0.545 loss_total 5.335 [2019-03-22 21:53:49]
Iter: [300/674] Freq 256.6 loss_source 0.010 loss_st 0.409 loss_ml 1165.629 loss_target 0.544 loss_total 5.359 [2019-03-22 21:56:14]
Iter: [400/674] Freq 256.9 loss_source 0.010 loss_st 0.409 loss_ml 1162.989 loss_target 0.544 loss_total 5.350 [2019-03-22 21:58:37]
Iter: [500/674] Freq 256.1 loss_source 0.010 loss_st 0.409 loss_ml 1163.277 loss_target 0.544 loss_total 5.353 [2019-03-22 22:01:02]
Iter: [600/674] Freq 256.8 loss_source 0.010 loss_st 0.409 loss_ml 1164.797 loss_target 0.543 loss_total 5.349 [2019-03-22 22:03:24]
Train loss_source 0.010 loss_st 0.409 loss_ml 1167.311 loss_target 0.544 loss_total 5.352
Test r1 67.013 r5 82.571 r10 87.470 MAP 39.948
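
For a quick side-by-side check of the per-epoch `Train ...` summary lines between two such logs, a small standalone script along these lines may help (a hypothetical helper, not part of the released code; it only assumes the log format shown above):

```python
import re
import sys

# Matches per-epoch summary lines such as:
# Train loss_source 0.040 loss_st 0.490 loss_ml 1903.796 loss_target 0.000 loss_total 7.271
TRAIN_RE = re.compile(
    r"^Train\s+loss_source\s+([\d.]+)\s+loss_st\s+([\d.]+)\s+loss_ml\s+([\d.]+)"
    r"\s+loss_target\s+([\d.]+)\s+loss_total\s+([\d.]+)"
)

def epoch_losses(path):
    """Return one (source, st, ml, target, total) loss tuple per epoch."""
    losses = []
    with open(path) as f:
        for line in f:
            m = TRAIN_RE.match(line.strip())
            if m:
                losses.append(tuple(float(x) for x in m.groups()))
    return losses

if __name__ == "__main__":
    ref_log, my_log = sys.argv[1], sys.argv[2]  # paths to the two log files
    for epoch, (ref, mine) in enumerate(zip(epoch_losses(ref_log), epoch_losses(my_log))):
        print(f"epoch {epoch:02d}  ref loss_total {ref[-1]:.3f}  "
              f"my loss_total {mine[-1]:.3f}  diff {mine[-1] - ref[-1]:+.3f}")
```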

@geyutang
Author

geyutang commented Sep 5, 2019

Here is my log. I print the test results after every training epoch to monitor the model.

python version : 3.6.8 |Anaconda, Inc.| (default, Dec 30 2018, 01:22:34) [GCC 7.3.0]
torch version : 1.0.0

------------------------------------------------------- options --------------------------------------------------------
batch_size: 368 beta: 0.2 crop_size: (384, 128)
epochs: 25 gpu: 0,1,2,3 img_size: (384, 128)
lamb_1: 0.0002 lamb_2: 50.0 lr: 0.0002
margin: 1.0 mining_ratio: 0.005 ml_path: ../data/ml_Market.dat
padding: 7 pretrain_path: ../data/pretrained_weight.pth print_freq: 100
resume: save_path: ../runs/debug scala_ce: 30.0
source: MSMT17 target: Market wd: 0.025

loaded pre-trained model from ../data/pretrained_weight.pth

==>>[2019-09-05 02:01:31] [Epoch=000/025] Stage 1, [Need: 00:00:00]
initializing centers/threshold ...
loaded ml from ../data/ml_Market.dat
initializing centers done.
initializing threshold done.
Iter: [000/674] Freq 19.5 loss_source 0.148 loss_st 0.453 loss_ml 107341.672 loss_target 0.000 loss_total 33.394 [2019-09-05 02:02:03]
Iter: [100/674] Freq 415.9 loss_source 0.082 loss_st 0.541 loss_ml 14521.075 loss_target 0.000 loss_total 12.413 [2019-09-05 02:03:13]
Iter: [200/674] Freq 460.7 loss_source 0.072 loss_st 0.545 loss_ml 8190.171 loss_target 0.000 loss_total 10.684 [2019-09-05 02:04:24]
Iter: [300/674] Freq 473.4 loss_source 0.068 loss_st 0.546 loss_ml 6040.499 loss_target 0.000 loss_total 10.076 [2019-09-05 02:05:38]
Iter: [400/674] Freq 482.7 loss_source 0.065 loss_st 0.545 loss_ml 4947.295 loss_target 0.000 loss_total 9.680 [2019-09-05 02:06:50]
Iter: [500/674] Freq 486.0 loss_source 0.063 loss_st 0.543 loss_ml 4287.097 loss_target 0.000 loss_total 9.435 [2019-09-05 02:08:03]
Iter: [600/674] Freq 490.4 loss_source 0.061 loss_st 0.541 loss_ml 3835.862 loss_target 0.000 loss_total 9.251 [2019-09-05 02:09:15]
Train loss_source 0.062 loss_st 0.541 loss_ml 3589.546 loss_target 0.000 loss_total 9.202
Test r1 57.126 r5 74.347 r10 80.582 MAP 31.265

==>>[2019-09-05 02:12:58] [Epoch=001/025] Stage 1, [Need: 03:27:43]
Iter: [000/674] Freq 106.0 loss_source 0.014 loss_st 0.535 loss_ml 1521.879 loss_target 0.339 loss_total 6.665 [2019-09-05 02:13:01]
Iter: [100/674] Freq 294.1 loss_source 0.032 loss_st 0.525 loss_ml 1504.258 loss_target 0.594 loss_total 7.760 [2019-09-05 02:15:04]
Iter: [200/674] Freq 293.8 loss_source 0.033 loss_st 0.522 loss_ml 1476.220 loss_target 0.610 loss_total 7.749 [2019-09-05 02:17:10]
Iter: [300/674] Freq 292.0 loss_source 0.033 loss_st 0.520 loss_ml 1467.900 loss_target 0.616 loss_total 7.770 [2019-09-05 02:19:17]
Iter: [400/674] Freq 292.2 loss_source 0.033 loss_st 0.517 loss_ml 1454.754 loss_target 0.619 loss_total 7.731 [2019-09-05 02:21:23]
Iter: [500/674] Freq 291.7 loss_source 0.034 loss_st 0.516 loss_ml 1446.384 loss_target 0.621 loss_total 7.766 [2019-09-05 02:23:30]
Iter: [600/674] Freq 292.0 loss_source 0.035 loss_st 0.515 loss_ml 1439.714 loss_target 0.621 loss_total 7.828 [2019-09-05 02:25:35]
Train loss_source 0.036 loss_st 0.515 loss_ml 1432.803 loss_target 0.622 loss_total 7.832
Test r1 60.184 r5 78.385 r10 84.382 MAP 35.182

==>>[2019-09-05 02:29:43] [Epoch=002/025] Stage 1, [Need: 04:54:53]
Iter: [000/674] Freq 122.6 loss_source 0.014 loss_st 0.500 loss_ml 1389.421 loss_target 0.646 loss_total 6.626 [2019-09-05 02:29:46]
Iter: [100/674] Freq 292.3 loss_source 0.027 loss_st 0.503 loss_ml 1393.499 loss_target 0.622 loss_total 7.298 [2019-09-05 02:31:50]
Iter: [200/674] Freq 292.7 loss_source 0.027 loss_st 0.500 loss_ml 1376.900 loss_target 0.623 loss_total 7.254 [2019-09-05 02:33:56]
Iter: [300/674] Freq 291.6 loss_source 0.028 loss_st 0.499 loss_ml 1379.441 loss_target 0.623 loss_total 7.293 [2019-09-05 02:36:03]
Iter: [400/674] Freq 291.5 loss_source 0.029 loss_st 0.498 loss_ml 1370.416 loss_target 0.622 loss_total 7.340 [2019-09-05 02:38:09]
Iter: [500/674] Freq 290.9 loss_source 0.030 loss_st 0.498 loss_ml 1365.846 loss_target 0.622 loss_total 7.371 [2019-09-05 02:40:17]
Iter: [600/674] Freq 291.1 loss_source 0.031 loss_st 0.498 loss_ml 1366.200 loss_target 0.623 loss_total 7.417 [2019-09-05 02:42:23]
Train loss_source 0.031 loss_st 0.498 loss_ml 1365.755 loss_target 0.622 loss_total 7.447
Test r1 61.520 r5 79.691 r10 85.392 MAP 36.180

==>>[2019-09-05 02:46:31] [Epoch=003/025] Stage 1, [Need: 05:11:17]
Iter: [000/674] Freq 120.9 loss_source 0.016 loss_st 0.526 loss_ml 1230.060 loss_target 0.628 loss_total 6.943 [2019-09-05 02:46:34]
Iter: [100/674] Freq 292.3 loss_source 0.023 loss_st 0.488 loss_ml 1310.503 loss_target 0.622 loss_total 6.894 [2019-09-05 02:48:38]
Iter: [200/674] Freq 293.6 loss_source 0.022 loss_st 0.486 loss_ml 1317.773 loss_target 0.622 loss_total 6.868 [2019-09-05 02:50:43]
Iter: [300/674] Freq 292.2 loss_source 0.025 loss_st 0.486 loss_ml 1313.312 loss_target 0.622 loss_total 7.009 [2019-09-05 02:52:50]
Iter: [400/674] Freq 292.5 loss_source 0.026 loss_st 0.487 loss_ml 1309.672 loss_target 0.622 loss_total 7.054 [2019-09-05 02:54:55]
Iter: [500/674] Freq 292.1 loss_source 0.027 loss_st 0.486 loss_ml 1312.223 loss_target 0.622 loss_total 7.108 [2019-09-05 02:57:02]
Iter: [600/674] Freq 292.7 loss_source 0.028 loss_st 0.486 loss_ml 1307.797 loss_target 0.621 loss_total 7.142 [2019-09-05 02:59:06]
Train loss_source 0.028 loss_st 0.486 loss_ml 1307.342 loss_target 0.621 loss_total 7.163
Test r1 63.183 r5 79.602 r10 85.897 MAP 36.790

==>>[2019-09-05 03:03:16] [Epoch=004/025] Stage 1, [Need: 05:10:36]
Iter: [000/674] Freq 122.7 loss_source 0.018 loss_st 0.497 loss_ml 1063.799 loss_target 0.644 loss_total 6.743 [2019-09-05 03:03:19]
Iter: [100/674] Freq 292.7 loss_source 0.024 loss_st 0.482 loss_ml 1297.756 loss_target 0.621 loss_total 6.908 [2019-09-05 03:05:23]
Iter: [200/674] Freq 292.9 loss_source 0.025 loss_st 0.480 loss_ml 1285.912 loss_target 0.622 loss_total 6.910 [2019-09-05 03:07:28]
Iter: [300/674] Freq 290.4 loss_source 0.025 loss_st 0.479 loss_ml 1282.544 loss_target 0.621 loss_total 6.938 [2019-09-05 03:09:37]
Iter: [400/674] Freq 290.5 loss_source 0.026 loss_st 0.478 loss_ml 1276.125 loss_target 0.622 loss_total 6.953 [2019-09-05 03:11:44]
Iter: [500/674] Freq 289.4 loss_source 0.027 loss_st 0.478 loss_ml 1275.980 loss_target 0.621 loss_total 6.995 [2019-09-05 03:13:53]
Iter: [600/674] Freq 289.3 loss_source 0.027 loss_st 0.478 loss_ml 1275.397 loss_target 0.622 loss_total 7.014 [2019-09-05 03:16:00]
Train loss_source 0.027 loss_st 0.478 loss_ml 1273.422 loss_target 0.622 loss_total 7.028
Test r1 63.539 r5 79.780 r10 85.659 MAP 37.330

==>>[2019-09-05 03:20:11] [Epoch=005/025] Stage 1, [Need: 05:04:16]
Iter: [000/674] Freq 120.3 loss_source 0.028 loss_st 0.477 loss_ml 1407.924 loss_target 0.615 loss_total 7.057 [2019-09-05 03:20:14]
Iter: [100/674] Freq 293.4 loss_source 0.022 loss_st 0.473 loss_ml 1260.428 loss_target 0.619 loss_total 6.701 [2019-09-05 03:22:18]
Iter: [200/674] Freq 293.6 loss_source 0.024 loss_st 0.471 loss_ml 1273.652 loss_target 0.619 loss_total 6.773 [2019-09-05 03:24:23]
Iter: [300/674] Freq 292.4 loss_source 0.025 loss_st 0.470 loss_ml 1270.580 loss_target 0.620 loss_total 6.818 [2019-09-05 03:26:30]
Iter: [400/674] Freq 292.6 loss_source 0.026 loss_st 0.470 loss_ml 1267.838 loss_target 0.619 loss_total 6.877 [2019-09-05 03:28:35]
Iter: [500/674] Freq 291.7 loss_source 0.027 loss_st 0.471 loss_ml 1261.826 loss_target 0.619 loss_total 6.938 [2019-09-05 03:30:43]
Iter: [600/674] Freq 291.6 loss_source 0.028 loss_st 0.472 loss_ml 1261.185 loss_target 0.619 loss_total 6.972 [2019-09-05 03:32:49]
Train loss_source 0.028 loss_st 0.472 loss_ml 1260.474 loss_target 0.619 loss_total 6.996
Test r1 64.163 r5 80.552 r10 86.461 MAP 38.031

==>>[2019-09-05 03:37:00] [Epoch=006/025] Stage 1, [Need: 04:54:09]
Iter: [000/674] Freq 120.7 loss_source 0.015 loss_st 0.446 loss_ml 1521.955 loss_target 0.611 loss_total 6.141 [2019-09-05 03:37:03]
Iter: [100/674] Freq 288.9 loss_source 0.018 loss_st 0.467 loss_ml 1240.012 loss_target 0.616 loss_total 6.423 [2019-09-05 03:39:09]
Iter: [200/674] Freq 290.6 loss_source 0.021 loss_st 0.465 loss_ml 1238.862 loss_target 0.616 loss_total 6.544 [2019-09-05 03:41:15]
Iter: [300/674] Freq 290.1 loss_source 0.023 loss_st 0.465 loss_ml 1236.390 loss_target 0.616 loss_total 6.647 [2019-09-05 03:43:22]
Iter: [400/674] Freq 290.2 loss_source 0.024 loss_st 0.465 loss_ml 1238.771 loss_target 0.616 loss_total 6.704 [2019-09-05 03:45:29]
Iter: [500/674] Freq 289.9 loss_source 0.025 loss_st 0.465 loss_ml 1239.958 loss_target 0.615 loss_total 6.752 [2019-09-05 03:47:36]
Iter: [600/674] Freq 290.6 loss_source 0.026 loss_st 0.465 loss_ml 1239.430 loss_target 0.615 loss_total 6.814 [2019-09-05 03:49:41]
Train loss_source 0.027 loss_st 0.466 loss_ml 1242.211 loss_target 0.615 loss_total 6.875
Test r1 65.202 r5 81.146 r10 86.728 MAP 38.972

==>>[2019-09-05 03:53:52] [Epoch=007/025] Stage 1, [Need: 04:42:14]
Iter: [000/674] Freq 126.2 loss_source 0.034 loss_st 0.453 loss_ml 1312.412 loss_target 0.606 loss_total 7.119 [2019-09-05 03:53:55]
Iter: [100/674] Freq 292.6 loss_source 0.026 loss_st 0.467 loss_ml 1238.827 loss_target 0.614 loss_total 6.831 [2019-09-05 03:55:59]
Iter: [200/674] Freq 291.3 loss_source 0.025 loss_st 0.463 loss_ml 1246.211 loss_target 0.613 loss_total 6.723 [2019-09-05 03:58:06]
Iter: [300/674] Freq 289.9 loss_source 0.025 loss_st 0.462 loss_ml 1241.678 loss_target 0.613 loss_total 6.710 [2019-09-05 04:00:14]
Iter: [400/674] Freq 290.1 loss_source 0.025 loss_st 0.461 loss_ml 1237.485 loss_target 0.613 loss_total 6.724 [2019-09-05 04:02:21]
Iter: [500/674] Freq 289.1 loss_source 0.026 loss_st 0.462 loss_ml 1231.633 loss_target 0.612 loss_total 6.782 [2019-09-05 04:04:30]
Iter: [600/674] Freq 289.1 loss_source 0.026 loss_st 0.461 loss_ml 1235.293 loss_target 0.612 loss_total 6.795 [2019-09-05 04:06:37]
Train loss_source 0.027 loss_st 0.461 loss_ml 1233.328 loss_target 0.612 loss_total 6.805
Test r1 65.202 r5 81.681 r10 87.292 MAP 38.911

==>>[2019-09-05 04:10:51] [Epoch=008/025] Stage 1, [Need: 04:29:18]
Iter: [000/674] Freq 125.3 loss_source 0.006 loss_st 0.439 loss_ml 1131.262 loss_target 0.587 loss_total 5.479 [2019-09-05 04:10:54]
Iter: [100/674] Freq 282.2 loss_source 0.021 loss_st 0.456 loss_ml 1229.263 loss_target 0.610 loss_total 6.494 [2019-09-05 04:13:03]
Iter: [200/674] Freq 279.1 loss_source 0.021 loss_st 0.455 loss_ml 1220.616 loss_target 0.610 loss_total 6.474 [2019-09-05 04:15:16]
Iter: [300/674] Freq 279.5 loss_source 0.022 loss_st 0.455 loss_ml 1230.407 loss_target 0.609 loss_total 6.521 [2019-09-05 04:17:27]
Iter: [400/674] Freq 282.9 loss_source 0.025 loss_st 0.455 loss_ml 1230.208 loss_target 0.609 loss_total 6.638 [2019-09-05 04:19:33]
Iter: [500/674] Freq 284.3 loss_source 0.025 loss_st 0.455 loss_ml 1226.861 loss_target 0.609 loss_total 6.675 [2019-09-05 04:21:40]
Iter: [600/674] Freq 285.9 loss_source 0.026 loss_st 0.455 loss_ml 1222.344 loss_target 0.608 loss_total 6.715 [2019-09-05 04:23:45]
Train loss_source 0.026 loss_st 0.455 loss_ml 1220.299 loss_target 0.608 loss_total 6.728
Test r1 65.707 r5 81.354 r10 86.847 MAP 38.892

==>>[2019-09-05 04:27:54] [Epoch=009/025] Stage 1, [Need: 04:15:38]
Iter: [000/674] Freq 124.6 loss_source 0.030 loss_st 0.459 loss_ml 1526.340 loss_target 0.627 loss_total 7.013 [2019-09-05 04:27:57]
Iter: [100/674] Freq 294.2 loss_source 0.021 loss_st 0.453 loss_ml 1221.126 loss_target 0.604 loss_total 6.409 [2019-09-05 04:30:00]
Iter: [200/674] Freq 294.5 loss_source 0.019 loss_st 0.451 loss_ml 1208.823 loss_target 0.605 loss_total 6.304 [2019-09-05 04:32:05]
Iter: [300/674] Freq 293.1 loss_source 0.020 loss_st 0.450 loss_ml 1213.510 loss_target 0.603 loss_total 6.356 [2019-09-05 04:34:12]
Iter: [400/674] Freq 293.3 loss_source 0.024 loss_st 0.450 loss_ml 1219.003 loss_target 0.602 loss_total 6.532 [2019-09-05 04:36:17]
Iter: [500/674] Freq 292.1 loss_source 0.025 loss_st 0.450 loss_ml 1214.814 loss_target 0.602 loss_total 6.577 [2019-09-05 04:38:25]
Iter: [600/674] Freq 292.2 loss_source 0.025 loss_st 0.450 loss_ml 1211.396 loss_target 0.601 loss_total 6.611 [2019-09-05 04:40:31]
Train loss_source 0.026 loss_st 0.450 loss_ml 1211.937 loss_target 0.601 loss_total 6.649
Test r1 65.885 r5 82.126 r10 87.648 MAP 39.534

==>>[2019-09-05 04:44:40] [Epoch=010/025] Stage 1, [Need: 04:00:52]
Iter: [000/674] Freq 121.9 loss_source 0.009 loss_st 0.431 loss_ml 1214.655 loss_target 0.583 loss_total 5.600 [2019-09-05 04:44:43]
Iter: [100/674] Freq 293.9 loss_source 0.022 loss_st 0.446 loss_ml 1200.431 loss_target 0.594 loss_total 6.387 [2019-09-05 04:46:47]
Iter: [200/674] Freq 294.1 loss_source 0.022 loss_st 0.446 loss_ml 1210.144 loss_target 0.594 loss_total 6.392 [2019-09-05 04:48:52]
Iter: [300/674] Freq 292.5 loss_source 0.022 loss_st 0.446 loss_ml 1204.591 loss_target 0.593 loss_total 6.383 [2019-09-05 04:50:59]
Iter: [400/674] Freq 292.5 loss_source 0.023 loss_st 0.446 loss_ml 1204.231 loss_target 0.592 loss_total 6.433 [2019-09-05 04:53:05]
Iter: [500/674] Freq 291.7 loss_source 0.025 loss_st 0.447 loss_ml 1200.560 loss_target 0.591 loss_total 6.525 [2019-09-05 04:55:12]
Iter: [600/674] Freq 291.7 loss_source 0.025 loss_st 0.447 loss_ml 1200.708 loss_target 0.591 loss_total 6.576 [2019-09-05 04:57:18]
Train loss_source 0.026 loss_st 0.448 loss_ml 1201.046 loss_target 0.590 loss_total 6.623
Test r1 64.994 r5 80.909 r10 86.728 MAP 38.601

==>>[2019-09-05 05:01:27] [Epoch=011/025] Stage 1, [Need: 03:45:44]
Iter: [000/674] Freq 120.5 loss_source 0.050 loss_st 0.451 loss_ml 1152.694 loss_target 0.597 loss_total 7.837 [2019-09-05 05:01:30]
Iter: [100/674] Freq 288.2 loss_source 0.021 loss_st 0.441 loss_ml 1216.046 loss_target 0.581 loss_total 6.263 [2019-09-05 05:03:36]
Iter: [200/674] Freq 285.5 loss_source 0.021 loss_st 0.440 loss_ml 1199.380 loss_target 0.582 loss_total 6.260 [2019-09-05 05:05:46]
Iter: [300/674] Freq 283.4 loss_source 0.022 loss_st 0.441 loss_ml 1190.041 loss_target 0.580 loss_total 6.325 [2019-09-05 05:07:58]
Iter: [400/674] Freq 283.3 loss_source 0.024 loss_st 0.442 loss_ml 1191.328 loss_target 0.579 loss_total 6.427 [2019-09-05 05:10:07]
Iter: [500/674] Freq 282.6 loss_source 0.024 loss_st 0.443 loss_ml 1192.429 loss_target 0.578 loss_total 6.466 [2019-09-05 05:12:19]
Iter: [600/674] Freq 283.4 loss_source 0.025 loss_st 0.444 loss_ml 1190.745 loss_target 0.576 loss_total 6.510 [2019-09-05 05:14:27]
Train loss_source 0.026 loss_st 0.444 loss_ml 1190.258 loss_target 0.575 loss_total 6.540
Test r1 65.202 r5 80.819 r10 86.372 MAP 38.904

==>>[2019-09-05 05:18:43] [Epoch=012/025] Stage 1, [Need: 03:30:46]
Iter: [000/674] Freq 125.5 loss_source 0.008 loss_st 0.459 loss_ml 1381.678 loss_target 0.535 loss_total 5.825 [2019-09-05 05:18:46]
Iter: [100/674] Freq 290.4 loss_source 0.017 loss_st 0.438 loss_ml 1204.479 loss_target 0.564 loss_total 6.044 [2019-09-05 05:20:51]
Iter: [200/674] Freq 289.9 loss_source 0.020 loss_st 0.439 loss_ml 1182.258 loss_target 0.560 loss_total 6.198 [2019-09-05 05:22:59]
Iter: [300/674] Freq 288.4 loss_source 0.022 loss_st 0.439 loss_ml 1178.806 loss_target 0.559 loss_total 6.303 [2019-09-05 05:25:07]
Iter: [400/674] Freq 288.2 loss_source 0.024 loss_st 0.439 loss_ml 1175.696 loss_target 0.557 loss_total 6.380 [2019-09-05 05:27:15]
Iter: [500/674] Freq 287.1 loss_source 0.025 loss_st 0.440 loss_ml 1179.934 loss_target 0.556 loss_total 6.461 [2019-09-05 05:29:26]
Iter: [600/674] Freq 287.2 loss_source 0.026 loss_st 0.440 loss_ml 1179.087 loss_target 0.554 loss_total 6.490 [2019-09-05 05:31:33]
Train loss_source 0.026 loss_st 0.441 loss_ml 1178.587 loss_target 0.553 loss_total 6.519
Test r1 64.816 r5 79.958 r10 86.342 MAP 38.208

==>>[2019-09-05 05:35:50] [Epoch=013/025] Stage 1, [Need: 03:15:22]
Iter: [000/674] Freq 121.3 loss_source 0.004 loss_st 0.433 loss_ml 1046.657 loss_target 0.523 loss_total 5.237 [2019-09-05 05:35:53]
Iter: [100/674] Freq 287.9 loss_source 0.018 loss_st 0.437 loss_ml 1183.285 loss_target 0.540 loss_total 6.038 [2019-09-05 05:37:59]
Iter: [200/674] Freq 289.3 loss_source 0.019 loss_st 0.436 loss_ml 1186.459 loss_target 0.539 loss_total 6.102 [2019-09-05 05:40:05]
Iter: [300/674] Freq 287.6 loss_source 0.021 loss_st 0.435 loss_ml 1181.144 loss_target 0.534 loss_total 6.193 [2019-09-05 05:42:15]
Iter: [400/674] Freq 288.0 loss_source 0.022 loss_st 0.435 loss_ml 1176.627 loss_target 0.532 loss_total 6.220 [2019-09-05 05:44:22]
Iter: [500/674] Freq 287.1 loss_source 0.023 loss_st 0.435 loss_ml 1174.361 loss_target 0.530 loss_total 6.259 [2019-09-05 05:46:32]
Iter: [600/674] Freq 287.3 loss_source 0.024 loss_st 0.436 loss_ml 1172.166 loss_target 0.528 loss_total 6.316 [2019-09-05 05:48:40]
Train loss_source 0.025 loss_st 0.437 loss_ml 1175.895 loss_target 0.527 loss_total 6.379
Test r1 63.124 r5 80.048 r10 85.540 MAP 37.708

==>>[2019-09-05 05:52:56] [Epoch=014/025] Stage 1, [Need: 02:59:43]
Iter: [000/674] Freq 121.3 loss_source 0.008 loss_st 0.449 loss_ml 1184.977 loss_target 0.475 loss_total 5.627 [2019-09-05 05:52:59]
Iter: [100/674] Freq 289.2 loss_source 0.023 loss_st 0.439 loss_ml 1170.653 loss_target 0.513 loss_total 6.309 [2019-09-05 05:55:05]
Iter: [200/674] Freq 289.8 loss_source 0.022 loss_st 0.438 loss_ml 1172.782 loss_target 0.514 loss_total 6.241 [2019-09-05 05:57:11]
Iter: [300/674] Freq 287.8 loss_source 0.022 loss_st 0.436 loss_ml 1166.921 loss_target 0.515 loss_total 6.233 [2019-09-05 05:59:21]
Iter: [400/674] Freq 287.9 loss_source 0.024 loss_st 0.436 loss_ml 1168.387 loss_target 0.514 loss_total 6.305 [2019-09-05 06:01:29]
Iter: [500/674] Freq 287.4 loss_source 0.026 loss_st 0.437 loss_ml 1166.607 loss_target 0.512 loss_total 6.401 [2019-09-05 06:03:38]
Iter: [600/674] Freq 285.2 loss_source 0.026 loss_st 0.438 loss_ml 1168.886 loss_target 0.510 loss_total 6.442 [2019-09-05 06:05:51]
Train loss_source 0.027 loss_st 0.439 loss_ml 1171.150 loss_target 0.509 loss_total 6.479
Test r1 63.747 r5 79.780 r10 85.303 MAP 37.723

==>>[2019-09-05 06:10:23] [Epoch=015/025] Stage 1, [Need: 02:44:01]
Iter: [000/674] Freq 121.3 loss_source 0.038 loss_st 0.444 loss_ml 1147.247 loss_target 0.469 loss_total 7.057 [2019-09-05 06:10:26]
Iter: [100/674] Freq 288.5 loss_source 0.023 loss_st 0.440 loss_ml 1152.055 loss_target 0.494 loss_total 6.292 [2019-09-05 06:12:32]
Iter: [200/674] Freq 289.7 loss_source 0.022 loss_st 0.436 loss_ml 1149.978 loss_target 0.489 loss_total 6.159 [2019-09-05 06:14:39]
Iter: [300/674] Freq 287.8 loss_source 0.021 loss_st 0.433 loss_ml 1146.944 loss_target 0.490 loss_total 6.077 [2019-09-05 06:16:48]
Iter: [400/674] Freq 288.1 loss_source 0.020 loss_st 0.431 loss_ml 1144.225 loss_target 0.490 loss_total 6.014 [2019-09-05 06:18:56]
Iter: [500/674] Freq 285.8 loss_source 0.019 loss_st 0.429 loss_ml 1148.020 loss_target 0.489 loss_total 5.974 [2019-09-05 06:21:08]
Iter: [600/674] Freq 283.7 loss_source 0.019 loss_st 0.428 loss_ml 1151.272 loss_target 0.490 loss_total 5.939 [2019-09-05 06:23:23]
Train loss_source 0.019 loss_st 0.427 loss_ml 1149.527 loss_target 0.489 loss_total 5.914
Test r1 65.529 r5 81.473 r10 86.847 MAP 39.189

==>>[2019-09-05 06:27:57] [Epoch=016/025] Stage 1, [Need: 02:28:15]
Iter: [000/674] Freq 114.9 loss_source 0.010 loss_st 0.408 loss_ml 1023.865 loss_target 0.487 loss_total 5.287 [2019-09-05 06:28:00]
Iter: [100/674] Freq 273.6 loss_source 0.014 loss_st 0.419 loss_ml 1140.247 loss_target 0.486 loss_total 5.621 [2019-09-05 06:30:13]
Iter: [200/674] Freq 279.8 loss_source 0.015 loss_st 0.418 loss_ml 1145.540 loss_target 0.482 loss_total 5.645 [2019-09-05 06:32:21]
Iter: [300/674] Freq 280.0 loss_source 0.015 loss_st 0.416 loss_ml 1137.363 loss_target 0.483 loss_total 5.619 [2019-09-05 06:34:32]
Iter: [400/674] Freq 281.2 loss_source 0.015 loss_st 0.416 loss_ml 1135.816 loss_target 0.483 loss_total 5.630 [2019-09-05 06:36:42]
Iter: [500/674] Freq 280.8 loss_source 0.015 loss_st 0.415 loss_ml 1138.010 loss_target 0.482 loss_total 5.614 [2019-09-05 06:38:54]
Iter: [600/674] Freq 281.7 loss_source 0.015 loss_st 0.415 loss_ml 1137.984 loss_target 0.482 loss_total 5.598 [2019-09-05 06:41:02]
Train loss_source 0.015 loss_st 0.414 loss_ml 1139.852 loss_target 0.481 loss_total 5.597
Test r1 65.677 r5 81.740 r10 86.847 MAP 39.087

==>>[2019-09-05 06:45:23] [Epoch=017/025] Stage 1, [Need: 02:12:17]
Iter: [000/674] Freq 118.8 loss_source 0.016 loss_st 0.428 loss_ml 926.788 loss_target 0.492 loss_total 5.752 [2019-09-05 06:45:26]
Iter: [100/674] Freq 288.4 loss_source 0.013 loss_st 0.412 loss_ml 1132.163 loss_target 0.475 loss_total 5.466 [2019-09-05 06:47:32]
Iter: [200/674] Freq 288.4 loss_source 0.013 loss_st 0.411 loss_ml 1135.591 loss_target 0.476 loss_total 5.464 [2019-09-05 06:49:40]
Iter: [300/674] Freq 287.5 loss_source 0.013 loss_st 0.412 loss_ml 1139.598 loss_target 0.477 loss_total 5.456 [2019-09-05 06:51:49]
Iter: [400/674] Freq 287.7 loss_source 0.013 loss_st 0.411 loss_ml 1139.779 loss_target 0.477 loss_total 5.465 [2019-09-05 06:53:56]
Iter: [500/674] Freq 287.1 loss_source 0.013 loss_st 0.411 loss_ml 1136.897 loss_target 0.476 loss_total 5.452 [2019-09-05 06:56:05]
Iter: [600/674] Freq 287.1 loss_source 0.013 loss_st 0.411 loss_ml 1135.780 loss_target 0.476 loss_total 5.450 [2019-09-05 06:58:14]
Train loss_source 0.013 loss_st 0.411 loss_ml 1135.029 loss_target 0.476 loss_total 5.450
Test r1 65.796 r5 81.413 r10 86.847 MAP 39.089

==>>[2019-09-05 07:02:37] [Epoch=018/025] Stage 1, [Need: 01:56:00]
Iter: [000/674] Freq 118.5 loss_source 0.013 loss_st 0.391 loss_ml 1130.145 loss_target 0.454 loss_total 5.232 [2019-09-05 07:02:40]
Iter: [100/674] Freq 286.3 loss_source 0.010 loss_st 0.408 loss_ml 1139.121 loss_target 0.475 loss_total 5.296 [2019-09-05 07:04:46]
Iter: [200/674] Freq 286.7 loss_source 0.011 loss_st 0.408 loss_ml 1135.745 loss_target 0.473 loss_total 5.325 [2019-09-05 07:06:55]
Iter: [300/674] Freq 285.1 loss_source 0.011 loss_st 0.408 loss_ml 1133.262 loss_target 0.472 loss_total 5.309 [2019-09-05 07:09:05]
Iter: [400/674] Freq 285.0 loss_source 0.011 loss_st 0.407 loss_ml 1134.550 loss_target 0.472 loss_total 5.330 [2019-09-05 07:11:14]
Iter: [500/674] Freq 285.2 loss_source 0.011 loss_st 0.407 loss_ml 1133.445 loss_target 0.472 loss_total 5.330 [2019-09-05 07:13:23]
Iter: [600/674] Freq 285.8 loss_source 0.011 loss_st 0.408 loss_ml 1134.646 loss_target 0.471 loss_total 5.348 [2019-09-05 07:15:30]
Train loss_source 0.012 loss_st 0.408 loss_ml 1132.227 loss_target 0.471 loss_total 5.360
Test r1 65.321 r5 81.354 r10 86.401 MAP 39.000

==>>[2019-09-05 07:19:47] [Epoch=019/025] Stage 1, [Need: 01:39:39]
Iter: [000/674] Freq 118.9 loss_source 0.003 loss_st 0.406 loss_ml 1205.886 loss_target 0.482 loss_total 4.934 [2019-09-05 07:19:50]
Iter: [100/674] Freq 287.4 loss_source 0.010 loss_st 0.406 loss_ml 1120.959 loss_target 0.463 loss_total 5.233 [2019-09-05 07:21:57]
Iter: [200/674] Freq 287.3 loss_source 0.010 loss_st 0.406 loss_ml 1130.164 loss_target 0.465 loss_total 5.279 [2019-09-05 07:24:05]
Iter: [300/674] Freq 285.9 loss_source 0.011 loss_st 0.407 loss_ml 1127.780 loss_target 0.468 loss_total 5.315 [2019-09-05 07:26:15]
Iter: [400/674] Freq 286.0 loss_source 0.011 loss_st 0.407 loss_ml 1125.763 loss_target 0.467 loss_total 5.314 [2019-09-05 07:28:23]
Iter: [500/674] Freq 285.6 loss_source 0.011 loss_st 0.406 loss_ml 1123.133 loss_target 0.467 loss_total 5.310 [2019-09-05 07:30:33]
Iter: [600/674] Freq 285.9 loss_source 0.011 loss_st 0.406 loss_ml 1123.983 loss_target 0.468 loss_total 5.310 [2019-09-05 07:32:41]
Train loss_source 0.011 loss_st 0.406 loss_ml 1126.300 loss_target 0.468 loss_total 5.306
Test r1 65.855 r5 81.591 r10 86.550 MAP 38.904

==>>[2019-09-05 07:37:00] [Epoch=020/025] Stage 1, [Need: 01:23:11]
Iter: [000/674] Freq 119.3 loss_source 0.002 loss_st 0.397 loss_ml 1192.137 loss_target 0.540 loss_total 4.833 [2019-09-05 07:37:03]
Iter: [100/674] Freq 288.4 loss_source 0.010 loss_st 0.405 loss_ml 1132.062 loss_target 0.465 loss_total 5.265 [2019-09-05 07:39:09]
Iter: [200/674] Freq 288.9 loss_source 0.011 loss_st 0.406 loss_ml 1127.318 loss_target 0.466 loss_total 5.292 [2019-09-05 07:41:16]
Iter: [300/674] Freq 286.7 loss_source 0.011 loss_st 0.405 loss_ml 1132.796 loss_target 0.465 loss_total 5.266 [2019-09-05 07:43:26]
Iter: [400/674] Freq 287.4 loss_source 0.010 loss_st 0.405 loss_ml 1129.447 loss_target 0.465 loss_total 5.256 [2019-09-05 07:45:33]
Iter: [500/674] Freq 286.7 loss_source 0.011 loss_st 0.404 loss_ml 1131.136 loss_target 0.463 loss_total 5.264 [2019-09-05 07:47:43]
Iter: [600/674] Freq 286.8 loss_source 0.011 loss_st 0.404 loss_ml 1128.841 loss_target 0.464 loss_total 5.273 [2019-09-05 07:49:51]
Train loss_source 0.011 loss_st 0.405 loss_ml 1126.117 loss_target 0.462 loss_total 5.279
Test r1 65.261 r5 81.621 r10 86.283 MAP 38.840

==>>[2019-09-05 07:54:11] [Epoch=021/025] Stage 1, [Need: 01:06:39]
Iter: [000/674] Freq 114.5 loss_source 0.009 loss_st 0.394 loss_ml 1029.812 loss_target 0.507 loss_total 5.103 [2019-09-05 07:54:14]
Iter: [100/674] Freq 284.6 loss_source 0.009 loss_st 0.404 loss_ml 1122.506 loss_target 0.459 loss_total 5.192 [2019-09-05 07:56:21]
Iter: [200/674] Freq 285.0 loss_source 0.009 loss_st 0.403 loss_ml 1123.324 loss_target 0.458 loss_total 5.158 [2019-09-05 07:58:30]
Iter: [300/674] Freq 283.4 loss_source 0.009 loss_st 0.403 loss_ml 1120.023 loss_target 0.460 loss_total 5.168 [2019-09-05 08:00:42]
Iter: [400/674] Freq 284.4 loss_source 0.009 loss_st 0.403 loss_ml 1119.060 loss_target 0.459 loss_total 5.179 [2019-09-05 08:02:50]
Iter: [500/674] Freq 284.2 loss_source 0.009 loss_st 0.404 loss_ml 1118.471 loss_target 0.461 loss_total 5.185 [2019-09-05 08:05:00]
Iter: [600/674] Freq 284.7 loss_source 0.010 loss_st 0.403 loss_ml 1120.585 loss_target 0.461 loss_total 5.195 [2019-09-05 08:07:08]
Train loss_source 0.010 loss_st 0.403 loss_ml 1124.796 loss_target 0.461 loss_total 5.202
Test r1 65.261 r5 81.324 r10 86.669 MAP 38.928

==>>[2019-09-05 08:11:22] [Epoch=022/025] Stage 1, [Need: 00:50:04]
Iter: [000/674] Freq 119.4 loss_source 0.003 loss_st 0.412 loss_ml 1158.009 loss_target 0.477 loss_total 4.961 [2019-09-05 08:11:25]
Iter: [100/674] Freq 284.2 loss_source 0.010 loss_st 0.404 loss_ml 1116.520 loss_target 0.461 loss_total 5.221 [2019-09-05 08:13:33]
Iter: [200/674] Freq 284.6 loss_source 0.010 loss_st 0.404 loss_ml 1122.617 loss_target 0.462 loss_total 5.221 [2019-09-05 08:15:42]
Iter: [300/674] Freq 282.8 loss_source 0.010 loss_st 0.403 loss_ml 1124.905 loss_target 0.462 loss_total 5.227 [2019-09-05 08:17:53]
Iter: [400/674] Freq 282.8 loss_source 0.010 loss_st 0.403 loss_ml 1125.443 loss_target 0.464 loss_total 5.209 [2019-09-05 08:20:04]
Iter: [500/674] Freq 281.5 loss_source 0.010 loss_st 0.403 loss_ml 1123.060 loss_target 0.462 loss_total 5.214 [2019-09-05 08:22:17]
Iter: [600/674] Freq 282.1 loss_source 0.010 loss_st 0.403 loss_ml 1122.078 loss_target 0.461 loss_total 5.203 [2019-09-05 08:24:26]
Train loss_source 0.010 loss_st 0.403 loss_ml 1126.590 loss_target 0.462 loss_total 5.196
Test r1 65.439 r5 81.740 r10 86.461 MAP 39.126

==>>[2019-09-05 08:28:43] [Epoch=023/025] Stage 1, [Need: 00:33:26]
Iter: [000/674] Freq 116.7 loss_source 0.030 loss_st 0.401 loss_ml 1057.092 loss_target 0.471 loss_total 6.201 [2019-09-05 08:28:46]
Iter: [100/674] Freq 290.0 loss_source 0.010 loss_st 0.402 loss_ml 1119.338 loss_target 0.465 loss_total 5.234 [2019-09-05 08:30:51]
Iter: [200/674] Freq 289.9 loss_source 0.010 loss_st 0.403 loss_ml 1123.643 loss_target 0.461 loss_total 5.213 [2019-09-05 08:32:58]
Iter: [300/674] Freq 287.6 loss_source 0.009 loss_st 0.402 loss_ml 1118.416 loss_target 0.462 loss_total 5.178 [2019-09-05 08:35:08]
Iter: [400/674] Freq 287.4 loss_source 0.009 loss_st 0.403 loss_ml 1120.519 loss_target 0.463 loss_total 5.187 [2019-09-05 08:37:17]
Iter: [500/674] Freq 286.3 loss_source 0.010 loss_st 0.403 loss_ml 1125.605 loss_target 0.463 loss_total 5.194 [2019-09-05 08:39:27]
Iter: [600/674] Freq 286.5 loss_source 0.010 loss_st 0.403 loss_ml 1124.774 loss_target 0.462 loss_total 5.189 [2019-09-05 08:41:35]
Train loss_source 0.009 loss_st 0.403 loss_ml 1127.991 loss_target 0.463 loss_total 5.181
Test r1 65.796 r5 81.651 r10 86.639 MAP 39.078

The parameters are the same as yours, except for the number of training epochs, which may affect the LR scheduler. Another big difference I found is the reference dataset: I downloaded MSMT17_V2, which may differ from the version you used. Since the MSMT17 download link is broken, verifying the difference between the two versions is unrealistic. I guess this dataset mismatch leads to the degraded results.
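
To illustrate the epoch-count point above: if the learning-rate schedule is derived from the total number of epochs (an assumption here, not something I have verified in the released code), then training for 25 epochs instead of 20 shifts the decay point. A minimal PyTorch sketch, with a made-up decay rule purely for illustration:

```python
import torch

def make_scheduler(optimizer, total_epochs):
    # Hypothetical rule: decay the lr by 10x after two thirds of training.
    # NOT taken from the released code; it only shows how a milestone tied
    # to the total epoch count moves when `epochs` goes from 20 to 25.
    milestone = int(total_epochs * 2 / 3)
    return torch.optim.lr_scheduler.MultiStepLR(
        optimizer, milestones=[milestone], gamma=0.1)

for total_epochs in (20, 25):
    model = torch.nn.Linear(8, 8)                             # toy model
    optimizer = torch.optim.SGD(model.parameters(), lr=2e-4)  # lr from the logged options
    scheduler = make_scheduler(optimizer, total_epochs)
    lrs = []
    for _ in range(total_epochs):
        lrs.append(optimizer.param_groups[0]["lr"])
        optimizer.step()    # a real run would do one epoch of training here
        scheduler.step()
    first_drop = next(i for i, lr in enumerate(lrs) if lr < 2e-4)
    print(f"epochs={total_epochs}: lr first drops at epoch {first_drop}")
```

If the released scheduler depends on the epochs option in a similar way, rerunning with epochs: 20 would be the cheapest way to rule this factor out.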

Thanks again for your kind reply!

@KovenYu
Owner

KovenYu commented Sep 8, 2019

Yeah, probably due to dataset shift. Anyway, a difference of less than 1.5%/1% in rank-1/MAP may be expected, since statistical variation could shift the optimal parameters.

@KovenYu KovenYu closed this as completed Sep 8, 2019
@geyutang
Author

geyutang commented Sep 9, 2019

Got it, thanks.
