Hi! When I train with

```
fxml.py delicious.model deliciousLarge_train.txt --standard-dataset --verbose train --iters 5 --trees 20 --label-weight propensity --alpha 1e-4 --leaf-classifiers --no-remap-labels
```

and then run inference with

```
fxml.py delicious.model deliciousLarge_test.txt --standard-dataset inference
```

what final result should I expect?
I get very low scores, like these:

```
P@1:     0.4287878787878788
P@3:     0.38484848484848483
P@5:     0.3565656565656566
NDCG@1:  0.4287878787878788
NDCG@3:  0.3957641604407556
NDCG@5:  0.3747889067931902
pNDCG@1: 0.45656907373737377
pNDCG@3: 0.41984043916568803
pNDCG@5: 0.396182084153063
```
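For context on what these numbers measure: P@k is the fraction of the top-k predicted labels that are relevant, and NDCG@k discounts hits by rank and normalizes by the best achievable ranking (pNDCG is the propensity-weighted variant used for extreme classification, not shown here). A minimal sketch of the standard, unweighted definitions, averaged per test instance; the function names and toy data are my own, not part of fastxml:

```python
import math

def precision_at_k(true_labels, ranked_preds, k):
    """Fraction of the top-k predicted labels that are relevant."""
    return sum(1 for lbl in ranked_preds[:k] if lbl in true_labels) / k

def ndcg_at_k(true_labels, ranked_preds, k):
    """DCG of the top-k predictions, normalized by the ideal DCG."""
    dcg = sum(1.0 / math.log2(i + 2)
              for i, lbl in enumerate(ranked_preds[:k]) if lbl in true_labels)
    # Ideal DCG: all relevant labels ranked first, capped at k.
    ideal = sum(1.0 / math.log2(i + 2)
                for i in range(min(k, len(true_labels))))
    return dcg / ideal if ideal > 0 else 0.0

# Toy example (hypothetical labels): 3 relevant labels, one ranked prediction list.
true_labels = {"a", "b", "c"}
preds = ["a", "x", "b", "y", "c"]
print(precision_at_k(true_labels, preds, 3))  # 2 of the top 3 hits are relevant
print(ndcg_at_k(true_labels, preds, 3))
```

Over a test set, these per-instance values are averaged, which is how scores like P@1 ≈ 0.43 above arise.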