
predict-prob always giving 1.00001 as a result for any input #925

DiegoZoracKy opened this issue Oct 9, 2019 · 2 comments



@DiegoZoracKy DiegoZoracKy commented Oct 9, 2019

Although a train/test split evaluation shows high accuracy for this case (~90%), something seems to be going wrong with the predict-prob method: it always answers 1.00001 for any input, even nonsense inputs like "haha", "fasttext", or "1234567890".

Training params:
-lr 1.0 -epoch 25 -wordNgrams 2

Training data:
Read 243M words
Number of words: 2273553
Number of labels: 1427
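
For reference, the commands involved would have been roughly of this shape (the file names `train.txt` and `model` here are hypothetical placeholders, not taken from the report):

```shell
# Train a supervised classifier with the parameters reported above
./fasttext supervised -input train.txt -output model \
    -lr 1.0 -epoch 25 -wordNgrams 2

# Print the top label and its probability for each line read from stdin
# ("-" means stdin, "1" asks for the single most likely label).
# This is the call that reportedly returns 1.00001 for every input:
echo "haha" | ./fasttext predict-prob model.bin - 1
```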

It's worth noting that I've trained other models, and so far this is the only case where I've encountered this weirdness.



@Celebio Celebio commented Oct 10, 2019

Hi @DiegoZoracKy ,
Thank you for the feedback.

What do you get when you copy/paste some text from your test set to predict-prob?




@DiegoZoracKy DiegoZoracKy commented Oct 10, 2019

Same value. It gives 1.00001 for any input, including text copied from the test set.
