A bug while setting weight_factor #10

Closed
memray opened this issue Jul 2, 2019 · 3 comments

memray commented Jul 2, 2019

Hi @Diego999,

I found that if I set a different weight_factor, say change it from 1.0 to 1.2, both rouge-l and rouge-w change. The rouge-l score with w=1.2 is hugely different from the Perl version, while with w=1.0 it is close. If I understand correctly, weight_factor should have no effect on rouge-l (no reweighting applies there), so I suspect a bug in the rouge-l implementation, though I haven't located it yet. A minimal sketch of how I'm checking this is below.
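Here is roughly what I ran (the hypothesis/reference strings are made up, and I'm assuming the constructor and `get_scores` arguments from the README; adjust if the API differs):

```python
import rouge  # py-rouge

hypothesis = "the cat was found under the big funny bed"
reference = "the cat was under the bed"

for w in (1.0, 1.2):
    evaluator = rouge.Rouge(metrics=['rouge-l', 'rouge-w'],
                            weight_factor=w,
                            apply_avg=True)
    scores = evaluator.get_scores([hypothesis], [reference])
    # Expected: only rouge-w should move with w; rouge-l should stay fixed.
    print(w, scores['rouge-l'], scores['rouge-w'])
```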

I was also comparing your implementation's scores against the Perl version; per my observation they are very similar (only slight differences). Good implementation!

Thanks,
Rui


memray commented Jul 2, 2019

Changing line 614 and line 625 from

```python
score = Rouge._compute_p_r_f_score(total_hypothesis_ngrams_count, total_reference_ngrams_count, total_ngrams_overlapping_count, self.alpha, self.weight_factor)
```

to

```python
score = Rouge._compute_p_r_f_score(total_hypothesis_ngrams_count, total_reference_ngrams_count, total_ngrams_overlapping_count, self.alpha, self.weight_factor if use_w else 1.0)
```

will fix it.
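For anyone wondering why the `use_w` guard matters: ROUGE-W inverts f(x) = x**w when turning the weighted LCS into P/R (Lin, 2004), so I assume `_compute_p_r_f_score` applies a `** (1 / weight_factor)` exponent whenever `weight_factor != 1.0`. A standalone sketch of that arithmetic (not the repo's exact code) shows how passing `self.weight_factor` unconditionally skews the rouge-l scores:

```python
# Standalone sketch, not the repo's exact helper.
def p_r_f(hyp_count, ref_count, overlap, alpha=0.5, weight_factor=1.0):
    precision = overlap / hyp_count if hyp_count else 0.0
    recall = overlap / ref_count if ref_count else 0.0
    if weight_factor != 1.0:
        # Inverse of f(x) = x**w from the ROUGE-W definition; correct for
        # rouge-w, but it must not be applied to plain rouge-l counts.
        precision = precision ** (1.0 / weight_factor)
        recall = recall ** (1.0 / weight_factor)
    denom = (1.0 - alpha) * precision + alpha * recall
    f1 = precision * recall / denom if denom else 0.0
    return precision, recall, f1

print(p_r_f(7, 6, 5))                     # rouge-l as intended
print(p_r_f(7, 6, 5, weight_factor=1.2))  # rouge-l with the bug applied
```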

Diego999 commented

Hi @memray

Thank you for your message. A pull request is welcome :-)


Diego999 commented Jul 10, 2019

Thanks!
