
predict_win and predict_rank do not work on 2x2 and more games #124

Closed
JustRoxy opened this issue Feb 3, 2024 · 1 comment · Fixed by #125
Labels
bug Something isn't working

Comments

JustRoxy commented Feb 3, 2024

Describe the bug
predict_win and predict_rank do not work properly on 3v3v3 games

To Reproduce
Step 1:

from openskill.models import PlackettLuce

model = PlackettLuce()

p1 = model.rating(mu=34, sigma=0.25)
p2 = model.rating(mu=34, sigma=0.25)
p3 = model.rating(mu=34, sigma=0.25)

p4 = model.rating(mu=32, sigma=0.5)
p5 = model.rating(mu=32, sigma=0.5)
p6 = model.rating(mu=32, sigma=0.5)

p7 = model.rating(mu=30, sigma=1)
p8 = model.rating(mu=30, sigma=1)
p9 = model.rating(mu=30, sigma=1)

team1, team2, team3 = [p1, p2, p3], [p4, p5, p6], [p7, p8, p9]

r = model.predict_win([team1, team2, team3])
print(r)

Results in:
[0.439077174955099, 0.3330210112526078, 0.2279018137922932]

Step 2, change p9 mu to 40:

from openskill.models import PlackettLuce

model = PlackettLuce()

p1 = model.rating(mu=34, sigma=0.25)
p2 = model.rating(mu=34, sigma=0.25)
p3 = model.rating(mu=34, sigma=0.25)

p4 = model.rating(mu=32, sigma=0.5)
p5 = model.rating(mu=32, sigma=0.5)
p6 = model.rating(mu=32, sigma=0.5)

p7 = model.rating(mu=30, sigma=1)
p8 = model.rating(mu=30, sigma=1)
p9 = model.rating(mu=40, sigma=1)

team1, team2, team3 = [p1, p2, p3], [p4, p5, p6], [p7, p8, p9]

print([team1, team2, team3])
r = model.predict_win([team1, team2, team3])
print(r)

Results are the same:
[0.439077174955099, 0.3330210112526078, 0.2279018137922932]

Expected behavior
After increasing p9's mu, team3 is expected to have a bigger chance of victory.

Platform Information

  • openskill.py Version: 5.1.0

Additional context
https://github.com/OpenDebates/openskill.py/blob/f76df19c3e388f31050c988a0059367bd1dadc76/openskill/models/weng_lin/bradley_terry_full.py#L765

I have no idea what is going on here, or why it selects only the first player's rating, but it does not work as intended.
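For reference, here is a minimal sketch (not the library's actual code) of how Weng-Lin-style win prediction typically aggregates team ratings, so that every member's rating contributes rather than only the first player's. The `BETA` default, the plain-dict rating representation, and `predict_win_sketch` are assumptions for illustration only:

```python
import itertools
import math
from statistics import NormalDist

# Assumed default; OpenSkill-style models commonly use beta = 25 / 6.
BETA = 25 / 6


def team_stats(team):
    """Aggregate a team's skill: sum of member mus and of sigma squared."""
    mu = sum(p["mu"] for p in team)
    sigma_sq = sum(p["sigma"] ** 2 for p in team)
    return mu, sigma_sq


def predict_win_sketch(teams):
    """Pairwise win probabilities per team, normalized to sum to 1."""
    stats = [team_stats(t) for t in teams]
    sizes = [len(t) for t in teams]
    wins = [0.0] * len(teams)
    for i, j in itertools.permutations(range(len(teams)), 2):
        mu_i, s2_i = stats[i]
        mu_j, s2_j = stats[j]
        # Uncertainty of the matchup: per-player noise plus both teams' sigmas.
        denom = math.sqrt((sizes[i] + sizes[j]) * BETA ** 2 + s2_i + s2_j)
        wins[i] += NormalDist().cdf((mu_i - mu_j) / denom)
    total = sum(wins)
    return [w / total for w in wins]
```

With this aggregation, raising p9's mu from 30 to 40 raises team3's combined mu from 90 to 100, so its predicted win probability increases (and overtakes team2's), which is the behavior the reproduction above expects.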

@vivekjoshy vivekjoshy added the bug Something isn't working label Feb 3, 2024
@vivekjoshy
Owner

It seems I forgot to copy over the original code before the API overhaul. Amazing how it went unnoticed for so long.

vivekjoshy added a commit that referenced this issue Feb 3, 2024
Signed-off-by: Vivek Joshy <8206808+vivekjoshy@users.noreply.github.com>
@vivekjoshy vivekjoshy mentioned this issue Feb 3, 2024
vivekjoshy added a commit that referenced this issue Feb 3, 2024
* Fixed #124
* Update dependencies
* Format with black
* Add changelog fragment

Signed-off-by: Vivek Joshy <8206808+vivekjoshy@users.noreply.github.com>