
[Feature Request] Add support for logprobs to the mlx_lm server #791

Open
chimezie opened this issue May 21, 2024 · 3 comments

@chimezie
Contributor

I have started a PR to lm-evaluation-harness to support MLX models, but I have become bogged down reimplementing details that are already handled in mlx_lm. As I have become more familiar with that framework, it seems all we really need to evaluate MLX models through the OpenAI-compatible API is to add logprobs support to the current server infrastructure.

This would greatly shorten the feedback loop of training models with MLX and then evaluating them in a semi-standard way against other models.

I would be happy to contribute a PR, but I need some pointers on the relationship between the log probabilities of output tokens (i.e., the "likelihood of each token occurring in the sequence given the context. To simplify, a logprob is log(p), where p = probability of a token occurring at a specific position based on the previous tokens in the context.") and the token probabilities we already return from mlx_lm.utils.generate_step.
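
For reference, this is roughly the shape of the logprobs block that an OpenAI-style completions response carries and that lm-evaluation-harness consumes (a sketch based on the public completions API; the tokens and numbers below are invented for illustration):

```python
# Illustrative only: approximate structure of one choice in an OpenAI-style
# completions response with logprobs enabled. Field names follow the public
# completions API; the tokens and values are made up.
choice = {
    "text": " Paris",
    "finish_reason": "stop",
    "logprobs": {
        "tokens": [" Paris"],
        "token_logprobs": [-0.12],                        # log(p) for each generated token
        "top_logprobs": [{" Paris": -0.12, " Lyon": -3.4}],
        "text_offset": [24],
    },
}
```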

@awni
Member

awni commented May 22, 2024

The probability that gets returned is the probability of the given token at that time step. To get the log probability you would just take its log: mx.log(p).
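
A minimal sketch of that conversion, assuming generate_step yields the sampled token together with its probability as described above (the exact yield signature may differ across mlx_lm versions):

```python
import mlx.core as mx

# p: probability of the sampled token at one generation step, as yielded
# by mlx_lm.utils.generate_step (per the comment above).
p = mx.array(0.42)       # placeholder value for illustration
logprob = mx.log(p)      # the logprob is simply log(p)
print(logprob.item())    # ≈ -0.868
```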

@awni
Member

awni commented May 22, 2024

> I would be happy to contribute a PR,

Thanks! It would be great to ease the path to evaluating models in MLX.

@chimezie
Contributor Author

Sounds good. I will contribute a PR for the server. Thanks for the input.
