Additional parameters to mlx_lm lora? r, lora_alpha, lora_dropout, scale? #454
When playing with fine-tuning, I sometimes change from_linear in lora.py to experiment with these parameters. Should we add command line args for them?
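For reference, these are the standard LoRA hyperparameters. Below is a minimal sketch of the kind of linear-layer wrapper being hand-edited, with the four values exposed as arguments; the class, signature, and defaults are illustrative, not the exact code in lora.py:

```python
import math
import mlx.core as mx
import mlx.nn as nn


class LoRALinear(nn.Module):
    """Illustrative LoRA wrapper; not the exact class in lora.py."""

    @staticmethod
    def from_linear(linear, r=8, lora_alpha=16.0, lora_dropout=0.0, scale=None):
        # mlx Linear weights are (out_dims, in_dims).
        out_dims, in_dims = linear.weight.shape
        lora = LoRALinear(in_dims, out_dims, r, lora_alpha, lora_dropout, scale)
        lora.linear = linear
        return lora

    def __init__(self, in_dims, out_dims, r=8, lora_alpha=16.0,
                 lora_dropout=0.0, scale=None):
        super().__init__()
        self.linear = nn.Linear(in_dims, out_dims, bias=False)
        self.dropout = nn.Dropout(p=lora_dropout)
        # Conventional LoRA scaling is alpha / r unless overridden.
        self.scale = scale if scale is not None else lora_alpha / r
        # Low-rank factors: A gets a small random init, B starts at zero,
        # so the adapted layer matches the frozen layer at step 0.
        bound = 1.0 / math.sqrt(in_dims)
        self.lora_a = mx.random.uniform(low=-bound, high=bound, shape=(in_dims, r))
        self.lora_b = mx.zeros((r, out_dims))

    def __call__(self, x):
        y = self.linear(x)
        z = self.dropout(x) @ self.lora_a @ self.lora_b
        return y + self.scale * z
```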
Comments

Yes, that's been on our list to add for a while. Though, do you use …
I always use mlx_lm now; it's becoming more powerful with each release 💪
It would be great to add the new LR scheduler or even the optimizer, but the parameters would become too complex then.
Agreed, I think we may need to start using a YAML config.
#235 is dated. I can rebase it onto mlx-examples/main and update it (to support the parameters that have been added since I last worked on that PR) if there is interest. I have found it useful to pull parameters from YAML and override them with whatever is provided on the command line.
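As a sketch of that pattern (the config keys and flag names here are hypothetical, not necessarily what #235 implements): YAML supplies the defaults, and any flag given on the command line wins:

```python
import argparse
import yaml


def load_config(path):
    # Read training defaults from a YAML file (hypothetical schema).
    with open(path) as f:
        return yaml.safe_load(f)


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--config", default="lora_config.yaml")
    # Flags default to None so we can distinguish "not given" from "given".
    parser.add_argument("--lora-rank", type=int, default=None)
    parser.add_argument("--lora-alpha", type=float, default=None)
    parser.add_argument("--lora-dropout", type=float, default=None)
    parser.add_argument("--scale", type=float, default=None)
    args = parser.parse_args()

    config = load_config(args.config)
    # Command-line values override the YAML defaults when provided.
    for key in ("lora_rank", "lora_alpha", "lora_dropout", "scale"):
        value = getattr(args, key)
        if value is not None:
            config[key] = value
    print(config)


if __name__ == "__main__":
    main()
```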
I too migrated to mlx_lm.lora and ended up using a shell script generator.
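For example, a tiny generator like this can emit one invocation per point in a hyperparameter grid; the --lora-* flags shown are hypothetical, since mlx_lm.lora did not expose them at the time of this thread:

```python
import itertools

# Hypothetical hyperparameter grid for a LoRA sweep.
ranks = [4, 8, 16]
alphas = [16, 32]

for r, alpha in itertools.product(ranks, alphas):
    # Flag names are illustrative; substitute your model path for <model>.
    print(
        f"python -m mlx_lm.lora --train --model <model> "
        f"--lora-rank {r} --lora-alpha {alpha} "
        f"--adapter-file adapters_r{r}_a{alpha}.npz"
    )
```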
Yes indeed, safe to close, thank you!