Issues: Blaizzy/mlx-vlm

Batch Processing Feature
#40 opened Jun 11, 2024 by Blaizzy · Open · 6 comments

ChatUI improvements
#45 opened Jun 23, 2024 by Blaizzy · Open

Models to port to MLX-VLM
#39 opened Jun 11, 2024 by Blaizzy · Open · 41 comments
Issues list

Gemma 3 Chat UI Broken?
#243 opened Mar 12, 2025 by vessenes
Add FastAPI server [enhancement] [good first issue]
#241 opened Mar 12, 2025 by Blaizzy
Add support for Gemma 3?
#237 opened Mar 12, 2025 by alexgusevski
Support Phi-4-multimodal
#225 opened Mar 5, 2025 by kinfey
Running Siglip/Siglip2 on MLX?
#219 opened Feb 27, 2025 by maxlund
Negative padding [bug]
#214 opened Feb 23, 2025 by pavelgur
KeyError: 'image_token_index' [bug]
#213 opened Feb 23, 2025 by pavelgur
Add support for Ovis 2? [enhancement]
#212 opened Feb 22, 2025 by alexgusevski
When will Janus-Pro be supported?
#204 opened Feb 18, 2025 by fackweb
temp or temperature?
#203 opened Feb 7, 2025 by asmeurer
Add AutoModel support
#193 opened Jan 30, 2025 by not-lain
Error in FineTuning deepseek-vl-7b-chat-8bit [bug]
#187 opened Jan 27, 2025 by sachinraja13
Chat UI from readme fails
#186 opened Jan 22, 2025 by andimarafioti
Fine Tuning Llava-Next
#181 opened Jan 13, 2025 by sachinraja13