Issues: Blaizzy/mlx-vlm

Issues list

LLaVa-NeXT needs tweaking for v4.47
#106 opened Oct 25, 2024 by jrp2014
gradio chat exception
#103 opened Oct 23, 2024 by faev999
Support for SDXL 3.5
#101 opened Oct 22, 2024 by bhupesh-sf
Support for Null Image input
#94 opened Oct 18, 2024 by dtoconnor
Nan loss when training Llama-3.2-vision [enhancement]
#84 opened Oct 13, 2024 by Blaizzy
Which specific models work with this framework? [enhancement]
#80 opened Oct 11, 2024 by jrp2014
Please add support for Molmo
#70 opened Oct 2, 2024 by xSNYPSx
ChatUI improvements [good first issue]
#45 opened Jun 23, 2024 by Blaizzy (3 tasks)
Batch Processing Feature [good first issue]
#40 opened Jun 11, 2024 by Blaizzy
Models to port to MLX-VLM [good first issue]
#39 opened Jun 11, 2024 by Blaizzy (10 of 26 tasks)