[BOUNTY - $100] Add support for LLaVA #3
Comments
taking a crack at this
Pair programming? (Keep the bounty)
What’s ur telegram? Discord?
go for it guys. just leave a comment here if you need any help or run into issues :) @maciejjankowski @anirishrnbsinger
Didn't make as much progress as I would've hoped. I'll be backed up with work for a couple of days. Anyone else is welcome to take a crack before then.
Interesting stuff. Wanted to read more about this but the docs seem to be WIP. https://github.com/exo-explore/exo?tab=readme-ov-file#documentation
Ain't this as straightforward as using the code from the exo llama example and combining it with the mlx llava example?
I can do pair programming, planning to do it on the weekend of the 28th :) cc: @maciejjankowski
cc: @AlexCheema And when can we expect the docs?
attempting this..
@AlexCheema hey, it seems exo is not yet supported on Windows? Is there any plan to support it in the future?
if yall still need help, would love to get involved. (Also do not care abt bounty and u can keep it)
@khushgx looks like @varshith15 has started on it here: #88
im almost done sharding the model and removing duplicate code
is there anything I can contribute to? @varshith15
This is completed. Thanks @varshith15! Don’t forget to shoot me an email at alex@exolabs.net to claim your bounty payout.
@AlexCheema emailed you a couple of days ago, varshith15@gmail.com
Motivation: Vision models are useful.
What: exo supports different inference engines. Choose an inference engine (currently only MLX and tinygrad) for your implementation. You can probably find an existing LLaVA implementation somewhere (perhaps in https://github.com/ml-explore/mlx-examples for MLX). Modify that implementation to support sharded inference.
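To make "sharded inference" concrete: the core idea is splitting a model's transformer layers into contiguous ranges, with each device running only its own range. A minimal sketch of that partitioning logic is below; the names (`Shard`, `partition_layers`) are illustrative assumptions, not exo's actual API, and a real implementation would also route LLaVA's vision encoder to the first shard.

```python
from dataclasses import dataclass

@dataclass
class Shard:
    """A contiguous, inclusive range of transformer layers assigned to one node."""
    model_id: str
    start_layer: int  # inclusive
    end_layer: int    # inclusive
    n_layers: int     # total layers in the full model

    def is_first(self) -> bool:
        # First shard: runs embeddings (and, for LLaVA, the vision encoder)
        return self.start_layer == 0

    def is_last(self) -> bool:
        # Last shard: produces the final logits
        return self.end_layer == self.n_layers - 1

def partition_layers(model_id: str, n_layers: int, n_nodes: int) -> list[Shard]:
    """Split n_layers evenly across n_nodes, giving earlier nodes the remainder."""
    base, rem = divmod(n_layers, n_nodes)
    shards, start = [], 0
    for i in range(n_nodes):
        count = base + (1 if i < rem else 0)
        shards.append(Shard(model_id, start, start + count - 1, n_layers))
        start += count
    return shards

# Example: a hypothetical 32-layer LLaVA language model split across 3 devices.
shards = partition_layers("llava-1.5-7b", 32, 3)
for s in shards:
    print(s.start_layer, s.end_layer)
```

Each shard's forward pass takes the previous shard's hidden states as input, so modifying an existing LLaVA implementation mostly means making its layer loop respect `start_layer`/`end_layer`.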
Reward: $100 bounty, paid out in USDC on Ethereum; email alex@exolabs.net to claim.