
[New Model]: Qwen2.5-VL #12486

Closed
1 task done
yumlevi opened this issue Jan 27, 2025 · 5 comments · Fixed by #12604 · May be fixed by HabanaAI/vllm-fork#870
Labels: new model (Requests to new models)

Comments

@yumlevi

yumlevi commented Jan 27, 2025

The model to consider.

new model family dropped

https://huggingface.co/collections/Qwen/qwen25-vl-6795ffac22b334a837c0f9a5

The closest model vllm already supports.

https://huggingface.co/collections/Qwen/qwen2-vl-66cee7455501d7126940800d

What's your difficulty of supporting the model you want?

No response

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
@yumlevi yumlevi added the new model Requests to new models label Jan 27, 2025
@ywang96 ywang96 self-assigned this Jan 28, 2025
@hmellor hmellor marked this as a duplicate of #12502 Jan 28, 2025
@hmellor hmellor mentioned this issue Jan 28, 2025
@DarkLight1337
Member

@fyabc is your team planning to open a PR for this?

@zifeitong
Contributor

zifeitong commented Jan 30, 2025

modular_transformers makes it a bit difficult to see the difference between Qwen2-VL and Qwen2.5-VL.

Here is a diff of transformers implementation with some reshuffling, modeling part only:
https://gist.github.com/zifeitong/1028fca68bbe4be45d2000c6dd16c157

The differences in processing, image_processing, and config are minimal.

@CuriousCat-7

So, will it be supported in the future?

@DarkLight1337
Member

It is already supported in vLLM, please update your vLLM version.
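For reference, a minimal offline-inference sketch with a recent vLLM release that includes Qwen2.5-VL support is shown below. The checkpoint name and image URL are placeholders, and the exact chat/multimodal API may vary slightly between vLLM versions:

```python
# Minimal sketch, assuming a recent vLLM with Qwen2.5-VL support and that the
# checkpoint below is available on the Hugging Face Hub.
from vllm import LLM, SamplingParams

llm = LLM(model="Qwen/Qwen2.5-VL-7B-Instruct", max_model_len=8192)

# LLM.chat accepts OpenAI-style multimodal messages for supported VLMs.
messages = [{
    "role": "user",
    "content": [
        {"type": "image_url",
         "image_url": {"url": "https://example.com/cat.png"}},  # placeholder image
        {"type": "text", "text": "Describe this image."},
    ],
}]

outputs = llm.chat(messages, SamplingParams(max_tokens=128))
print(outputs[0].outputs[0].text)
```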

@CuriousCat-7

> It is already supported in vLLM, please update your vLLM version.

I've updated to the latest vLLM (0.7.3), transformers, and so on, but I still get an error when I use the OpenAI API format; the details are here:
QwenLM/Qwen2.5-VL#887
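For context, a request in the OpenAI API format against a local vLLM server typically looks like the sketch below. The server address, model name, and image URL are placeholders and are not taken from the linked report:

```python
# Sketch of the OpenAI-compatible request path being discussed; assumes a vLLM
# server was started with something like `vllm serve Qwen/Qwen2.5-VL-7B-Instruct`
# and is listening on localhost:8000.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-VL-7B-Instruct",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What is in this image?"},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/demo.jpg"}},  # placeholder image
        ],
    }],
    max_tokens=128,
)
print(response.choices[0].message.content)
```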
