
[Bugfix] Fix failing transformers dynamic module resolving with spawn multiproc method #13403

Merged: 5 commits merged into vllm-project:main from fix-transformers-tp on Feb 18, 2025

Conversation

@Isotr0py (Collaborator) commented on Feb 17, 2025:

Issue discussion on Slack: https://vllm-dev.slack.com/archives/C07R5Q1Q2BB/p1739776343893149?thread_ts=1739553140.299949&cid=C07R5Q1Q2BB

  • The transformers backend failed to load custom modules on the multiproc executor with VLLM_WORKER_MULTIPROC_METHOD=spawn, because a custom module was falsely detected as already loaded.
  • This PR optimizes the auto_map resolution to make sure all custom modules are initialized in every process (see the sketch below).
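
As a rough illustration of that approach (a minimal standalone sketch, not the exact PR code; resolve_custom_classes is a made-up helper name), each process can walk the config's auto_map and resolve every entry with transformers' get_class_from_dynamic_module, which imports the custom modules locally:

from transformers import AutoConfig
from transformers.dynamic_module_utils import get_class_from_dynamic_module

def resolve_custom_classes(model: str) -> dict:
    # Load the config in this process; trust_remote_code is required for
    # models that ship custom code.
    config = AutoConfig.from_pretrained(model, trust_remote_code=True)
    auto_map = getattr(config, "auto_map", None) or {}
    # Resolving each "AutoXxx" -> "module.ClassName" reference downloads
    # (if needed) and imports the custom module in the current process.
    return {
        name: get_class_from_dynamic_module(ref, model)
        for name, ref in auto_map.items()
    }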

Signed-off-by: Isotr0py <2037008807@qq.com>

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small and essential subset of tests to quickly catch errors. You can run the other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

Signed-off-by: Isotr0py <2037008807@qq.com>
@hmellor (Member) left a comment:

This is an elegant fix; I've just suggested a few modifications to reduce the level of nesting.

Signed-off-by: Isotr0py <2037008807@qq.com>
@hmellor (Member) left a comment:

LGTM; just add a comment explaining why we need to create all the classes in auto_map (your original one got lost in my suggested refactor).

Signed-off-by: Isotr0py <2037008807@qq.com>
Review thread on this hunk from the diff:

# executor.
auto_modules = {
    name: get_class_from_dynamic_module(module, model_config.model)
    for name, module in auto_map.items()
}
Contributor commented:

This implementation depends on the order of entries in auto_map (e.g. how they are iterated). It assumes that the Config class comes before the Modeling class that imports it. That will probably be true in most cases, but we could explicitly load AutoConfig first to be more robust to the ordering.

Another edge case would be if the custom code has relative imports for files that are not in the auto_map, but that seems uncommon...
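
A brief sketch of that suggestion (illustrative only; resolve_config_first is a hypothetical helper, not code from this PR): resolve AutoConfig explicitly before the remaining auto_map entries, so the result does not depend on dict ordering.

from transformers.dynamic_module_utils import get_class_from_dynamic_module

def resolve_config_first(auto_map: dict, model: str) -> dict:
    resolved = {}
    # Resolve the custom config first so that modeling classes which
    # import it find it already registered.
    if "AutoConfig" in auto_map:
        resolved["AutoConfig"] = get_class_from_dynamic_module(
            auto_map["AutoConfig"], model)
    for name, ref in auto_map.items():
        if name not in resolved:
            resolved[name] = get_class_from_dynamic_module(ref, model)
    return resolved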

Member replied:

"This implementation depends on the order of entries in auto_map"

Only in the multiprocessing case though, right? Before this PR we never loaded the config class at all (and it is imported by the model class).

@tjohnson31415 (Contributor) replied on Feb 17, 2025:

The config class is typically loaded early as part of the engine initialization when the ModelConfig is initialized (using transformers.AutoConfig). It is only in a spawned Worker that we lose the cached import and hit the case that this PR is trying to fix. In that error case, loading the modeling class before its dependencies is what raises the error. If the modeling class is first in auto_map.items(), then it will still fail after this PR.
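
A standalone illustration of the fork-vs-spawn difference behind this (not vLLM code; the stand-in module is arbitrary): modules imported only in the parent are visible in a forked child but absent in a spawned one, which is why a spawned worker has to re-resolve the custom classes itself.

import multiprocessing as mp
import sys

def report(method: str) -> None:
    # Check whether the module imported by the parent is visible here.
    print(method, "-> parent's import visible:", "xml.dom.minidom" in sys.modules)

if __name__ == "__main__":
    import xml.dom.minidom  # stand-in for a dynamically loaded custom module
    for method in ("fork", "spawn"):
        ctx = mp.get_context(method)
        p = ctx.Process(target=report, args=(method,))
        p.start()
        p.join()
    # Expected on Linux: fork -> True, spawn -> False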

@Isotr0py (Collaborator, author) replied:

I think we can sort auto_map by its keys to make sure it ends up in the correct order, like:

{
    "AutoConfig": "<your-repo-name>--<config-name>",
    "AutoModel": "<your-repo-name>--<config-name>",
    "AutoModelFor<Task>": "<your-repo-name>--<config-name>",
}
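
A minimal sketch of that idea (not the merged code): sorting auto_map by key happens to put "AutoConfig" before the "AutoModel*" and other Auto* entries lexicographically, so the custom config class is created before the modeling classes that import it.

from transformers.dynamic_module_utils import get_class_from_dynamic_module

def resolve_sorted(auto_map: dict, model: str) -> dict:
    # "AutoConfig" sorts first, so it is resolved before the other classes.
    return {
        name: get_class_from_dynamic_module(ref, model)
        for name, ref in sorted(auto_map.items())
    }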

Signed-off-by: Isotr0py <2037008807@qq.com>
@DarkLight1337 enabled auto-merge (squash) on February 18, 2025 at 06:03
@github-actions bot added the ready label (ONLY add when PR is ready to merge/full CI is needed) on Feb 18, 2025
@DarkLight1337 merged commit 8cf97f8 into vllm-project:main on Feb 18, 2025
52 of 56 checks passed
@Isotr0py deleted the fix-transformers-tp branch on February 18, 2025 at 10:26
panf2333 pushed a commit to yottalabsai/vllm that referenced this pull request Feb 18, 2025
… multiproc method (vllm-project#13403)

Signed-off-by: Isotr0py <2037008807@qq.com>
xjpang pushed a commit to xjpang/vllm that referenced this pull request Feb 20, 2025
… multiproc method (vllm-project#13403)

Signed-off-by: Isotr0py <2037008807@qq.com>
kerthcet pushed a commit to kerthcet/vllm that referenced this pull request Feb 21, 2025
… multiproc method (vllm-project#13403)

Signed-off-by: Isotr0py <2037008807@qq.com>
Akshat-Tripathi pushed a commit to krai/vllm that referenced this pull request Mar 3, 2025
… multiproc method (vllm-project#13403)

Signed-off-by: Isotr0py <2037008807@qq.com>
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Mar 5, 2025
… multiproc method (vllm-project#13403)

Signed-off-by: Isotr0py <2037008807@qq.com>
Signed-off-by: Linkun Chen <github@lkchen.net>
Said-Akbar pushed a commit to Said-Akbar/vllm-rocm that referenced this pull request Mar 7, 2025
… multiproc method (vllm-project#13403)

Signed-off-by: Isotr0py <2037008807@qq.com>
Signed-off-by: saeediy <saidakbarp@gmail.com>
Labels: ready (ONLY add when PR is ready to merge/full CI is needed)
4 participants