
[Core] Make raw_request optional in ServingCompletion #12503

Conversation

@schoennenbeck (Contributor) commented Jan 28, 2025:

This PR ports the logic already used in OpenAIServingChat over to OpenAIServingCompletion, making the raw_request argument optional there as well.

👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs: a small, essential subset of tests meant to catch errors quickly. You can run other CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can do one of these:

  • Add ready label to the PR
  • Enable auto-merge.

🚀

@mergify mergify bot added the frontend label Jan 28, 2025
Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
@schoennenbeck schoennenbeck force-pushed the feature/make-raw_request-optional branch from ee17c2f to e2a9613 Compare January 28, 2025 07:17
@DarkLight1337 (Member) commented:

Could you elaborate on the use case for this?

@schoennenbeck (Contributor, Author) replied:

> Could you elaborate on the use case for this?

  1. Feature parity between the different OpenAIServingX implementations.
  2. It is handy when you want to use OpenAIServingCompletion but, due to the nature of your application, do not have access to the raw_request. For example, if your ingress and your model run in separate Ray deployments, you can only send cloudpickle-serializable objects between the two, and the raw_request is not one of them.
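The pattern being ported can be sketched roughly as follows. This is a minimal, self-contained illustration of the optional-parameter idea, not vLLM's actual code; the `RawRequest` class and `serve_completion` function here are hypothetical stand-ins:

```python
from typing import Optional


class RawRequest:
    """Hypothetical stand-in for a framework HTTP request object."""

    def __init__(self, disconnected: bool = False):
        self._disconnected = disconnected

    def is_disconnected(self) -> bool:
        return self._disconnected


def serve_completion(prompt: str, raw_request: Optional[RawRequest] = None) -> str:
    """Sketch of a handler where raw_request is optional.

    When raw_request is None (e.g. the caller sits in a separate Ray
    deployment and cannot serialize the HTTP request), request-dependent
    checks such as client disconnection are simply skipped.
    """
    if raw_request is not None and raw_request.is_disconnected():
        return "aborted"
    # ... run the actual completion here ...
    return f"completion for: {prompt}"
```

Callers that hold the live HTTP request pass it through to keep disconnection handling; remote callers just omit it.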

@DarkLight1337 (Member) left a review comment:


Got it, thanks for adding and explaining this!

@DarkLight1337 DarkLight1337 added the ready ONLY add when PR is ready to merge/full CI is needed label Jan 28, 2025
@DarkLight1337 (Member) commented:

I'll enable auto-merge once you fix the typo.

Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
@DarkLight1337 DarkLight1337 enabled auto-merge (squash) January 28, 2025 08:47
@schoennenbeck (Contributor, Author) replied:

> I'll enable auto-merge once you fix the typo.

Thanks for the super quick turnaround!

@DarkLight1337 DarkLight1337 merged commit 2079e43 into vllm-project:main Jan 28, 2025
46 checks passed
rasmith pushed a commit to rasmith/vllm that referenced this pull request Jan 30, 2025
…2503)

Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
Isotr0py pushed a commit to Isotr0py/vllm that referenced this pull request Feb 2, 2025
…2503)

Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
Signed-off-by: Isotr0py <2037008807@qq.com>
NickLucche pushed a commit to NickLucche/vllm that referenced this pull request Feb 7, 2025
…2503)

Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
ShangmingCai pushed a commit to ShangmingCai/vllm that referenced this pull request Feb 10, 2025
…2503)

Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
GWS0428 pushed a commit to GWS0428/VARserve that referenced this pull request Feb 12, 2025
…2503)

Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
panf2333 pushed a commit to yottalabsai/vllm that referenced this pull request Feb 18, 2025
…2503)

Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
kerthcet pushed a commit to kerthcet/vllm that referenced this pull request Feb 21, 2025
…2503)

Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Mar 5, 2025
…2503)

Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
Signed-off-by: Linkun Chen <github@lkchen.net>
Said-Akbar pushed a commit to Said-Akbar/vllm-rocm that referenced this pull request Mar 7, 2025
…2503)

Signed-off-by: Sebastian Schönnenbeck <sebastian.schoennenbeck@comma-soft.com>
Signed-off-by: saeediy <saidakbarp@gmail.com>
Labels: frontend, ready (ONLY add when PR is ready to merge/full CI is needed)