
feat: Support sending additional outputs from vLLM inference #70

Merged: 15 commits into main on Nov 26, 2024

Conversation

@kthui (Contributor) commented Nov 2, 2024

What does the PR do?

Add support for sending additional outputs from vLLM. At this stage, the following three outputs are added:

  • finish reason
  • cumulative log probabilities
  • number of token ids
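
These three outputs correspond to fields on vLLM's `CompletionOutput` (`finish_reason`, `cumulative_logprob`, `token_ids`; see the link under Background). A minimal sketch of the mapping, using a stand-in dataclass rather than vLLM itself (the real class carries more fields):

```python
from dataclasses import dataclass
from typing import List, Optional

# Stand-in for vLLM's CompletionOutput (vllm/outputs.py), reduced to the
# fields relevant to this PR.
@dataclass
class CompletionOutput:
    text: str
    token_ids: List[int]
    cumulative_logprob: Optional[float] = None
    finish_reason: Optional[str] = None

def extract_additional_outputs(output: CompletionOutput) -> dict:
    """Derive the three additional outputs this PR exposes."""
    return {
        "finish_reason": output.finish_reason,
        "cumulative_logprob": output.cumulative_logprob,
        "num_token_ids": len(output.token_ids),
    }

out = CompletionOutput(
    text="Hello,", token_ids=[15496, 11],
    cumulative_logprob=-1.23, finish_reason="stop",
)
extras = extract_additional_outputs(out)
```

Note `num_token_ids` is derived (the length of `token_ids`) rather than copied from a vLLM field.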

Checklist

  • PR title reflects the change and is of format <commit_type>: <Title>
  • Changes are described in the pull request.
  • Related issues are referenced.
  • Populated GitHub labels field
  • Added test plan and verified test passes.
  • Verified that the PR passes existing CI.
  • Verified copyright is correct on all changed files.
  • Added succinct git squash message before merging.
  • All template sections are filled out.
  • Optional: Additional screenshots for behavior/output changes with before/after.

Commit Type:

Check the conventional commit type box here and add the label to the GitHub PR.

  • build
  • ci
  • docs
  • feat
  • fix
  • perf
  • refactor
  • revert
  • style
  • test

Related PRs:

N/A

Where should the reviewer start?

N/A

Test plan:

A new test is added with this PR that covers all combinations of enabling the 3 additional outputs and verifies that the outputs are valid for each combination.

  • CI Pipeline ID: 20796825
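
The combination sweep described above can be sketched as follows. The flag names here are illustrative placeholders, not necessarily the input tensor names used by the backend:

```python
from itertools import product

# Hypothetical per-output flag names, one per additional output.
FLAGS = [
    "return_finish_reason",
    "return_cumulative_logprob",
    "return_num_token_ids",
]

# 2^3 = 8 combinations of enabling/disabling each additional output.
combinations = [
    dict(zip(FLAGS, values)) for values in product([False, True], repeat=3)
]
```

Each of the 8 dictionaries would drive one request in the test, with the response checked against the flags that were set.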

Caveats:

N/A

Background

Outputs supported by vLLM in addition to text output: https://github.com/vllm-project/vllm/blob/v0.6.3.post1/vllm/outputs.py#L14-L40
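
The commit history below mentions using an individual input tensor to control each additional output. A minimal sketch of building such per-output boolean flag tensors with plain NumPy, assuming hypothetical tensor names (the actual names are defined by the backend's model config, not by this sketch):

```python
import numpy as np

# Hypothetical per-output flag tensor names; assumed for illustration only.
FLAG_NAMES = [
    "return_finish_reason",
    "return_cumulative_logprob",
    "return_num_token_ids",
]

def build_flag_tensors(**flags):
    """Build one boolean tensor of shape [1] per additional output.

    Unset flags default to False, i.e. the output is not requested.
    """
    unknown = set(flags) - set(FLAG_NAMES)
    if unknown:
        raise ValueError(f"unknown flags: {sorted(unknown)}")
    return {
        name: np.array([bool(flags.get(name, False))], dtype=np.bool_)
        for name in FLAG_NAMES
    }

tensors = build_flag_tensors(return_finish_reason=True)
```

In a real client, each of these arrays would be attached to the inference request as an optional input tensor alongside the text prompt.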

Related Issues: (use one of the action keywords Closes / Fixes / Resolves / Relates to)

N/A

kthui added 3 commits October 31, 2024 19:03
* [WIP] Add additional outputs to auto complete

* [WIP] Use individual input tensor to control per additional output

* [WIP] Parse additional output flags from request
@kthui force-pushed the jacky-vllm-additional-outputs branch from 264d387 to 9fc7d0b on November 2, 2024 00:21
@kthui self-assigned this on Nov 2, 2024
* Add additional outputs test

* Update copyright

* Some test enhancement and notes
@kthui force-pushed the jacky-vllm-additional-outputs branch from 9fc7d0b to 5e605ca on November 2, 2024 04:04
@kthui added the label PR: feat (A new feature) on Nov 4, 2024
@kthui changed the title from "Support sending additional outputs from vLLM inference" to "feat: Support sending additional outputs from vLLM inference" on Nov 4, 2024
@kthui marked this pull request as ready for review on November 4, 2024 22:53
@rmccorm4 previously approved these changes on Nov 7, 2024
@krishung5 previously approved these changes on Nov 7, 2024
@kthui dismissed stale reviews from krishung5 and rmccorm4 via 2b531dd on November 25, 2024 23:40
@kthui requested review from krishung5 and rmccorm4 on November 26, 2024 01:50
@kthui merged commit ceb5961 into main on Nov 26, 2024 (3 checks passed)
@kthui deleted the jacky-vllm-additional-outputs branch on November 26, 2024 03:41
Labels: PR: feat (A new feature)
3 participants