
LlmInputs - Updated API + generic framework + convert to vLLM format #486

Merged: 8 commits merged from llm-inputs-vllm-output into feature-genai-pa on Mar 5, 2024

Conversation

nv-braf (Contributor) commented Mar 5, 2024

I've updated the API, added a generic JSON framework to store the intermediate data, and added methods to convert to vLLM format.

Here's an example of HF input converted to vLLM output. Both role text strings (system/user) are concatenated to form the text input.

{
  "data": [
    {
      "text_input": [
        "You are an AI assistant. You will be given a task. You must generate a detailed and long answer.",
        "Generate an approximately fifteen-word sentence that describes all this data: Midsummer House eatType restaurant; Midsummer House food Chinese; Midsummer House priceRange moderate; Midsummer House customer rating 3 out of 5; Midsummer House near All Bar One"
      ],
      "stream": [
        true
      ]
    }
  ]
}
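The conversion step described above (the system and user prompt strings from an HF-style record carried into a vLLM entry's text_input array) can be sketched roughly as follows. The function and field names here are assumptions for illustration, not the PR's actual API:

```python
# Hypothetical sketch -- function and field names are assumed, not taken from the PR.
def hf_record_to_vllm_entry(record: dict) -> dict:
    """Carry the system and user prompt strings from an HF-style record
    into a vLLM-format entry, with streaming enabled."""
    return {
        "text_input": [record["system_prompt"], record["question"]],
        "stream": [True],
    }

# Wrap one converted entry in the top-level "data" list shown above.
vllm_data = {
    "data": [
        hf_record_to_vllm_entry(
            {
                "system_prompt": "You are an AI assistant.",
                "question": "Generate a sentence describing Midsummer House.",
            }
        )
    ]
}
```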


OPEN_ORCA = "openorca"
CNN_DAILY_MAIL = "cnn_dailymail"
DEFAULT_INPUT_DATA_JSON = "./llm_inputs.json"
Contributor:

I had to remove the "./" in my branch; you will see a slight merge conflict without that update.

Contributor Author:

Fixed to match, and I am now using this define inside the class.
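The define above might be consumed roughly like this when writing the intermediate data out. This is a minimal sketch, assuming a simple JSON dump; the writer function is hypothetical, not the PR's actual code:

```python
import json

DEFAULT_INPUT_DATA_JSON = "llm_inputs.json"  # "./" prefix dropped, per the review note


# Hypothetical helper -- not the PR's actual writer.
def write_input_data(data: dict, filename: str = DEFAULT_INPUT_DATA_JSON) -> None:
    """Serialize the intermediate input data to the default JSON file."""
    with open(filename, "w") as f:
        json.dump(data, f, indent=2)
```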

from enum import Enum

from genai_pa.exceptions import GenAiPAException
from requests import Response


class InputType(Enum):
Contributor:
nice!
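For context, a minimal sketch of how such an enum might be wired to the dataset constants; the member and helper names here are inferred from the constants earlier in the diff (OPEN_ORCA, CNN_DAILY_MAIL) and are assumptions, not the PR's actual implementation:

```python
from enum import Enum, auto

# Constants as shown earlier in the diff.
OPEN_ORCA = "openorca"
CNN_DAILY_MAIL = "cnn_dailymail"


# Sketch only: member names are guesses; the real InputType may differ.
class InputType(Enum):
    OPENORCA = auto()
    CNN_DAILY_MAIL = auto()


def dataset_name(input_type: InputType) -> str:
    """Map an InputType member to its dataset identifier string (hypothetical helper)."""
    return {
        InputType.OPENORCA: OPEN_ORCA,
        InputType.CNN_DAILY_MAIL: CNN_DAILY_MAIL,
    }[input_type]
```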

@nv-braf nv-braf merged commit 9444bb1 into feature-genai-pa Mar 5, 2024
3 checks passed
@nv-braf nv-braf deleted the llm-inputs-vllm-output branch March 5, 2024 20:57
debermudez pushed a commit that referenced this pull request Mar 12, 2024
…486)

* Initial API changes. Unit tests passing

* Creating and using generic format with openai

* Initial code to support vllm output

* Refactoring

* General cleanup + todos

* Fixing codeql

* Fix codeql issue

* Removing output_filename define
debermudez pushed a commit that referenced this pull request Mar 13, 2024
mc-nv pushed a commit that referenced this pull request Mar 13, 2024