LlmInputs - Updated API + generic framework + convert to vLLM format #486
Conversation
OPEN_ORCA = "openorca"
CNN_DAILY_MAIL = "cnn_dailymail"
DEFAULT_INPUT_DATA_JSON = "./llm_inputs.json"
I had to remove the "./" in my branch; you will see a slight merge conflict without that update.
Fixed to match, and I am now using this define inside the class.
from genai_pa.exceptions import GenAiPAException
from requests import Response

class InputType(Enum):
nice!
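For context, one plausible reading of the two diff fragments above is that the dataset names become members of the new `InputType` enum, with the default output path kept as a module-level constant. This is a sketch only; the actual grouping of these names in the real code is not shown in the diff:

```python
from enum import Enum


class InputType(Enum):
    # One plausible reading of the diff: dataset sources as enum members.
    OPEN_ORCA = "openorca"
    CNN_DAILY_MAIL = "cnn_dailymail"


# Kept as a plain constant; the "./" prefix was dropped per the review above.
DEFAULT_INPUT_DATA_JSON = "llm_inputs.json"
```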
…486)
* Initial API changes. Unit tests passing
* Creating and using generic format with openai
* Initial code to support vllm output
* Refactoring
* General cleanup + todos
* Fixing codeql
* Fix codeql issue
* Removing output_filename define
I've updated the API, added a generic JSON framework to store the intermediate data, and added methods to convert it to the vLLM format.
Here's an example of converting HF input to vLLM output. Both role text strings (system/user) are concatenated to form the text input.
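The conversion described above can be sketched roughly as follows. Note this is a hypothetical illustration, not the PR's actual implementation: the field names (`rows`, `payload`, `role`, `text`, `prompt`, `data`) and the function name are assumptions, since the real intermediate schema is not shown in this thread:

```python
import json


def convert_generic_to_vllm(generic_dataset: dict) -> dict:
    """Concatenate the system/user role texts of each row into one prompt.

    Hypothetical sketch: the generic intermediate format is assumed to hold
    a list of rows, each with a payload of {role, text} turns.
    """
    vllm_rows = []
    for row in generic_dataset["rows"]:
        # Join all role texts (e.g. system + user) into a single input string
        prompt = " ".join(turn["text"] for turn in row["payload"])
        vllm_rows.append({"prompt": prompt})
    return {"data": vllm_rows}


if __name__ == "__main__":
    generic = {
        "rows": [
            {
                "payload": [
                    {"role": "system", "text": "You are a helpful assistant."},
                    {"role": "user", "text": "Summarize this article."},
                ]
            }
        ]
    }
    print(json.dumps(convert_generic_to_vllm(generic), indent=2))
```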