LLM Inputs Library - MVP #481
Conversation
if model_name:
    pa_json["data"][0]["payload"][index]["model"] = model_name
if add_streaming:
    pa_json["data"][0]["payload"][index]["streaming"] = "true"
I told you wrong. It is supposed to be "stream", not "streaming":
https://platform.openai.com/docs/api-reference/chat/create#chat-create-stream
Fixed
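For reference, a minimal sketch of what the corrected block could look like, using the "stream" key from the linked OpenAI docs. Treating the value as a JSON boolean rather than the string "true" is an assumption about how the payload is forwarded to the endpoint:

if model_name:
    pa_json["data"][0]["payload"][index]["model"] = model_name
if add_streaming:
    # "stream" is the key name per the OpenAI chat/completions API;
    # a boolean value is assumed here instead of the string "true"
    pa_json["data"][0]["payload"][index]["stream"] = True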
DEFAULT_LENGTH = 100
MINIMUM_LENGTH = 1

EMPTY_JSON_IN_PA_FORMAT = {"data": [{"payload": []}]}
FYI, the "payload" key only applies to the OpenAI format.
Changed name to reflect this
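For illustration only, the rename might look something like this; the constant name below is hypothetical, not necessarily the one used in the PR:

DEFAULT_LENGTH = 100
MINIMUM_LENGTH = 1

# The "payload" wrapper is specific to the OpenAI-style request format,
# so the constant name calls that out explicitly (hypothetical name)
EMPTY_JSON_IN_OPENAI_PA_FORMAT = {"data": [{"payload": []}]}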
This looks fine for the HuggingFace->OpenAI format. There will need to be some refactoring in order to support the following:
* HuggingFace->Triton_Trtllm
* Synthetic->OpenAI
* Synthetic->Triton_Trtllm

We probably also need Triton_vllm. I think that's 99% the same as Triton_trtllm though.
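One possible shape for that refactor, sketched with an output-format enum and a small dispatch table. All names and field layouts below are illustrative assumptions, not the library's actual API; the HuggingFace-vs-synthetic input side would get a similar enum driving a separate loading step:

from enum import Enum, auto
from typing import Callable, Dict, List


class OutputFormat(Enum):
    OPENAI = auto()
    TRITON_TRTLLM = auto()
    TRITON_VLLM = auto()  # assumed to be nearly identical to TRITON_TRTLLM


def _to_openai(prompts: List[str]) -> Dict:
    # OpenAI-style rows carry a chat-completions request body under "payload"
    payload = [{"messages": [{"role": "user", "content": p}]} for p in prompts]
    return {"data": [{"payload": payload}]}


def _to_triton_trtllm(prompts: List[str]) -> Dict:
    # Triton-style rows use flat input tensors; the field name is illustrative
    return {"data": [{"text_input": [p]} for p in prompts]}


_CONVERTERS: Dict[OutputFormat, Callable[[List[str]], Dict]] = {
    OutputFormat.OPENAI: _to_openai,
    OutputFormat.TRITON_TRTLLM: _to_triton_trtllm,
    OutputFormat.TRITON_VLLM: _to_triton_trtllm,  # reuse until a real difference appears
}


def create_llm_inputs(prompts: List[str], output_format: OutputFormat) -> Dict:
    # Convert already-loaded prompts into the requested perf_analyzer input format
    return _CONVERTERS[output_format](prompts)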
Thanks for all the work here.
Think this is good to go in now.
* Initial code for LLmInputs methods. All unit tests passing.
* Ensure file is deleted and general cleanup
* Adding in missing parameter descriptions
* Removing unneeded import
* Fixing precommit issues
* Fixing codeQL issue
* Fixing more codeQL issues
* Removing datasets dependency
* Changes based on Elias' review
* Fixing precommit error
* Fixing names
Adds a library with a single method, create_openai_llm_inputs. Unit testing is in place to cover private methods, as well as an end-to-end test that checks both open_orca and cnn_dailymail.
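A rough sketch of how the new entry point might be called. Only the method name comes from the PR description, and model_name/add_streaming appear as variables in the reviewed snippet; the module path, class name, and remaining arguments here are assumptions:

# Hypothetical call site; names flagged below are assumptions
from llm_inputs import LlmInputs  # module and class names assumed

pa_json = LlmInputs.create_openai_llm_inputs(
    dataset_name="open_orca",  # the end-to-end test reportedly covers open_orca and cnn_dailymail
    model_name="my_model",     # variable seen in the reviewed snippet
    add_streaming=True,        # variable seen in the reviewed snippet
)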