[Discussion][Good First Issue]: Verify different LLMs work with text_generation #259
Comments
This should close #271. I have updated the model list and added tests with reference to #259. --------- Signed-off-by: Lim, Kuan Xian <[email protected]>
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
#WLB
Most of the files linked here are not found; it would be great if anyone could update them, thanks!
Hello @kuanxian1, thanks for pointing our attention to it. @Wovchena I think the links are on
I updated this issue description. |
Context
This is an effort to increase test coverage for Large Language Models in OpenVINO GenAI.
Working on this task will let you familiarize yourself with:
If you would like to add a new model for which there is no task, please let us know! We would love to hear outside ideas.
What needs to be done?
Verify the model's repo contains a LICENSE file or specifies a license tag. For example, https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0/tree/main doesn't have the file, but has the tag License: apache-2.0. Apache 2.0 and MIT licenses are compatible with OpenVINO GenAI's license. For any other license, please ask in the corresponding Good First Issue whether the given license is compatible. If the model's repo doesn't have a license, contact the authors and ask them to add it by starting a new discussion in the model's repo.
Follow Option 2 - setting paths to built OpenVINO GenAI artifacts manually -
instead of the cmake option, because not every sample is installed yet.
If comparison against transformers fails, it's still worth adding a test for the model, just without the comparison against Python; leave a comment in the workflow code if this is the case. Since default runners are available to everyone, you can verify that the test passes by opening a PR in your own fork first.
Add the model to the nightly_models
list in openvino.genai/tests/python_tests/ov_genai_test_utils.py (line 22 at commit c86fd77). Add this change to your pull request (PR).
Example Pull Requests
Example commit: bf4c200#diff-2c8a6fc2893aa2e1103985c1ee763cc325de6042ea66a11ae30428d77e73e416
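The comparison step from the checklist above can be sketched in pure Python as follows. Note that `hf_generate` and `genai_generate` here are hypothetical stand-ins for the real transformers and openvino_genai pipeline calls, not the project's actual test code; a real test would build both pipelines from the converted model and compare their greedy-decoded outputs:

```python
# Sketch of the "compare against transformers" step from the checklist above.
# Both generate functions are stubs: in a real test, hf_generate would wrap a
# transformers model.generate() call and genai_generate an OpenVINO GenAI
# pipeline, both using greedy decoding so outputs are deterministic.

def hf_generate(prompt: str) -> str:
    # Stand-in for the reference (transformers) continuation.
    return prompt + " -> reference continuation"

def genai_generate(prompt: str) -> str:
    # Stand-in for the OpenVINO GenAI pipeline's continuation.
    return prompt + " -> reference continuation"

def outputs_match(prompt: str) -> bool:
    """Greedy decoding is deterministic, so outputs should match exactly."""
    return hf_generate(prompt) == genai_generate(prompt)

if __name__ == "__main__":
    for prompt in ["Why is the Sun yellow?", "What is OpenVINO?"]:
        assert outputs_match(prompt), f"Mismatch for prompt: {prompt!r}"
    print("All prompts matched")
```

If the real comparison fails for a model, the checklist suggests keeping the GenAI test but dropping the reference comparison, with a comment in the workflow code explaining why.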
Resources
List of tasks in the effort
Contact points
@Wovchena
Ticket
134074