How does this inference engine support new models produced by plugin-style fine-tuning on private corpora, such as LoRA and P-tuning? #53

leoluopy opened this issue Jul 15, 2023 · 2 comments


@leoluopy

As the title says:
For large-model inference, how does this inference engine support new models produced by plugin-style fine-tuning on private corpora, such as LoRA and P-tuning?

Is there a guide?

@chenqy4933
Collaborator

If you fine-tune with LoRA, you should merge the adapter weights into the base model first and then run inference as normal. There are many documents about this in the open-source community; you can search for them.
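
For reference, below is a minimal sketch of the merge step, assuming the fine-tune was done with Hugging Face PEFT; the model name and adapter/output paths are placeholders, not paths from this project.

```python
# Sketch: fold LoRA adapter weights back into the base model so the result
# can be loaded by an inference engine like any ordinary checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_name = "path/to/base-model"      # placeholder: original checkpoint
lora_adapter_path = "path/to/lora-adapter"  # placeholder: PEFT LoRA output dir
merged_output_path = "path/to/merged-model" # placeholder: where to save the result

# Load the base model and attach the LoRA adapter on top of it.
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_name, torch_dtype=torch.float16
)
model = PeftModel.from_pretrained(base_model, lora_adapter_path)

# Merge the low-rank updates into the base weights and drop the adapter
# layers, leaving a plain model with the original architecture.
merged_model = model.merge_and_unload()

# Save the merged checkpoint (plus tokenizer) for conversion/quantization
# and inference with the engine's usual model-loading workflow.
merged_model.save_pretrained(merged_output_path)
AutoTokenizer.from_pretrained(base_model_name).save_pretrained(merged_output_path)
```

After merging, convert the saved checkpoint with whatever model-conversion tooling you already use for the base architecture; no LoRA-specific support is needed at inference time.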

@leoluopy
Author

@chenqy4933
Thanks for replying. Is there any guide for P-tuning (i.e., the prefix-adding family of methods)?
