When fine-tuning MiniCPM3, can we follow the same fine-tuning procedure as MiniCPM-2B, or is the LLaMA-Factory approach you provided the only supported option?
I have the same question. Fine-tuning with LoRA raises an error:
ValueError: Target modules {'v_proj', 'q_proj'} not found in the base model. Please check the target modules and try again.
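This error means the LoRA config is targeting module names (`q_proj`, `v_proj`) that do not exist in MiniCPM3's architecture, whose attention layers are named differently than MiniCPM-2B's. One way to find valid `target_modules` values is to enumerate the model's linear submodules and pass those names to PEFT's `LoraConfig`. A minimal sketch (the `ToyAttention` class is a hypothetical stand-in for illustration, not MiniCPM3's real layer names):

```python
import torch.nn as nn

def lora_target_candidates(model: nn.Module) -> set[str]:
    """Collect the attribute names of all nn.Linear submodules.

    These short names (e.g. 'o_proj') are the form that PEFT's
    LoraConfig expects in target_modules; enumerating them avoids
    guessing against an unfamiliar architecture.
    """
    return {
        name.rsplit(".", 1)[-1]
        for name, module in model.named_modules()
        if isinstance(module, nn.Linear)
    }

# Hypothetical toy module standing in for one attention block;
# the real names must be read from the actual MiniCPM3 checkpoint.
class ToyAttention(nn.Module):
    def __init__(self):
        super().__init__()
        self.q_a_proj = nn.Linear(8, 8)
        self.kv_b_proj = nn.Linear(8, 8)
        self.o_proj = nn.Linear(8, 8)

print(sorted(lora_target_candidates(ToyAttention())))
```

With the real model you would load it via `AutoModelForCausalLM.from_pretrained(...)`, run the same enumeration, and set `LoraConfig(target_modules=...)` from the result instead of the `q_proj`/`v_proj` defaults.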
Hi, we are still working on adapting this part. Please use LLaMA-Factory for now.
@LDLINGLINGLING Roughly when do you expect the adaptation to be finished?