I'm using the HuatuoII model. The original Huatuo II 6B-parameter version is based on baichuan2, and baichuan2 models can be quantized directly. However, the officially quantized version apparently can no longer be converted — it throws an error. The official int4 version is also considerably smaller than the one fastllm produces by quantizing directly. Is there any workaround?
Currently, only a few models (chatglm-6b-int4, chatglm2-6b-int4) are supported for conversion via llm.from_hf().