
Is it not possible to use an already-quantized model for model conversion? #437

Open
shum-elli opened this issue Mar 17, 2024 · 1 comment
Comments

@shum-elli
I'm using the HuatuoII model. The original Huatuo II 6B-parameter version is based on baichuan2, and the baichuan2 model can be quantized directly. However, the officially quantized version apparently can't be converted and throws an error. The official int4 version is quite a bit smaller than what fastllm produces by quantizing directly. Is there any workaround?

@TylunasLi
Contributor

Currently, only a few models (chatglm-6b-int4, chatglm2-6b-int4) are supported for conversion via llm.from_hf().
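Given that pre-quantized checkpoints are not supported for this model family, the path that should work is to load the original (non-quantized) weights with transformers and let fastllm perform the quantization during conversion. A minimal sketch, assuming the fastllm Python bindings (`fastllm_pytools`) are built and installed; the model path is a hypothetical placeholder, and the exact `dtype` values accepted may vary by fastllm version:

```python
# Sketch: convert the original float weights and quantize with fastllm,
# instead of feeding it an already-quantized checkpoint (unsupported here).
from transformers import AutoModelForCausalLM, AutoTokenizer
from fastllm_pytools import llm  # fastllm's Python bindings

# Hypothetical path to the ORIGINAL (non-int4) Huatuo II / baichuan2 weights
path = "path/to/HuatuoGPT2-original"
tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(path, trust_remote_code=True).float()

# fastllm quantizes during conversion; dtype may be "float16", "int8", or "int4"
flm_model = llm.from_hf(model, tokenizer, dtype="int4")
flm_model.save("huatuo2-int4.flm")  # export so it can be loaded directly later
```

The resulting .flm file will likely be larger than the official int4 release, since fastllm's quantization scheme differs from the one used for the official checkpoint, but it avoids the conversion error entirely.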
