Inference with pure C++

Thanks to the earlier work on Llama.cpp and Alpaca.cpp. Please note:

1. First, you need to merge your LoRA parameters with the original model and convert the result to ggml format for cpp inference:

bash prepare_llama_cpp.sh

(In our code, the HF model and the LoRA are first merged into consolidated.0x.pth, where x corresponds to num_shards, and then converted to ggml-model-f16.bin.)

After completing this step, the HF model and the LoRA can be merged. I would like to ask: if there are multiple LoRA models, can they all be merged, and how should that be done?
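For context on the question: merging a LoRA folds its low-rank update back into the frozen base weight, so each adapter contributes an additive delta. Because the deltas are additive, several adapters can in principle be folded in one after another (whether the combined result behaves well depends on how the adapters were trained). The sketch below illustrates only the arithmetic on toy numpy matrices; the dimensions and names are illustrative and are not this project's API (in practice one would merge with a library such as peft rather than by hand):

```python
import numpy as np

# Toy dimensions for illustration; real Llama weight matrices are far larger.
d_out, d_in, r = 6, 4, 2
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))   # frozen base weight

# Two hypothetical LoRA adapters (e.g. from two separate fine-tuning runs).
adapters = []
for _ in range(2):
    A = rng.standard_normal((r, d_in))   # LoRA "down" projection (r x d_in)
    B = rng.standard_normal((d_out, r))  # LoRA "up" projection (d_out x r)
    alpha = 16.0                         # LoRA scaling hyperparameter
    adapters.append((A, B, alpha / r))

# Merging a single adapter: W' = W + scaling * B @ A
A0, B0, s0 = adapters[0]
W_merged = W + s0 * (B0 @ A0)

# Merging multiple adapters sequentially: each one just adds its delta,
# so the order of merging does not change the final weight.
W_multi = W.copy()
for A, B, s in adapters:
    W_multi = W_multi + s * (B @ A)

# Equivalent: base weight plus the sum of all deltas.
W_sum = W + sum(s * (B @ A) for A, B, s in adapters)
assert np.allclose(W_multi, W_sum)
```

The merged matrix has the same shape as the original, which is why the merged model can then be saved as plain consolidated.*.pth shards and converted to ggml with no trace of the LoRA structure left.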