Is there no support for the latest llama.cpp format? #38

Open
AceyKubbo opened this issue Jun 16, 2023 · 0 comments

Comments

@AceyKubbo
I tried an alpaca7b_plus model quantized with llama.cpp, and it looks like the format is not supported:

llama_model_load_internal: format     = ggjt v3 (latest)
main: seed = 1686927440
Assert ' 0 ' failed at file : /mnt/e/pyCode/InferLLM/src/graph/llama.cpp
line 37 : virtual void inferllm::LlamaGraph::load(std::shared_ptr<inferllm::InputFile>, inferllm::LlmParams&, std::shared_ptr<inferllm::Vocab>),
extra message: unsupported model type.
Aborted