I quantized alpaca7b_plus with llama.cpp, but it looks like InferLLM does not support the resulting format:
llama_model_load_internal: format = ggjt v3 (latest)
main: seed = 1686927440
Assert ' 0 ' failed at file: /mnt/e/pyCode/InferLLM/src/graph/llama.cpp line 37: virtual void inferllm::LlamaGraph::load(std::shared_ptr<inferllm::InputFile>, inferllm::LlmParams&, std::shared_ptr<inferllm::Vocab>), extra message: unsupported model type.
Aborted
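For context, llama.cpp-family model files start with a 4-byte magic value, followed by a version field in the newer containers; `ggjt v3` in the log above is the most recent of these, and the assert fires because the loader does not recognize it. A quick way to check which container a given file uses is a sketch like the following (the magic constants are taken from llama.cpp; the helper name `detect_format` is hypothetical, not part of either project):

```python
import struct

# GGML-family container magics as defined in llama.cpp
# (assumption: values current as of the ggjt v3 era, mid-2023).
GGML_MAGICS = {
    0x67676D6C: "ggml",  # legacy, unversioned container
    0x67676D66: "ggmf",  # versioned container
    0x67676A74: "ggjt",  # mmap-friendly container, versions 1-3
}

def detect_format(path):
    """Return (container_name, version) for a GGML-family model file."""
    with open(path, "rb") as f:
        magic = struct.unpack("<I", f.read(4))[0]
        name = GGML_MAGICS.get(magic)
        if name is None:
            raise ValueError(f"unknown magic 0x{magic:08x}")
        # Legacy 'ggml' files carry no version field after the magic.
        version = None
        if name != "ggml":
            version = struct.unpack("<I", f.read(4))[0]
        return name, version
```

Running this on the quantized alpaca7b_plus file should report `('ggjt', 3)`, which tells you the mismatch is the container version rather than a corrupted file.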