Sorry for the dumb question: does today's 1.15.0 release bring support for the v3 model formats, such as Manticore? I believe it should, but I'm getting bad magic errors when attempting to load. I have updated the Python package (llama-cpp-python) and been able to load the model that way successfully, just not with LocalAI.
Answered by mudler, May 24, 2023
Replies: 1 comment, 1 reply
Please don't post any model whose license is not permissive. Manticore works here:
I've installed it with:

```
curl $LOCALAI/models/apply -H "Content-Type: application/json" -d '{
  "url": "github:go-skynet/model-gallery/manticore.yaml",
  "name": "gpt-3.5-turbo",
  "overrides": { "parameters": { "model": "Manticore-13B.ggmlv3.q5_1.bin" }, "f16": true },
  "files": [
    {
      "uri": "xxxxx/resolve/main/Manticore-13B.ggmlv3.q5_1.bin",
      "sha256": "7d2c76516bcfdedc0d6282e3c352e2423964989fc871e21b1922f0f1b8acc1db",
      "filename": "Manticore-13B.ggmlv3.q5_1.bin"
    }
  ]
}'
```
Answer selected by watcher60