Describe the bug
I am trying to run tabby but I get:
WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:98: llama-server <chat> exited with status code 1, args: `Command { std: "//tabby_x86_64-manylinux2014-cuda122/llama-server" "-m" "/home/mte90/.tabby/models/TabbyML/Mistral-7B/ggml/model-00001-of-00001.gguf" "--cont-batching" "--port" "30892" "-np" "1" "--log-disable" "--ctx-size" "4096" "-ngl" "9999" "--chat-template" "<s>{% for message in messages %}{% if (message[\'role\'] == \'user\') != (loop.index0 % 2 == 0) %}{{ raise_exception(\'Conversation roles must alternate user/assistant/user/assistant/...\') }}{% endif %}{% if message[\'role\'] == \'user\' %}{{ \'[INST] \' + message[\'content\'] + \' [/INST]\' }}{% elif message[\'role\'] == \'assistant\' %}{{ message[\'content\'] + \'</s> \' }}{% else %}{{ raise_exception(\'Only user and assistant roles are supported!\') }}{% endif %}{% endfor %}", kill_on_drop: true }`
Information about your version
0.21
Ideally, when this happens tabby should exit instead of repeatedly trying to restart llama-server with the same parameters, or at least produce a log output that makes it possible to investigate, because copy-pasting the command from that warning doesn't work as it is escaped.
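For reference, here is a manually de-escaped reconstruction of the command from the warning above (a sketch, not an official invocation: the binary path is copied verbatim from the log and may differ on your machine). Running it directly in a shell should surface llama-server's actual error message; you may also want to drop the `--log-disable` flag while debugging so it prints more output.

```bash
# De-escaped reconstruction of the command from the warning above (approximation).
set +H  # disable bash history expansion, since the chat template contains '!'

//tabby_x86_64-manylinux2014-cuda122/llama-server \
  -m /home/mte90/.tabby/models/TabbyML/Mistral-7B/ggml/model-00001-of-00001.gguf \
  --cont-batching \
  --port 30892 \
  -np 1 \
  --log-disable \
  --ctx-size 4096 \
  -ngl 9999 \
  --chat-template "<s>{% for message in messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% if message['role'] == 'user' %}{{ '[INST] ' + message['content'] + ' [/INST]' }}{% elif message['role'] == 'assistant' %}{{ message['content'] + '</s> ' }}{% else %}{{ raise_exception('Only user and assistant roles are supported!') }}{% endif %}{% endfor %}"
```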