I have been trying to get Llama 2 models to function correctly. They start off OK, but then all of them go into a loop of repetitions or gibberish.
I haven't tried setting model_type: llama to something else; could it be that we need to set llama2 here instead?
model_type: llama
Is it possible to get any of the code LLMs to support this?
I tried with llama-2 and llama2, then read the ctransformers documentation and realized it's just llama.
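So the config line should stay as it already is. A minimal sketch of the relevant fragment, with the surrounding keys omitted:

```yaml
# ctransformers has no separate "llama2" model type;
# Llama 2 GGML models also use "llama".
model_type: llama
```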
The answer still gets into a loop when using Llama 2 models:
The telecom industry is not not not not not not not not not not not not not not not
Like that. I read somewhere that it could be related to RoPE, but I don't know how to set that!
Fixed it by implementing a prompt template!
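For anyone hitting the same repetition loop: a minimal sketch of the Llama 2 chat prompt template that fixed it for me. The helper function name and the default system prompt are my own choices, not part of any library:

```python
def build_llama2_prompt(user_message, system_prompt="You are a helpful assistant."):
    """Wrap a user message in the Llama 2 chat prompt template.

    Llama 2 chat models were fine-tuned on this [INST]/<<SYS>> format;
    sending a bare string instead often produces repetition loops like
    the "not not not ..." output above.
    """
    return (
        f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = build_llama2_prompt("What drives growth in the telecom industry?")
print(prompt)
```

Pass the resulting string to the model instead of the raw question.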