This repository has been archived by the owner on May 12, 2023. It is now read-only.
[Question/Improvement] Add Save/Load binding from llama.cpp #56
Labels
enhancement
New feature or request
First, I want to say I really enjoy this binding: it works as expected, and it is useful and simple.
But I am missing one crucial feature: saving and loading the model state to a file. For example, I want to ask the agent for improvement suggestions and then use its suggestion as part of a new prompt. That means I need to save and load the model state rather than re-run everything from the start.
I've checked the llama.cpp repository; they had this same issue and closed it as solved.
I used ChatGPT to write save/load functions based on the comments there. Now I want to add them to llama.cpp, and then to this binding.
The question here would be: how would I add this functionality from llama.cpp to this binding?
P.S. Another small suggestion regarding the llama binding: the option to pass a callback function that stops generation, for example when ### Human / ### Instructions appears in the response.