Interestingly, we may be able to implement a C-optimized version of Pygmalion-6B in our project.
Does it require 12 GB of RAM or VRAM?
Just RAM, not VRAM, for CPU inference. It worked perfectly on 16 GB of RAM, but 12 GB should be sufficient (if it isn't, please tell me); still, I recommend 16 GB of RAM.
As for GPU inference (which will obviously be faster), I haven't done it yet, since I didn't have a GPU with enough VRAM to try it out.
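The 12 GB figure is consistent with simply holding the 6B weights in memory. A back-of-envelope sketch (the per-parameter byte sizes are assumptions for illustration, not taken from the project):

```python
# Rough RAM estimate for CPU inference of a 6B-parameter model.
# Assumption: weights held as fp16 (2 bytes/param); activations and
# scratch buffers add more on top, which is why 16 GB is the safer
# recommendation in practice.
def weight_ram_gib(n_params: float, bytes_per_param: float) -> float:
    """Memory needed just for the model weights, in GiB."""
    return n_params * bytes_per_param / 1024**3

print(f"fp16: {weight_ram_gib(6e9, 2):.1f} GiB")    # ~11.2 GiB -> fits in 12 GB
print(f"int4: {weight_ram_gib(6e9, 0.5):.1f} GiB")  # ~2.8 GiB with 4-bit quantization
```

Quantizing to 4 bits would shrink the weight footprint to under 3 GiB, which is why C inference projects often lean on quantization for low-RAM machines.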
What is missing to run it on a GPU, assuming you had a few 24 GB M40 GPUs lying around?
Nothing's missing; it's just a few lines of code, I just can't test it :D I'll do an update today then :b