This repository has been archived by the owner on May 12, 2023. It is now read-only.

Use Facebook LLaMa model with pyllamacpp? #59

Closed · Answered by VoxanyNet
VoxanyNet asked this question in Q&A
Sorry, I just found the problem. I was using the 30B model, which requires over 20 GB of memory. I have 32 GB on my system, but I believe there wasn't enough left over. The 13B model works for me.

Thanks for the response!
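For a rough sense of why the 30B model ran out of room, weight memory scales linearly with parameter count and bits per weight. A minimal sketch of the arithmetic (the bits-per-weight and overhead figures are illustrative assumptions, not pyllamacpp internals):

```python
def model_memory_gib(n_params_billion: float,
                     bits_per_weight: float,
                     overhead_gib: float = 2.0) -> float:
    """Rough resident-memory estimate for a LLaMA checkpoint.

    n_params_billion: model size in billions of parameters (e.g. 13 or 30)
    bits_per_weight:  e.g. 4 for 4-bit quantization, 16 for fp16
    overhead_gib:     assumed fixed budget for context/scratch buffers
    """
    weights_gib = n_params_billion * 1e9 * bits_per_weight / 8 / 2**30
    return weights_gib + overhead_gib

# Compare 13B and 30B under a 4-bit quantization assumption,
# and 30B unquantized at fp16.
print(f"13B @ 4-bit: ~{model_memory_gib(13, 4):.1f} GiB")
print(f"30B @ 4-bit: ~{model_memory_gib(30, 4):.1f} GiB")
print(f"30B @ fp16:  ~{model_memory_gib(30, 16):.1f} GiB")
```

With the OS, other processes, and the model's context buffers also competing for the same 32 GB, the 30B model leaves little headroom while 13B fits easily, which matches the behavior described above.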
