Replies: 1 comment · 1 reply
- Did you find out how to do this?
-
Hello,
I recently found this project and want to run LLaMA-33B-HF. I saw that the vLLM backend is needed for that.
I have the following file structure:
And this is the content of the LLaMA-33B-HF.yaml file:
But whenever I try to run the model, I get the following error:
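For reference, since the file structure, YAML contents, and error message are not included above, here is a minimal sketch of what such a model YAML might look like, assuming a LocalAI-style configuration that uses vLLM as the backend. The keys, model name, and path are illustrative assumptions, not the actual file from this post.

```yaml
# Minimal sketch only: assumes a LocalAI-style model definition with vLLM as the backend.
# All names and paths below are placeholders, not the poster's actual configuration.
name: LLaMA-33B-HF
backend: vllm
parameters:
  model: /models/LLaMA-33B-HF   # assumed local path to the HF-format weights
```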