
I want to add a system prompt when I run the Llama3 test on my computer. #7053

Open
scj0709 opened this issue Nov 25, 2024 · 1 comment
Labels: feature (A request for a proper, new feature.), llm: evaluation (Perplexity, accuracy)

Comments

scj0709 commented Nov 25, 2024

🚀 The feature, motivation and pitch

https://github.com/pytorch/executorch/tree/main/examples/models/llama
I successfully completed 'Step 3: Run on your computer to validate' from the URL above. One thing I would like to do is add a system prompt, but there is only an input for the user prompt. Is there a way to do that?

Alternatives

No response

Additional context

I successfully executed the command cmake-out/examples/models/llama/llama_main --model_path=<model.pte> --tokenizer_path=<tokenizer.model> --prompt=<prompt>, but I would like to add a separate system prompt!
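A possible workaround, not confirmed by the maintainers: Llama 3 Instruct models take the system prompt through the chat template's special tokens, so one might embed a system block directly in the --prompt string. This is only a sketch and assumes the runner passes --prompt through verbatim (i.e. it does not wrap the prompt in its own template) and that the tokenizer encodes the special tokens; the system and user texts below are placeholders.

```sh
# Sketch: embed a Llama 3 Instruct system block in the --prompt string.
# Assumes llama_main does not apply its own chat template.
PROMPT='<|begin_of_text|><|start_header_id|>system<|end_header_id|>

You are a helpful assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>

What is the capital of France?<|eot_id|><|start_header_id|>assistant<|end_header_id|>

'
# <model.pte> and <tokenizer.model> are placeholders for your exported model and tokenizer.
cmake-out/examples/models/llama/llama_main \
  --model_path=<model.pte> \
  --tokenizer_path=<tokenizer.model> \
  --prompt="$PROMPT"
```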

RFC (Optional)

No response

@JacobSzwejbka (Contributor) commented

Hmm, I definitely added this to torchchat. Do we have this in the ET folders, @larryliu0820?

JacobSzwejbka added the labels feature (A request for a proper, new feature.) and llm: evaluation (Perplexity, accuracy) on Dec 2, 2024