
Which version of LLaMA is used here? #97

Open
Fuyu-xuan opened this issue Jun 21, 2024 · 4 comments

Comments

@Fuyu-xuan

Which version of LLaMA is used here, LLaMA 1 or LLaMA 2?

@Yukino821

It seems LLaMA 1 was used, but I couldn't find LLaMA 1 on Meta's Hugging Face page; only versions 2 and 3 are there.
I then tried llama2-7b together with the corresponding vicuna-7b-v1.5. The program runs, but when I run detection on the local web page it keeps showing "Error" and prints no logs. I'm not sure whether you got it running successfully; did you hit a similar problem?

@Fuyu-xuan
Author

Fuyu-xuan commented Nov 6, 2024 via email

@Yukino821

OK, thanks for your reply.

@Fruneng

Fruneng commented Nov 18, 2024

I replaced the LLaMA models with https://huggingface.co/baffo32/decapoda-research-llama-7B-hf, and it runs well.
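For anyone trying the same swap, here is a minimal sketch assuming the project loads LLaMA through the standard Hugging Face transformers API; the repo id comes from the link above, and everything else (class choice, prompt) is illustrative rather than taken from this project's code.

```python
# Minimal sketch: load the decapoda-research LLaMA-7B re-upload with transformers.
from transformers import LlamaForCausalLM, LlamaTokenizer

model_id = "baffo32/decapoda-research-llama-7B-hf"

# Using the Llama* classes explicitly, since older decapoda-research configs
# are known to trip up the Auto* tokenizer resolution.
tokenizer = LlamaTokenizer.from_pretrained(model_id)
model = LlamaForCausalLM.from_pretrained(model_id)

# Quick smoke test to confirm the checkpoint loaded correctly.
inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```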
