
(Question) How to set up MOE? #73

Open
MiniPhantom opened this issue Jun 15, 2024 · 0 comments
Is mixture of experts limited to Mixtral, or can smaller models such as Qwen's MoE be run too? If so, how do I set it up? There are no visible options for it.

Anyway, thanks for your amazing work!
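For context, here is a minimal sketch of what loading a smaller Qwen MoE model can look like. This assumes the Hugging Face transformers library rather than this project's own loader (the issue doesn't name the framework), and the checkpoint `Qwen/Qwen1.5-MoE-A2.7B-Chat` is used purely for illustration:

```python
# Illustrative sketch only: assumes Hugging Face transformers + accelerate,
# not this repo's loader, and an example Qwen MoE checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen1.5-MoE-A2.7B-Chat"  # hypothetical choice for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick fp16/bf16 automatically when supported
    device_map="auto",    # spread the expert layers across available devices
)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```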
