
Removal of multihead attention from activation #13

Open
dreamer2368 opened this issue Oct 23, 2024 · 0 comments
Labels: enhancement (New feature or request)

@dreamer2368 (Collaborator) commented:

Per @punkduckable, multihead attention should be implemented as a layer, not as an activation function. However, the current implementation simply treats multihead attention as an activation function, which also disrupts the overall structure of MultiLayerPerceptron.

Multihead attention should be removed from the activation functions and probably implemented as a derived class of the latent space.
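The underlying design point is that an activation function is typically a stateless, elementwise map, while multihead attention carries its own learnable parameters and mixes information across a sequence, so it belongs in the layer hierarchy rather than in the activation slot of an MLP. Below is a minimal sketch of what the proposed refactor might look like; the class names `LatentSpace` and `MultiheadLatentSpace` are hypothetical stand-ins, not names from this repository, and the sketch assumes a standard PyTorch codebase:

```python
import torch
import torch.nn as nn

class LatentSpace(nn.Module):
    """Hypothetical base class standing in for the repo's latent-space abstraction."""
    def __init__(self, latent_dim: int):
        super().__init__()
        self.latent_dim = latent_dim

class MultiheadLatentSpace(LatentSpace):
    """Multihead attention as a proper layer (derived class), not an activation."""
    def __init__(self, latent_dim: int, num_heads: int = 4):
        super().__init__(latent_dim)
        # nn.MultiheadAttention is a learned layer with its own parameters,
        # which is why it does not fit the stateless-activation slot of an MLP.
        self.attn = nn.MultiheadAttention(latent_dim, num_heads, batch_first=True)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # Self-attention over the latent sequence; the MLP itself keeps
        # plain elementwise activations (e.g. nn.Tanh) in its activation slot.
        out, _ = self.attn(z, z, z)
        return out

# Usage sketch:
z = torch.randn(8, 16, 32)       # (batch, sequence, latent_dim)
layer = MultiheadLatentSpace(32)
print(layer(z).shape)            # torch.Size([8, 16, 32])
```

With this composition, MultiLayerPerceptron would no longer need any special-casing for attention in its activation handling.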
