Share of embeddings #4

Open
antonio-mastropaolo opened this issue Mar 19, 2021 · 0 comments
Hi all,

I was wondering: what are the benefits of sharing the word-embedding and projection weights when training a BLM model?
Do you suggest using it as a default hyper-parameter when training the BLM model, or are we better off tuning it?

Thank you all :)
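For anyone landing here: "sharing the word and projection weights" usually means tying the input embedding matrix to the output (pre-softmax) projection, so the model learns one `(vocab_size, d_model)` matrix instead of two. A minimal PyTorch sketch of what that looks like (the class and flag names below are illustrative, not from this repo):

```python
import torch
import torch.nn as nn

class TiedLMHead(nn.Module):
    """Illustrative sketch of embedding/projection weight tying.

    `tie_weights` is a hypothetical flag, not necessarily the
    hyper-parameter name used by this project.
    """
    def __init__(self, vocab_size=100, d_model=32, tie_weights=True):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # nn.Linear(d_model, vocab_size) stores its weight as
        # (vocab_size, d_model), the same shape as the embedding table.
        self.proj = nn.Linear(d_model, vocab_size, bias=False)
        if tie_weights:
            # Both layers now point at the same parameter tensor,
            # so gradients from both paths update one matrix.
            self.proj.weight = self.embed.weight

    def forward(self, tokens):
        h = self.embed(tokens)   # (batch, seq, d_model)
        return self.proj(h)      # logits over the vocabulary

model = TiedLMHead(tie_weights=True)
assert model.proj.weight is model.embed.weight  # single shared tensor
```

The usual argument for tying is regularization plus a large cut in parameter count (one vocab-sized matrix instead of two), which is why many frameworks expose it as a default-on option; whether it helps a specific BLM setup is still an empirical question.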

@antonio-mastropaolo antonio-mastropaolo changed the title Embedding sharing Share of embeddings Mar 19, 2021