
Weights for finetuned models? #5

Open
jeevster opened this issue Nov 5, 2024 · 2 comments

Comments

jeevster commented Nov 5, 2024

Is there any plan to release the weights for the JMP models fine-tuned on the various data subsets (e.g. MD22)? I see that it is possible to fine-tune from the provided pretrained checkpoints using this repo, but having the fine-tuned checkpoints available would save me from having to redo that work myself.

nimashoghi (Collaborator) commented Nov 5, 2024

I don't think these checkpoints were preserved (we had 72 different fine-tuning runs once you take all 5 seeds of MatBench into account), but I'm not 100% sure. @wood-b would have more info on this.

jeevster (Author) commented Nov 6, 2024

Ok, thanks; please keep me posted. In the meantime, if I do end up fine-tuning the models myself, I'd like to have the option of fine-tuning JMP-S in addition to JMP-L. However, I currently only see a config for JMP-L fine-tuning (src/jmp/configs/finetune/jmp_l.py), and the same goes for the individual datasets (e.g. for MD22, there is a jmp_l_md22_config_ in src/jmp/configs/finetune/md22.py but nothing corresponding to JMP-S). Would it be possible to share the JMP-S configs as well?
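For anyone in a similar position before official JMP-S configs exist, one stopgap is to copy the JMP-L fine-tuning config and override only the backbone-specific fields. The sketch below is purely illustrative: the FinetuneConfig dataclass, its field names, and all of the JMP-S/JMP-L hyperparameter values are assumptions for this example, not the actual classes or numbers in src/jmp/configs/finetune/.

```python
from dataclasses import dataclass, replace


# Hypothetical fine-tuning config; field names and values are
# placeholders, NOT the real classes from src/jmp/configs/finetune/.
@dataclass(frozen=True)
class FinetuneConfig:
    backbone: str    # pretrained checkpoint to start from
    num_blocks: int  # message-passing blocks in the backbone
    emb_size: int    # atom embedding width
    lr: float        # peak learning rate
    dataset: str     # fine-tuning target, e.g. "md22"


# Start from the (assumed) JMP-L MD22 settings...
jmp_l_md22 = FinetuneConfig(
    backbone="jmp-l.pt", num_blocks=6, emb_size=1024, lr=8e-5, dataset="md22"
)

# ...and derive a JMP-S variant by overriding only the fields that
# describe the backbone; dataset- and optimizer-level settings carry over.
jmp_s_md22 = replace(jmp_l_md22, backbone="jmp-s.pt", num_blocks=4, emb_size=512)

print(jmp_s_md22)
```

The point of the pattern is that dataclasses.replace keeps the two configs in sync automatically: any shared setting (here, lr and dataset) only lives in one place, so a later fix to the JMP-L config propagates to the derived JMP-S one.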
