Is there any plan to release the weights for the JMP models finetuned on the various data subsets (e.g., MD22)? I see that it is possible to finetune from the provided pretrained checkpoints using this repo, but having the finetuned checkpoints available would save me from having to redo this myself.
I don't think these checkpoints were preserved (we had 72 different fine-tuning runs once you take all 5 seeds of MatBench into account), but I'm not 100% sure. @wood-b would have more info on this.
OK, thanks — please keep me posted. In the meantime, if I do end up finetuning the models myself, I'd like the option of finetuning JMP-S in addition to JMP-L. However, I currently only see a config for JMP-L finetuning (src/jmp/configs/finetune/jmp_l.py), and the same is true for the individual datasets (e.g., for MD22 there is a jmp_l_md22_config_ in src/jmp/configs/finetune/md22.py but nothing corresponding to JMP-S). Would it be possible to share the JMP-S configs as well?
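In the meantime, one workaround is to write a JMP-S variant of the config function yourself. Note this is only an illustrative sketch: everything below except the name `jmp_l_md22_config_` is hypothetical and not taken from the repo — the repo's actual config classes and fields will differ, so treat this purely as the shape of the pattern (a function that mutates a config object in place, following the trailing-underscore naming convention used by `jmp_l_md22_config_`).

```python
# Hypothetical sketch of a JMP-S MD22 config, mirroring the in-place
# mutation pattern of the repo's jmp_l_md22_config_. The FinetuneConfig
# class and its fields are invented for illustration only.
from dataclasses import dataclass


@dataclass
class FinetuneConfig:
    backbone: str = "jmp_l"  # which pretrained checkpoint to start from
    dataset: str = ""        # finetuning target, e.g. an MD22 molecule


def jmp_s_md22_config_(config: FinetuneConfig, molecule: str) -> None:
    # Same idea as jmp_l_md22_config_, but pointing at the smaller
    # JMP-S backbone instead of JMP-L.
    config.backbone = "jmp_s"
    config.dataset = f"md22/{molecule}"


config = FinetuneConfig()
jmp_s_md22_config_(config, "Ac-Ala3-NHMe")
print(config.backbone, config.dataset)
```

The real JMP-S config would also need the correct hyperparameters (layer counts, embedding sizes, learning rates) for the smaller backbone, which is exactly why having the official configs would help.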