Hi, thanks for your great work. I noticed that in your scripts, you hard-coded the LoRA alpha to 128 and the rank r to 4, giving a scaling factor of alpha / r = 32:
PEViT/vision_benchmark/evaluation/lora_model.py
Lines 455 to 463 in be6fb43
Was there a principled justification for these choices? I'm wondering whether you did any tuning on these hyperparameters, and if so, what values you would recommend.
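For context, here is a minimal sketch of how alpha and r combine into the scaling factor in a standard LoRA layer (this is an illustrative reimplementation, not the actual code from `lora_model.py`; the class and parameter names are my own):

```python
import numpy as np

class LoRALinear:
    """Minimal LoRA linear layer: y = W x + (alpha / r) * B A x."""

    def __init__(self, in_dim, out_dim, r=4, alpha=128, seed=0):
        rng = np.random.default_rng(seed)
        # Frozen pretrained weight (random here, for illustration only).
        self.W = rng.standard_normal((out_dim, in_dim)) * 0.02
        # Trainable low-rank factors: A is randomly initialized,
        # B starts at zero so the update is a no-op before training.
        self.A = rng.standard_normal((r, in_dim)) * 0.01
        self.B = np.zeros((out_dim, r))
        # With alpha=128 and r=4 this is 32, matching the hard-coded setup.
        self.scaling = alpha / r

    def forward(self, x):
        # Base output plus the scaled low-rank update.
        return self.W @ x + self.scaling * (self.B @ (self.A @ x))
```

Since the update is scaled by alpha / r, changing r while keeping alpha fixed also changes the effective learning rate of the adapter, which is why these two values are usually tuned together.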