Same question. Could you share the best result you obtained with LLaMA 1 or LLaMA 2 fine-tuned on Vicuna, along with the hyperparameters (number of epochs, learning rate, lora_r, batch size)? I'd like to reproduce your result and use it as a baseline. Training loss, MMLU, or alpaca_eval numbers would all be appreciated!
In particular, may I ask what result this LoRA fine-tuning achieves on the MMLU task?
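For clarity, here is a minimal sketch of the four hyperparameters I mean, using Hugging Face PEFT and transformers. All values below are placeholders for illustration, not your actual settings:

```python
# Hypothetical sketch, not the repo's actual training script.
# Every value here is a placeholder; the question is what values you used.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, TrainingArguments

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

lora_config = LoraConfig(
    r=16,                                 # lora_r: rank of the low-rank update
    lora_alpha=32,                        # scaling factor for the LoRA update
    target_modules=["q_proj", "v_proj"],  # which projections get adapters
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

training_args = TrainingArguments(
    output_dir="out",
    num_train_epochs=3,             # number of epochs
    learning_rate=2e-4,             # learning rate
    per_device_train_batch_size=4,  # batch size
)
```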
Thanks!
Best,
Lucas