monilouise changed the title from "LoRA fine-tuning CodeT5+ generating random final encoder outputs in inference time" to "LoRA fine-tuned CodeT5+ generating random final encoder outputs in inference time" on Jun 26, 2024.
Hi,

I'm developing a bug detection system, but I've noticed that the evaluation metrics are slightly random. The following line returns different values for the same saved model at inference time:

outputs = self.encoder.encoder(input_ids=input_ids, attention_mask=input_mask)

Here self.encoder.encoder is an instance of T5Stack.

Any clues about what could cause this?

Thanks in advance.
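One common cause of nondeterministic encoder outputs at inference time is that the model was never switched out of training mode, so dropout is still active on every forward pass. This is only a guess at the cause here, but it is easy to check. The sketch below uses a toy module with a dropout layer as a stand-in for the T5Stack encoder (it is not the actual CodeT5+ code): in training mode two identical inputs almost surely produce different outputs, while after calling .eval() the outputs are deterministic.

```python
import torch
import torch.nn as nn

# Toy stand-in for the T5Stack encoder: any module containing a Dropout
# layer shows the same behavior. Layer sizes are arbitrary.
encoder = nn.Sequential(nn.Linear(64, 64), nn.Dropout(p=0.5), nn.Linear(64, 64))

x = torch.randn(1, 64)

# Training mode (the default after construction): dropout is active,
# so repeated forward passes on the same input differ.
encoder.train()
out_a = encoder(x)
out_b = encoder(x)
print(torch.equal(out_a, out_b))  # almost surely False

# Evaluation mode: dropout is disabled and outputs become deterministic.
encoder.eval()
with torch.no_grad():
    out_c = encoder(x)
    out_d = encoder(x)
print(torch.equal(out_c, out_d))  # True
```

If calling model.eval() before inference fixes the discrepancy, the saved LoRA checkpoint was fine all along and the randomness came purely from dropout in the T5 layers.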