At train.py:100, `loss.backward()` is called inside the sub_iter loop, but at train.py:68 `optimizer.zero_grad()` is applied outside the loop.
I am not sure whether this is intended; to my knowledge the gradients should normally be zeroed between backward passes, otherwise they accumulate across sub-iterations. Either `optimizer.zero_grad()` should be moved inside the sub_iter loop, or the per-iteration `loss.backward()` should be replaced with a single `batch_loss.backward()` on the accumulated loss outside the loop.
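For reference, here is a minimal sketch (using a hypothetical model and data, not the actual code in train.py) contrasting the current placement with the `batch_loss` variant suggested above. Note that both options accumulate the same total gradient before `optimizer.step()`, since the gradient of a summed loss equals the sum of the per-sub-batch gradients, so the current placement may be deliberate gradient accumulation rather than a bug:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the real model, optimizer, and sub-batches.
model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()
sub_batches = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(3)]

# Option A (current placement): zero once per outer step, backward per sub-iteration.
# Gradients add up across the sub_iter loop; each graph is freed after its backward().
optimizer.zero_grad()
for x, y in sub_batches:
    loss = criterion(model(x), y)
    loss.backward()
optimizer.step()

# Option B (suggested alternative): accumulate the loss, one backward() outside the loop.
# Produces the same gradients as Option A, but keeps all intermediate graphs in memory
# until the single backward() call.
optimizer.zero_grad()
batch_loss = sum(criterion(model(x), y) for x, y in sub_batches)
batch_loss.backward()
optimizer.step()
```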