I've read your paper, which is excellent work, but while reading your code I have a question about how the memory module is updated.
As I understand it, m_items in Train.py is the memory module, and its requires_grad is False. So calling backward on the separateness loss and compactness_loss should have no effect on the memory at all. However, the paper describes how these two losses influence the memory, and the reason for including them in the overall loss in Train.py is not explicitly stated. My questions are similar to #52.
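For context, here is a minimal sketch of how such losses are commonly computed against a fixed memory bank; the function and variable shapes are illustrative assumptions on my part, not the repository's actual code. The point is that even when the memory tensor has requires_grad=False, backward still produces gradients for the encoder that generated the query features:

```python
import torch
import torch.nn.functional as F

# Illustrative sketch (names/shapes are assumptions, not the repo's exact API).
# query:   (N, C) features from the encoder
# m_items: (M, C) memory slots with requires_grad == False

def compactness_loss(query, m_items):
    sim = torch.matmul(query, m_items.t())       # (N, M) similarity to each slot
    nearest = sim.argmax(dim=1)                  # index of the closest slot
    return F.mse_loss(query, m_items[nearest])   # pull queries toward their slot

def separateness_loss(query, m_items, margin=1.0):
    sim = torch.matmul(query, m_items.t())
    top2 = sim.topk(2, dim=1).indices            # nearest and second-nearest slots
    pos = m_items[top2[:, 0]]
    neg = m_items[top2[:, 1]]
    # triplet-style margin: keep the second-nearest slot away from the query
    return F.triplet_margin_loss(query, pos, neg, margin=margin)

# With m_items.requires_grad == False, loss.backward() still updates the
# encoder weights that produced `query`; only the memory tensor itself gets
# no gradient and would have to be refreshed by a separate update rule.
```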