Hi @layumi, I noticed that in your paper the uncertainty is obtained by computing both MC-Dropout and the difference between two classifiers, but I cannot find the code for MC-Dropout. Could you please point me to it?
Hi @mitming
MC-Dropout means that we keep dropout enabled during the inference stage, so two forward passes on the same input yield two different predictions, and the difference between them serves as the uncertainty estimate.
In our case it is quite easy to implement: since we estimate the uncertainty during training, dropout is already enabled.
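In case it helps, here is a minimal PyTorch sketch of the idea (the function names `enable_mc_dropout` / `mc_dropout_uncertainty` and the L2 distance between the two predictions are my own illustrative choices, not the exact code from the repository):

```python
import torch
import torch.nn as nn

def enable_mc_dropout(model: nn.Module) -> None:
    """Keep dropout active at inference time while the rest of the
    model (e.g. BatchNorm) stays in eval mode."""
    model.eval()
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()

@torch.no_grad()
def mc_dropout_uncertainty(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Run two stochastic forward passes and use the discrepancy
    between the two softmax predictions as a per-sample uncertainty score."""
    enable_mc_dropout(model)
    p1 = torch.softmax(model(x), dim=1)
    p2 = torch.softmax(model(x), dim=1)
    # L2 distance between the two predictions (one possible choice of metric).
    return (p1 - p2).pow(2).sum(dim=1).sqrt()

# Hypothetical usage:
# model = MyClassifier()                    # any network containing nn.Dropout
# scores = mc_dropout_uncertainty(model, images)
```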