The original focal loss paper initializes the bias on the final sigmoid conv layer like so:

init = tf.constant_initializer([-np.log(0.99/0.01)])

so that negative examples start training with almost no loss and positive examples with very high loss. Did you experiment with this bias setting? I could not get good results with it for the focal Tversky loss (FTL).

Hi John, I initialize all weights with a glorot_normal distribution but did not experiment with the original paper's init method. Is there a large difference in DSC (Dice similarity coefficient) when using that init versus not using it?
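For reference, a minimal Keras sketch of how the two initializations discussed above fit together. The prior pi = 0.01, the input shape, and the layer sizes are illustrative assumptions, not taken from this repository:

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

pi = 0.01  # assumed foreground prior, as in the focal loss paper

inputs = layers.Input(shape=(128, 128, 1))  # hypothetical input size
x = layers.Conv2D(32, 3, padding="same", activation="relu",
                  kernel_initializer="glorot_normal")(inputs)
# Final sigmoid conv: bias = -log((1 - pi)/pi) ~= -4.6, so every pixel is
# initially predicted as foreground with probability ~pi, which keeps the
# loss contributed by the many easy negatives small at the start of training.
outputs = layers.Conv2D(1, 1, activation="sigmoid",
                        kernel_initializer="glorot_normal",
                        bias_initializer=tf.constant_initializer(-np.log((1 - pi) / pi)))(x)
model = models.Model(inputs, outputs)

With this bias the network's initial output is sigmoid(-log(0.99/0.01)) = 0.01, matching the prior, whereas a zero bias would start every pixel at 0.5.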