
Bias initialization #12

Open
JohnMBrandt opened this issue Oct 18, 2019 · 1 comment

Comments

@JohnMBrandt

The original focal loss paper initializes the bias of the final sigmoid conv layer like so:

init = tf.constant_initializer([-np.log(0.99/0.01)])

so that negative examples start training with near-zero loss and positive examples with very high loss. Did you experiment with this bias setting? I could not find good results with it when using the FTL.
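For concreteness, the bias value above comes from the prior-probability trick in the focal loss (RetinaNet) paper: it can be computed and sanity-checked with plain NumPy (the π = 0.01 prior matches the snippet; variable names here are illustrative):

```python
import numpy as np

# Prior-probability bias init from the focal loss (RetinaNet) paper:
# b = -log((1 - pi) / pi), chosen so that sigmoid(b) = pi at the start
# of training, making the rare positive class dominate the initial loss.
pi = 0.01  # assumed prior probability of a positive example

bias = -np.log((1 - pi) / pi)  # equivalent to -np.log(0.99 / 0.01)

# With this bias, the final sigmoid layer initially outputs pi everywhere.
initial_prob = 1.0 / (1.0 + np.exp(-bias))
```

So every sigmoid output starts near 0.01: negatives contribute almost nothing to the loss, while each positive contributes roughly -log(0.01).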

@nabsabraham
Owner

Hi John, I initialize all weights to follow a glorot_normal distribution but did not experiment with the original paper's init method. Is there a large difference in DSC when using that init versus not using it?
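For reference, glorot_normal (Xavier normal) draws zero-mean weights with standard deviation sqrt(2 / (fan_in + fan_out)); a minimal NumPy sketch of the sampling rule (function name and seed are illustrative, not from this repo):

```python
import numpy as np

def glorot_normal(fan_in, fan_out, rng=None):
    """Sample a (fan_in, fan_out) weight matrix with Glorot/Xavier normal init.

    Weights are zero-mean Gaussian with std = sqrt(2 / (fan_in + fan_out)),
    which keeps activation variance roughly constant across layers.
    """
    rng = rng or np.random.default_rng(0)  # fixed seed for reproducibility
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))
```

Note that glorot_normal only sets the kernel weights; the bias init John asks about is a separate, additive change to the final layer and could be combined with it.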
