I noticed that in slim.conv2d, if you pass a normalizer_fn, the documentation says the conv layer will have no bias.

This raises a question: since this model uses adaptive BN, and as far as I can see in the code the mixing weights are initialized to w0 = 1 and w1 = 0, batch normalization is effectively disabled at the start, so a conv bias would still matter in the forward pass. Even after w1 becomes nonzero, as long as w0 does not go to zero, the identity branch still carries the bias through in some way. So should I use a bias in the conv layer? I am reimplementing this in PyTorch, which has no direct equivalent of slim.conv2d.
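To make the question concrete, here is a minimal PyTorch sketch of the setup being described. The `AdaptiveBN2d` module and its `w0`/`w1` parameter names are my assumptions about what the referenced code does (a learnable blend of the identity and BatchNorm, initialized to w0 = 1, w1 = 0); it is not the repository's actual implementation. With that initialization the normalization layer starts as the identity, so a conv bias does pass straight through:

```python
import torch
import torch.nn as nn

class AdaptiveBN2d(nn.Module):
    """Hypothetical adaptive BN: a learnable mix of identity and BatchNorm.

    With w0 initialized to 1 and w1 to 0, this layer is initially the
    identity, so anything upstream (including a conv bias) passes through.
    """
    def __init__(self, channels):
        super().__init__()
        self.bn = nn.BatchNorm2d(channels)
        self.w0 = nn.Parameter(torch.ones(1))   # identity-branch weight
        self.w1 = nn.Parameter(torch.zeros(1))  # BN-branch weight

    def forward(self, x):
        return self.w0 * x + self.w1 * self.bn(x)

# Unlike the slim.conv2d + normalizer_fn convention, the conv keeps its
# bias here, because the identity branch (w0 * x) preserves it.
conv = nn.Conv2d(3, 8, kernel_size=3, padding=1, bias=True)
abn = AdaptiveBN2d(8)

x = torch.randn(2, 3, 16, 16)
out = abn(conv(x))
# At initialization (w0=1, w1=0) the adaptive BN is the identity,
# so out equals conv(x) exactly.
```

Note that once w1 is nonzero, the BN branch's learnable shift (beta) can absorb any constant offset, so the conv bias is only non-redundant through the identity branch; whether to keep it is exactly the trade-off being asked about.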