About loss function #16
Comments
Hi @lsj910128, have you solved this? I'm trying to do the same but cannot figure out how to do it. This is my prototxt file (name: "VGG_ILSVRC_16_layers"); note that I added an extra Python layer (BinaryMaskLayer) to convert the class values into binary values. It keeps the original RPN, RoI proposal, and RCNN sections and adds the mask branch: Conv-ReLU blocks 1–6 interleaved with Deconv layers 1–3. (Full layer definitions were collapsed in the original comment.)
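A minimal sketch of how such a conversion layer could be declared in the prototxt, wired just before the loss; the module name `binary_mask_layer` and the blob name `binary_mask_targets` are placeholders, not taken from an actual file:

```
# Hypothetical declaration of the extra Python layer: it takes the multi-class
# mask targets and emits a same-sized blob of 0/1 values for a sigmoid loss.
layer {
  name: "binary_mask"
  type: "Python"
  bottom: "mask_targets"
  top: "binary_mask_targets"
  python_param {
    module: "binary_mask_layer"   # assumed Python module on the PYTHONPATH
    layer: "BinaryMaskLayer"      # class name mentioned in the comment above
  }
}
```

The Python class itself (its setup/reshape/forward methods) is not shown here; it would simply map every non-background class index to 1.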
Hi!
I want to change the multinomial cross-entropy loss (based on softmax) of the affordance detection branch into a binary cross-entropy loss (based on sigmoid). How can I do this? I tried to change the `train.prototxt` file as follows:

```
layer {
name: "mask_score"
type: "Convolution"
bottom: "mask_deconv3" #
top: "mask_score"
param { lr_mult: 1.0 decay_mult: 1.0 }
param { lr_mult: 2.0 decay_mult: 0 }
convolution_param {
#num_output: 10 # 9 affordance classes + 1 background
#num_output: 1# output will be 1x1x14x14 --> for using SigmoidCrossEntropyLoss
num_output: 2# output will be 1x2x14x14 --> for using Softmax. Actually, binomial cross-entropy loss
#(sigmoid + cross entropy) = logistic regression = two classes softmax regression
kernel_size: 1 pad: 0
weight_filler {type: "gaussian" std: 0.01 } #weight_filler { type: "xavier" }
bias_filler { type: "constant" value: 0 }
}
}
layer {
name: "loss_mask"
type: "SoftmaxWithLoss"
#bottom: "mask_score_reshape"
bottom: "mask_score"
bottom: "mask_targets"
top: "loss_mask"
loss_weight: 3
loss_param {
ignore_label: -1
normalize: true
#normalize: false
}
propagate_down: true # backprop to prediction
propagate_down: false # don't backprop to labels
}`
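For reference, the equivalence mentioned in the comment on `num_output` is a standard identity, not something specific to this code. With a single logit $z$ and a binary target $y \in \{0,1\}$, the sigmoid cross-entropy loss is

$$
\ell_{\mathrm{sig}}(z, y) = -\,y\log\sigma(z) - (1-y)\log\bigl(1-\sigma(z)\bigr),
\qquad
\sigma(z) = \frac{1}{1+e^{-z}},
$$

and with two logits $z_1, z_0$ the softmax probability of class 1 is

$$
\frac{e^{z_1}}{e^{z_1} + e^{z_0}} = \sigma(z_1 - z_0),
$$

so a two-channel `SoftmaxWithLoss` and a one-channel `SigmoidCrossEntropyLoss` describe the same model, with $z = z_1 - z_0$.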
I also set base_lr = 1e-10 (a larger base_lr doesn't work). But the loss is very erratic: sometimes it is as large as 100 and sometimes as small as 6. I can't see any downward trend in the loss.
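In case it helps, a minimal sketch of the sigmoid variant of these two layers, assuming the mask targets have already been converted to a 0/1 blob of the same 1x1x14x14 shape as the score (for example by an extra layer such as the BinaryMaskLayer mentioned above; the blob name `binary_mask_targets` is a placeholder). Note that whether `SigmoidCrossEntropyLoss` supports `ignore_label` / `normalize` in `loss_param` depends on the Caffe version, so those lines are left out here.

```
layer {
  name: "mask_score"
  type: "Convolution"
  bottom: "mask_deconv3"
  top: "mask_score"
  param { lr_mult: 1.0 decay_mult: 1.0 }
  param { lr_mult: 2.0 decay_mult: 0 }
  convolution_param {
    num_output: 1   # single channel: per-pixel logit for the sigmoid
    kernel_size: 1 pad: 0
    weight_filler { type: "gaussian" std: 0.01 }
    bias_filler { type: "constant" value: 0 }
  }
}
layer {
  name: "loss_mask"
  type: "SigmoidCrossEntropyLoss"
  bottom: "mask_score"
  bottom: "binary_mask_targets"   # assumed 0/1 target blob, same shape as mask_score
  top: "loss_mask"
  loss_weight: 3
  propagate_down: true    # backprop to the prediction
  propagate_down: false   # don't backprop to the targets
}
```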