Why does the object loss in YOLOX calculate the loss of all predictions? #1283
Replies: 1 comment 1 reply
I am also curious about something here, but I think I know the answer to your question. It makes sense that every prediction gets obj loss while only some predictions get cls and iou loss. `fg_masks` indicates which predictions have been matched to ground-truth boxes. cls and iou loss only make sense for a prediction that actually matches a box: you need a ground-truth class and box to compare against the predicted class and box. For obj loss, though, you want to penalize the model when it predicts an object somewhere there isn't one. So every prediction gets obj loss, based on how close its predicted objectness is to 1 (if there is a ground-truth box there) or 0 (if there isn't).

If my understanding is correct, though, I am still confused about one thing: why is the sum of the obj losses divided by `num_fg`? I believe `num_fg` is the number of predictions matched to ground-truth boxes. Dividing by `num_fg` makes sense for the iou and cls losses, since that amounts to averaging over all the losses actually computed. But if every prediction gets obj loss, not just the matched ones, why divide by `num_fg` instead of by the total number of predictions? Can anyone tell me what I'm missing?
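To make the masking and normalization concrete, here is a plain-Python sketch of the pattern described above. The toy numbers, the `bce` helper, and the variable names are illustrative only, not YOLOX's actual implementation (which uses PyTorch's `BCEWithLogitsLoss` on logits); the point is just that obj loss sums over *all* predictions while cls loss sums only over `fg_mask`, and both are divided by the same `num_fg`:

```python
import math

def bce(p, t):
    # binary cross-entropy for one predicted probability p against target t
    eps = 1e-7
    p = min(max(p, eps), 1 - eps)
    return -(t * math.log(p) + (1 - t) * math.log(1 - p))

# toy example: 6 predictions, 2 of which were matched to ground-truth boxes
obj_preds = [0.9, 0.1, 0.8, 0.2, 0.6, 0.05]        # predicted objectness
fg_mask   = [True, False, True, False, False, False]
cls_losses = [0.3, 0.0, 0.5, 0.0, 0.0, 0.0]        # only meaningful where fg

num_fg = sum(fg_mask)  # number of matched (foreground) predictions

# obj loss: EVERY prediction contributes; target is 1 for fg, 0 for background
loss_obj = sum(bce(p, 1.0 if m else 0.0)
               for p, m in zip(obj_preds, fg_mask)) / num_fg

# cls loss: ONLY fg predictions contribute, yet the same num_fg normalizer
loss_cls = sum(l for l, m in zip(cls_losses, fg_mask) if m) / num_fg
```

Note that dividing `loss_obj` by `num_fg` rather than by `len(obj_preds)` is exactly the choice being questioned here: it makes the obj term scale with the number of matched boxes instead of the (much larger, mostly-background) number of predictions.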
I'm reading the source code of YOLOX. Looking at the loss computation, the reg loss and cls loss are only calculated for predictions selected by a series of strategies, but the obj loss is different: it includes the loss of all predictions. I really can't understand this. Can anybody explain it to me? Thanks very much!
here is the code