This repository has been archived by the owner on Mar 12, 2024. It is now read-only.
I think there are some errors in the posted code #619
Comments
This one really confuses me a lot, so I'd love for someone to answer this question.

I've actually posted this question before, but it wasn't answered.

Still no answer to my question.

Still no answer to my question today.

hahaha you are so cute
This is handled by the transformation/augmentation pipeline; there is a function (see Line 242 in 3af9fa8) that performs this step.
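To make the comment above concrete: in DETR-style pipelines the final dataset transform typically converts the augmented (x0, y0, x1, y1) boxes to (cx, cy, w, h) and scales them to [0, 1] by the image size, which is what the loss expects. The sketch below is illustrative (the function and variable names are my own, not the exact repository code):

```python
import torch

# Hedged sketch of the normalization step the comment refers to:
# xyxy boxes -> (cx, cy, w, h), then scaled to [0, 1] by image size.
def normalize_boxes(boxes_xyxy, img_w, img_h):
    x0, y0, x1, y1 = boxes_xyxy.unbind(-1)
    cxcywh = torch.stack(
        [(x0 + x1) / 2, (y0 + y1) / 2, x1 - x0, y1 - y0], dim=-1
    )
    scale = torch.tensor([img_w, img_h, img_w, img_h], dtype=torch.float32)
    return cxcywh / scale

b = torch.tensor([[10., 20., 30., 60.]])      # one xyxy box
print(normalize_boxes(b, img_w=100, img_h=100))  # ≈ [[0.2, 0.4, 0.2, 0.4]]
```

After this step, both predictions and targets are in normalized (cx, cy, w, h) form, so the loss can convert both back to xyxy for the GIoU computation.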
Similarly, in the DETR implementation in HuggingFace (which is somewhat copied from here), there is an additional step in the
Instructions To Reproduce the 🐛 Bug:
boxes[:, 2:] += boxes[:, :2]
boxes[:, 0::2].clamp_(min=0, max=w)
boxes[:, 1::2].clamp_(min=0, max=h)
we can easily see that the bounding-box annotations are in (x, y, x, y) format.
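The three lines above take COCO-style (x, y, width, height) boxes and turn them into corner coordinates clamped to the image. A self-contained illustration of the same operation (the wrapper function name and sample values are mine, for demonstration only):

```python
import torch

# COCO annotations store boxes as (x, y, width, height).
# The quoted snippet converts them in place to (x0, y0, x1, y1)
# and clamps the corners to the image bounds.
def coco_xywh_to_xyxy(boxes, w, h):
    boxes = boxes.clone().float()
    boxes[:, 2:] += boxes[:, :2]           # (x, y, w, h) -> (x0, y0, x1, y1)
    boxes[:, 0::2].clamp_(min=0, max=w)    # clamp x-coordinates to [0, w]
    boxes[:, 1::2].clamp_(min=0, max=h)    # clamp y-coordinates to [0, h]
    return boxes

b = torch.tensor([[10., 20., 30., 40.]])   # x=10, y=20, w=30, h=40
print(coco_xywh_to_xyxy(b, w=35, h=100))   # x1 = 40 is clamped to 35
```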
loss_giou = 1 - torch.diag(box_ops.generalized_box_iou(
box_ops.box_cxcywh_to_xyxy(src_boxes),
box_ops.box_cxcywh_to_xyxy(target_boxes)))
But here you use box_ops.box_cxcywh_to_xyxy(target_boxes), which implies the bounding-box annotations are in (cx, cy, w, h) format. This is a serious contradiction. Am I misinterpreting it, or is there a real problem with the code?
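For reference, the conversion the loss applies to both predictions and targets looks like this; it only makes sense if the target boxes have already been converted to (cx, cy, w, h) somewhere upstream, which is exactly what the reply about the transformation pipeline points out. A minimal sketch of the conversion formula as I understand it (not copied verbatim from the repository):

```python
import torch

# (cx, cy, w, h) -> (x0, y0, x1, y1): move from center/size to corners.
def box_cxcywh_to_xyxy(boxes):
    cx, cy, w, h = boxes.unbind(-1)
    return torch.stack(
        [cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2], dim=-1
    )

b = torch.tensor([[0.5, 0.5, 0.2, 0.4]])  # normalized center-format box
print(box_cxcywh_to_xyxy(b))              # ≈ [[0.4, 0.3, 0.6, 0.7]]
```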