NMS Operator Output Different From Torchvision Implementation #21898

Open
K-prog opened this issue Aug 28, 2024 · 1 comment
Labels
stale (issues that have not been addressed in a while; categorized by a bot)

Comments

K-prog commented Aug 28, 2024

Describe the issue

I am trying to encapsulate the torchvision.ops.nms function in an ONNX model. The conversion and inference run without errors, but the output of the exported ONNX model differs from the torchvision implementation.

To reproduce

import torch
import torchvision
import onnxruntime
import numpy as np

class NMS(torch.nn.Module):
    def __init__(self, iou_threshold=0.45):
        super(NMS, self).__init__()
        self.iou_threshold = iou_threshold

    def forward(self, x):
        boxes = x[:, :4]
        scores = x[:, 4]
        keep = torchvision.ops.nms(boxes, scores, self.iou_threshold)
        return keep

# Test data
nms_x = torch.rand(50, 38)

# PyTorch model
nms_model = NMS()
torch_output = nms_model(nms_x)

# Export to ONNX
torch.onnx.export(nms_model, 
                  (nms_x,),
                  "nms.onnx",
                  opset_version=17, 
                  input_names=["input"],
                  output_names=["output"],
                  #dynamic_axes={'input': {0: 'batch'}, 'output': {0: 'batch'}}
                  )

# ONNX Runtime inference
session = onnxruntime.InferenceSession("nms.onnx")

onnx_output = session.run(None, {'input': nms_x.numpy()})[0]

print("PyTorch output shape:", torch_output.shape)
print("ONNX output shape:", onnx_output.shape)

[Screenshot of the printed output: the PyTorch and ONNX output shapes differ]
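One thing worth checking (not asserted here as the cause): torchvision.ops.nms documents that boxes must be in (x1, y1, x2, y2) format with 0 <= x1 < x2 and 0 <= y1 < y2, which torch.rand(50, 38) does not guarantee. Below is a minimal sketch that re-runs the comparison with well-formed boxes and compares the kept indices rather than only the shapes; make_valid_input is a hypothetical helper, not part of the original report, and the snippet reuses nms_model, session, and nms.onnx defined above.

import numpy as np
import torch

def make_valid_input(num_boxes=50, num_cols=38, seed=0):
    # Hypothetical helper: fill the first four columns with well-formed
    # (x1, y1, x2, y2) boxes so that x1 < x2 and y1 < y2, as torchvision.ops.nms
    # expects; the remaining columns (including the score column) stay random.
    torch.manual_seed(seed)
    x = torch.rand(num_boxes, num_cols)
    top_left = torch.rand(num_boxes, 2) * 100.0
    wh = torch.rand(num_boxes, 2) * 50.0 + 1.0  # strictly positive width/height
    x[:, 0:2] = top_left
    x[:, 2:4] = top_left + wh
    return x

valid_x = make_valid_input()

# Reuses the NMS module and the ONNX Runtime session from the snippet above;
# the exported model has a fixed (50, 38) input, which valid_x matches.
torch_keep = nms_model(valid_x)
onnx_keep = session.run(None, {"input": valid_x.numpy()})[0]

# Compare the actual kept indices, not only the shapes.
print("PyTorch indices:", torch_keep.numpy())
print("ONNX indices:   ", onnx_keep)
print("identical:", np.array_equal(torch_keep.numpy(), np.asarray(onnx_keep).reshape(-1)))

If the kept indices still diverge with well-formed boxes, that would point at the exported NonMaxSuppression path itself rather than at the random test data.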

Urgency

I am trying to create a custom model architecture that uses NMS internally. However, because of the difference in NMS behavior, my model's output is hugely different from that of the original PyTorch model, and I am stuck on deploying it via onnxruntime. This is pretty urgent.

Platform

Windows

OS Version

11

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.18.1

ONNX Runtime API

Python

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@github-actions github-actions bot added the stale label Sep 29, 2024