
C# InferenceSession.Run crashes during execution #18910

Closed
cvixxt opened this issue Dec 22, 2023 · 2 comments
Labels
api:CSharp issues related to the C# API ep:CUDA issues related to the CUDA execution provider platform:windows issues related to the Windows platform

Comments


cvixxt commented Dec 22, 2023

Describe the issue

Hello everyone, I have a question.
When I call inferenceSession.Run(input), my program becomes unresponsive and crashes, and I can't find any error code to help me diagnose the problem. As shown in the screenshots below, the program exits immediately during execution without returning an error.
With the OnnxRuntime CPU package the same code works; only GPU mode fails.
Please help me.

[screenshots of the crash behavior]

My environment is

CUDA 11.8
cuDNN 8.5.0.96
OnnxRuntime.GPU 1.16.0
.NET 6.0

To reproduce

[screenshot of the reproduction code]

Urgency

This is fairly urgent.

Platform

Windows

OS Version

11

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.16.0 with GPU

ONNX Runtime API

C#

Architecture

X64

Execution Provider

CUDA

Execution Provider Library Version

CUDA 11.6, cuDNN 8.5.0.96

@github-actions github-actions bot added ep:CUDA issues related to the CUDA execution provider platform:windows issues related to the Windows platform labels Dec 22, 2023
@yf711 yf711 added the api:CSharp issues related to the C# API label Dec 22, 2023
@skottmckay
Copy link
Contributor

You could set the log severity level to ORT_LOGGING_LEVEL_VERBOSE using the SessionOptions class, then pass the session options to the InferenceSession constructor.

https://onnxruntime.ai/docs/api/csharp/api/Microsoft.ML.OnnxRuntime.SessionOptions.html
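A minimal sketch of the suggestion above, assuming the model path and CUDA device id (the `"model.onnx"` path and device `0` are placeholders, not from the original post):

```csharp
using Microsoft.ML.OnnxRuntime;

// Enable verbose ORT logging to see where the session dies.
using var options = new SessionOptions();
options.LogSeverityLevel = OrtLoggingLevel.ORT_LOGGING_LEVEL_VERBOSE;

// Register the CUDA execution provider on device 0 (placeholder).
options.AppendExecutionProvider_CUDA(0);

// Pass the configured options into the session constructor.
using var session = new InferenceSession("model.onnx", options);
```

With verbose logging enabled, ONNX Runtime prints provider registration and node placement details to the console, which usually narrows down a silent native crash.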

It's not clear what you mean by "my program disappears directly after execution and does not return any error code". Is 'execution' in that statement the call to InferenceSession.Run, or something more?

Have you looked at the system logs for any error info using EventViewer?

It's not clear how ExtractPixels is implemented, or whether there are potential issues with the lifetime of the input data. I'd try running with dummy data to rule out issues like that, e.g. something simple like a local DenseTensor variable filled with random data, just to test whether it still crashes.
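The dummy-data test above could look like the following sketch. The input name `"images"` and the 1x3x640x640 shape are assumptions; substitute your model's actual input name and shape (visible via `session.InputMetadata`):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

using var session = new InferenceSession("model.onnx"); // placeholder path

// Local DenseTensor filled with random data, bypassing ExtractPixels entirely.
var dummy = new DenseTensor<float>(new[] { 1, 3, 640, 640 }); // assumed shape
var rng = new Random(0);
for (int i = 0; i < dummy.Length; i++)
    dummy.SetValue(i, (float)rng.NextDouble());

var inputs = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("images", dummy) // assumed input name
};

// If this still crashes, the problem is in the session/EP, not the input pipeline.
using var results = session.Run(inputs);
```

If this runs cleanly on the GPU provider, the crash likely comes from the original input data's lifetime or layout rather than from ONNX Runtime itself.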

@cvixxt
Copy link
Author

cvixxt commented Dec 27, 2023

Hi @skottmckay,

I've resolved the issue and, as mentioned in issue #18051, Onnxruntime 1.15.1 does support 4080.

I tested the following:

GeForce RTX 4090
CUDA 11.8
cuDNN 8.9.4.25
TensorRT-8.6.1.6.Windows10.x86_64.cuda-11
onnxruntime-GPU 1.15.1

This setup works for me. After I reinstalled cuDNN 8.9.4.25, both onnxruntime-gpu 1.14.1 and 1.15.1 worked. Thank you for your reply. I'll close this issue.

@cvixxt cvixxt closed this as completed Dec 27, 2023