
RunAsync doesn't work unless Run precedes it #18767

Closed
vymao opened this issue Dec 9, 2023 · 3 comments
Labels
api: issues related to all other APIs: C, C++, Python, etc.
stale: issues that have not been addressed in a while; categorized by a bot

Comments


vymao commented Dec 9, 2023

Describe the issue

I'm encountering a strange bug when trying to run RunAsync. For a given session, I can't seem to run RunAsync by itself unless I call Run (i.e. synchronously) before it. I'm not sure why this is; between the two calls I don't change any values, and I give each function the same set of inputs (plus some extras for RunAsync, like a callback).

Does anyone know why this might be?

To reproduce

It is difficult to reproduce immediately, given the large amount of code I have. But if I run this:

        std::vector<float> input_tensor_values = data_queue.front();
        std::cout << "Input length: " << input_tensor_values.size() << std::endl;
        std::vector<Ort::Value> input_tensors = ValueVector(input_tensor_values, feature_extractor);
        std::vector<Ort::Value> ort_outputs;
        ort_outputs.emplace_back(Ort::Value{nullptr});

        atomic_wait.store(true);
        std::vector<Ort::Value> output_tensors = session->Run(Ort::RunOptions{nullptr}, input_names_arrays.data(), input_tensors.data(), input_names_arrays.size(), output_names_arrays.data(), output_names_arrays.size());
        // double-check the dimensions of the output tensors
        // NOTE: the number of output tensors is equal to the number of output nodes specified in the Run() call
        assert(output_tensors.size() == output_names.size() && output_tensors[0].IsTensor());
        
        std::cout << "Running async..." << std::endl;

        session->RunAsync(
            Ort::RunOptions{nullptr},
            input_names_arrays.data(),
            input_tensors.data(),
            input_names_arrays.size(),
            output_names_arrays.data(),
            output_values.data(),
            output_names_arrays.size(),
            mainRunCallback,
            &data_queue);

This runs without issue, and continuously in a streaming fashion with inputs supplied through data_queue. I can see the callback being called. However, if I omit the Run and just call RunAsync:

        std::vector<float> input_tensor_values = data_queue.front();
        std::cout << "Input length: " << input_tensor_values.size() << std::endl;
        std::vector<Ort::Value> input_tensors = ValueVector(input_tensor_values, feature_extractor);
        std::vector<Ort::Value> ort_outputs;
        ort_outputs.emplace_back(Ort::Value{nullptr});

        atomic_wait.store(true);
        std::cout << "Running async..." << std::endl;

        session->RunAsync(
            Ort::RunOptions{nullptr},
            input_names_arrays.data(),
            input_tensors.data(),
            input_names_arrays.size(),
            output_names_arrays.data(),
            output_values.data(),
            output_names_arrays.size(),
            mainRunCallback,
            &data_queue);

This crashes. I tried enclosing the run in a try/catch:

try {
    ... // Run the session
}
catch (const Ort::Exception &exception) {
    std::cout << "ERROR running model inference: " << exception.what() << std::endl;
    exit(-1);
}

But nothing is printed. The program simply crashes/exits with no error message whatsoever, and the callback is never called.

Urgency

Difficult to proceed

Platform

Mac

OS Version

13.3

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.16.2

ONNX Runtime API

C++

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

RandySheriffH (Contributor) commented Dec 11, 2023

How did you initialize "output_values"?
FYI, here's the unit test (UT):

Ort::Value output_values[1] = {Ort::Value{nullptr}};

@chenfucn added the api (issues related to all other APIs: C, C++, Python, etc.) label Dec 12, 2023

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@github-actions bot added the stale (issues that have not been addressed in a while; categorized by a bot) label Jan 13, 2024

This issue has been automatically closed due to inactivity. Please reactivate if further support is needed.

@github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) Feb 12, 2024