
Question: are there any examples of inference with Python models written with NumPy (instead of PyTorch models)? #296

Open
KexinFeng opened this issue Jan 13, 2023 · 3 comments

Comments

@KexinFeng

I notice in the introduction that

torch::deploy (MultiPy for non-PyTorch use cases) is a C++ library that enables you to run eager mode PyTorch models in production without any modifications to your model to support tracing.

Also, most of the examples provided use PyTorch. Are there any examples of inference with Python-written models (instead of PyTorch models)? For example, can I do inference here with XGBoost, LightGBM, or a simple decision tree written in Python with NumPy?
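For concreteness, here is a minimal sketch of the kind of NumPy-only model I mean: a hand-written decision stump with no PyTorch dependency. (The function name and parameters are purely illustrative, not part of any library.)

```python
import numpy as np

def predict_stump(X, feature=0, threshold=0.5, left=0.0, right=1.0):
    """A toy decision-stump 'model' written only with NumPy.

    X: (n_samples, n_features) array. Returns a (n_samples,) array of
    predictions. All names here are illustrative, not part of MultiPy.
    """
    return np.where(X[:, feature] <= threshold, left, right)

X = np.array([[0.2, 1.0],
              [0.9, 3.0]])
preds = predict_stump(X)  # first row goes left, second goes right
```

The question is whether pure-Python inference code like this can be packaged and run under MultiPy's interpreters.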

@KexinFeng KexinFeng changed the title Question: are there any examples of inference with Python-written models (instead of PyTorch models)? Question: are there any examples of inference with Python models written with NumPy (instead of PyTorch models)? Jan 13, 2023
@PaliC
Contributor

PaliC commented Jan 19, 2023

The main limitation to using non-PyTorch models is actually how we receive and send values to the interpreter. We use IValue, which is native to PyTorch. If you can get NumPy values to convert to and from IValue, you should be able to accomplish this.

You can also add a converter via our plugin registry https://github.com/pytorch/multipy/blob/main/multipy/runtime/interpreter/plugin_registry.h (and then add support for the plugin).

Eventually we do hope to add a more generic interface for the interpreters. However, due to staffing issues, this is a ways away :(

@PaliC PaliC closed this as completed Jan 19, 2023
@KexinFeng
Author

KexinFeng commented Jan 19, 2023

Thanks for the answer! There is still one thing I'm a little confused about. Intuitively, it seems that a Python script can still have a NumPy dependency. For example, here
https://github.com/pytorch/multipy/blob/main/README.md#packaging-a-model-for-multipyruntime
the NumPy packages installed on the system will be searched.

Does this mean that, as long as I have the NumPy package installed, the Python interpreters in MultiPy will load it at runtime and still be able to do multi-threaded inference?

Also, I found that NumPy is indeed somewhat supported by MultiPy. For example, it can be imported with the following code:
I.global("numpy", "random"), which is from

TEST(TorchpyTest, TestNumpy) {
  torch::deploy::InterpreterManager m(2);
  auto noArgs = at::ArrayRef<torch::deploy::Obj>();
  auto I = m.acquireOne();
  auto mat35 = I.global("numpy", "random").attr("rand")({3, 5});
  auto mat58 = I.global("numpy", "random").attr("rand")({5, 8});
  auto mat38 = I.global("numpy", "matmul")({mat35, mat58});
  EXPECT_EQ(2, mat38.attr("shape").attr("__len__")(noArgs).toIValue().toInt());
  EXPECT_EQ(3, mat38.attr("shape").attr("__getitem__")({0}).toIValue().toInt());
  EXPECT_EQ(8, mat38.attr("shape").attr("__getitem__")({1}).toIValue().toInt());
}
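For reference, the NumPy calls exercised by that C++ test correspond to this plain-Python snippet (a minimal sketch that only mirrors the shapes checked by the EXPECT_EQ lines):

```python
import numpy as np

# Multiply a 3x5 random matrix by a 5x8 random matrix and check the
# resulting shape, mirroring the TestNumpy C++ test above.
mat35 = np.random.rand(3, 5)
mat58 = np.random.rand(5, 8)
mat38 = np.matmul(mat35, mat58)

assert len(mat38.shape) == 2
assert mat38.shape[0] == 3
assert mat38.shape[1] == 8
```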

Given this, I'm wondering why we would still need to register a NumPy interface and

get numpy to convert to and from IValue

as you mentioned above.

@PaliC
Contributor

PaliC commented Jan 20, 2023

Sorry for closing this; here's the response I made to the re-ask of the question in #301:

Yup, this is exactly it. IValue isn't needed for the internals of the interpreter; we just use the type to interact with the interpreters. For NumPy we haven't done thorough testing, so we can't provide any guarantees. Though you're right that things should generally just work (IValue does cover a lot, haha, just not everything).

For the plugins/converters (the interface I think you're referring to), we currently use IValue as an intermediary to convert a PyObject into something usable in C++. For example, on line 501 you go from PyObject -> IValue -> int. However, eventually we'd like to create a custom converter to get more coverage.

Sorry, to be clearer: if IValue works for your use cases, feel free to use it. However, if there are objects which you can't get out of IValue, you'd want to write your own converter.
