Code to use segmentation model with torch-serve #25

plbenveniste opened this issue Jul 17, 2024 · 0 comments

Opening this issue to detail my investigation into how to use TorchServe for running inference with the MONAI-built model.

This work is in the context of the PACS-AI project.

The response I got:

As a first step, your model should be compatible with TorchServe or TensorFlow Serving. In PACS-AI, the TorchServe microservice loads models as .mar files (https://pytorch.org/serve/use_cases.html). If you can serve the model on your side following the TorchServe guide, integrating it into PACS-AI afterwards will be very easy (we will give detailed steps this summer)!
We also plan to support Docker images as an "endpoint" for running inference, as that could be necessary in cases where certain CLI tools are used! But the idea would remain the same: the input/output will need to follow a certain format.
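
To make the model compatible with TorchServe, a first step would be to export the trained MONAI network to TorchScript so it can be used as the serialized file inside a .mar archive. The sketch below is a minimal example only: the network hyperparameters, checkpoint path, and patch size are placeholders and will need to match the actual trained model.

```python
# export_model.py -- minimal sketch (not the final PACS-AI pipeline):
# rebuild the MONAI network, load the trained weights, and export to TorchScript.
import torch
from monai.networks.nets import UNet

# Hyperparameters below are placeholders; they must match the trained model.
model = UNet(
    spatial_dims=3,
    in_channels=1,
    out_channels=1,
    channels=(16, 32, 64, 128, 256),
    strides=(2, 2, 2, 2),
)
model.load_state_dict(torch.load("best_model.pth", map_location="cpu"))
model.eval()

# Trace with a dummy input matching the training patch size (assumed here).
example = torch.randn(1, 1, 64, 64, 64)
torch.jit.trace(model, example).save("segmentation_model.pt")
```

TorchServe also needs a handler that turns an incoming request into a tensor, runs the model, and serializes the prediction. A minimal custom handler could subclass `BaseHandler` as sketched below; the payload format (a NumPy array serialized with `np.save`) is an assumption, since the exact PACS-AI input/output format is still to be defined.

```python
# handler.py -- sketch of a custom TorchServe handler for the segmentation model.
import io

import numpy as np
import torch
from ts.torch_handler.base_handler import BaseHandler


class SegmentationHandler(BaseHandler):
    """Runs the TorchScript segmentation model on an incoming image."""

    def preprocess(self, data):
        # TorchServe passes a list of requests; each payload is assumed here
        # to be a NumPy array serialized with np.save (placeholder format).
        images = []
        for row in data:
            payload = row.get("data") or row.get("body")
            array = np.load(io.BytesIO(payload))
            images.append(torch.from_numpy(array).float())
        return torch.stack(images).to(self.device)

    def inference(self, batch, *args, **kwargs):
        with torch.no_grad():
            return self.model(batch)

    def postprocess(self, output):
        # Binarize the predicted probabilities and return one mask per request.
        masks = (torch.sigmoid(output) > 0.5).cpu().numpy()
        return [mask.tolist() for mask in masks]
```

The two files could then be bundled into a .mar archive with `torch-model-archiver --model-name segmentation --version 1.0 --serialized-file segmentation_model.pt --handler handler.py --export-path model_store` and served locally with `torchserve --start --model-store model_store --models segmentation=segmentation.mar`, following the guide linked above. Still to figure out: the exact input/output format expected by the PACS-AI TorchServe microservice.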

plbenveniste self-assigned this Jul 17, 2024