No GPU support for predictor #141
Comments
Yes, there is currently no GPU support when the trained model is deployed for inference on Rafiki. Currently, it is expected that models can always fall back to using CPU. Is there a workaround for your model? @nudles, should we support using GPU for inference as well?
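(For reference, the CPU-fallback pattern described above usually looks something like the sketch below. This assumes a PyTorch model; `load_for_inference` and its arguments are hypothetical and are not Rafiki's actual model API.)

```python
# Minimal sketch of a CPU fallback for inference, assuming PyTorch.
# load_for_inference is a hypothetical helper, not part of Rafiki's API.
import torch

def load_for_inference(model: torch.nn.Module, checkpoint_path: str):
    # Use the GPU when one is visible to this worker, otherwise fall back to CPU
    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
    state = torch.load(checkpoint_path, map_location=device)
    model.load_state_dict(state)
    model.to(device)
    model.eval()
    return model, device
```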
Yes. It would be good to support GPU inference.
The implementation should be easy, similar to that for training.
Shouldn't be that difficult. Will add it as a task.
Hi @vivansxu, we've recently added this functionality on a branch.
Hi @nginyc, thank you so much for adding GPU support! I just tried to run the inference job, and it works well! By the way, since I want to return a list of strings as the prediction, I changed line 64 of rafiki/predictor/ensemble.py.
I think Iterable is more general than list.
Hi @nudles, actually, if we just check for Iterable here, there is a problem: a string object is itself always iterable, so the check recurses forever and raises RecursionError: maximum recursion depth exceeded. For example, running for x in ('abc') iterates over the single characters 'a', 'b' and 'c', and each of those one-character strings is again an iterable.
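(A minimal sketch of the pitfall described above; the `walk` function here is a hypothetical stand-in, not the actual code in rafiki/predictor/ensemble.py.)

```python
from collections.abc import Iterable

# ('abc') is just the string 'abc' (the parentheses do not make a tuple),
# and iterating over a string yields its characters:
for x in ('abc'):
    print(x)          # prints 'a', then 'b', then 'c'

# Each character is itself a string and therefore also Iterable, so a naive
# recursive walk over Iterable never bottoms out on string inputs:
def walk(value):
    if isinstance(value, Iterable):
        return [walk(v) for v in value]
    return value

# walk('abc')  # RecursionError: maximum recursion depth exceeded

# Special-casing str (or checking for list, as in the change above) avoids this:
def walk_safe(value):
    if isinstance(value, Iterable) and not isinstance(value, str):
        return [walk_safe(v) for v in value]
    return value
```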
May I confirm that there is currently no GPU support for the predictor service?
If so, how do I implement prediction for models that have to use GPUs?
Is it possible to add GPU support for the predictor to Rafiki?
Also, I notice the documentation says that a model must be able to train and evaluate with only CPUs if there is no GPU hardware available. Since my model only supports a GPU environment, do I have to follow this requirement?
Thank you.
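(A minimal sketch of the kind of guard a GPU-only model could use so that it fails fast with a clear message on CPU-only workers, rather than crashing later. This assumes PyTorch; `require_gpu` is a hypothetical helper, not part of Rafiki's model API.)

```python
import torch

def require_gpu(model_name: str = 'my-gpu-only-model') -> torch.device:
    # Hypothetical helper: raise a clear error if no GPU is visible,
    # instead of silently attempting to run a GPU-only model on CPU.
    if not torch.cuda.is_available():
        raise RuntimeError(
            f'{model_name} requires a GPU, but none is available to this worker'
        )
    return torch.device('cuda')
```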