Replies: 5 comments
-
Good question, but you need to look at the goals.

1st: create a cuDNN alternative. Today there isn't such a thing; even libdnn, tightly coupled with Caffe, isn't developed any more...

2nd: create an inference library with minimal dependencies. Today, if you want to do inference on a GPU, you need to bring in some specific toolkit like CUDA, clDNN or MIOpen that is vendor specific. MIOpen doesn't even support AMD's own hardware: ROCm/ROCm#887. The closest thing today is nGraph + PlaidML, but their approach is problematic as they rely on auto-magic and can't achieve good performance.

Finally, the deep learning framework is merely a side effect of the library, since I need to be able to test all operators.

Regarding Caffe: I wish it were still being developed... but it isn't. I contributed several patches in the past, but at this point nothing gets accepted or even reviewed, even something as basic as the cuDNN 8 support I submitted not long ago: BVLC/caffe#7000. The author has moved on to Caffe2, which is now part of PyTorch.

So as an end goal I want to be able to use PyTorch or TensorFlow with dlprimitives as a backend for OpenCL devices, rather than try to beat a dead horse (Caffe, Keras + PlaidML). However, I first need to prove that I have an actually working system that gives reasonable performance; that way it would be much easier to get into PyTorch/TF/MXNet or any other framework as a backend.
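To make the "dlprimitives as an OpenCL backend" idea concrete, here is a minimal sketch of what user code could look like if such an out-of-tree PyTorch backend were installed. The module name `pytorch_ocl` and the device string `"ocl:0"` are assumptions for illustration, not a confirmed API; the point is only that the existing framework drives OpenCL kernels instead of CUDA ones.

```python
import torch
import pytorch_ocl  # hypothetical out-of-tree module that registers an OpenCL device

# Assumed device name exposed by the backend; not a confirmed API.
device = torch.device("ocl:0")

# A tiny model; .to(device) copies the weights to the OpenCL device.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 3, padding=1),
    torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1),
    torch.nn.Flatten(),
    torch.nn.Linear(16, 10),
).to(device)

x = torch.randn(1, 3, 224, 224, device=device)
y = model(x)  # the forward pass would run through the backend's OpenCL kernels
print(y.shape)
```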
-
Adding a reference to the Keras/PlaidML issue: plaidml/plaidml#586
-
Hi, so you think nGraph + PlaidML is the best way to go right now to tackle incremental learning, or is there anything else worth looking into? If anyone can point me in the best direction, that'd be great! I'm sure it's very complicated, but I have to at least look into such a valid goal; any help from anyone sounds good to me!
-
nGraph is inference only.
What are you trying to achieve?
-
Sorry, does the first sentence of the first post not make sense? Add new things like car, bicycle, etc. on top of what is already learned.
-
Hi,
Have you considered making the learning incremental? Being able to teach it new things without starting from scratch.
From what I gather it's a technical challenge, but it's a sure way to secure a solid future; surely Caffe would still be going strong if it had this.
Thanks for any insights.
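For readers landing here, a minimal sketch of one common way to "add new classes on top of what is already learned" is plain fine-tuning: freeze a pretrained backbone and train only a new classifier head. This is not full continual learning (old classes are simply replaced here), and `torchvision.models.resnet18` is just an assumed example backbone.

```python
import torch
import torch.nn as nn
from torchvision import models

num_new_classes = 2  # e.g. "car" and "bicycle" added on top of old knowledge

# Pretrained backbone (downloads weights on first use).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze everything that was already learned.
for p in model.parameters():
    p.requires_grad = False

# Replace the classifier head; only this new layer will be trained.
model.fc = nn.Linear(model.fc.in_features, num_new_classes)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a fake batch.
x = torch.randn(8, 3, 224, 224)
y = torch.randint(0, num_new_classes, (8,))
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
print(float(loss))
```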