There is pull request #131 for adding ReLU to the regular feed-forward network (nnff() and nnbp()). Maybe you can borrow some ideas from there. You will certainly need to add ReLU support to backprop as well; a sketch of the change follows.
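The change amounts to swapping the nonlinearity in the forward pass and its derivative in the backward pass. A minimal sketch, assuming `z` is a pre-activation and `a` the resulting activation (the actual variable names in your copy of nnff.m/nnbp.m may differ):

```matlab
% nnff.m -- forward pass: replace the sigmoid activation
% a = sigm(z);           % original
a = max(z, 0);           % ReLU

% nnbp.m -- backward pass: replace the sigmoid derivative
% d_act = a .* (1 - a);  % sigm derivative, written in terms of the activation a
d_act = double(a > 0);   % ReLU derivative: 1 where the unit was active, else 0
```

Note that for ReLU, `a > 0` exactly when `z > 0`, so the derivative can be written in terms of the stored activation, just as the sigmoid derivative is.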
My experience with the DeepLearnToolbox CNN code is that it is unbearably slow and rather limited; for example, it doesn't support fully-connected layers at all. You may have better luck with MatConvNet, which seems quite full-featured, though admittedly more complex.
Is it possible to replace the sigm activation function with ReLU in the CNN? I tried replacing sigm with relu in cnnff.m, but it doesn't work.
I guess this also requires changes to the backprop derivatives?
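Yes: cnnbp.m multiplies the incoming deltas by the sigmoid derivative expressed in terms of the activations, so that term has to change along with cnnff.m. A minimal sketch; the `net.layers{l}` field names follow the toolbox convention as I recall it, and `expanded_delta` is a hypothetical placeholder for the delta upsampled from the following subsampling layer, so check both against your copy:

```matlab
% cnnff.m -- swap the nonlinearity on each convolutional feature map
% net.layers{l}.a{j} = sigm(z + net.layers{l}.b{j});  % original
net.layers{l}.a{j} = max(z + net.layers{l}.b{j}, 0);  % ReLU

% cnnbp.m -- the layer delta is the upsampled delta from the next layer
% times the activation derivative; only that derivative term changes.
a = net.layers{l}.a{j};
% net.layers{l}.d{j} = a .* (1 - a) .* expanded_delta; % original (sigm)
net.layers{l}.d{j} = double(a > 0) .* expanded_delta;  % ReLU
```

If you keep a sigmoid on the output layer (which pairs naturally with the squared-error loss the toolbox uses), only the hidden-layer derivative needs changing.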