ReLU for CNN? #138

Open
mrgloom opened this issue May 10, 2015 · 1 comment

Comments


mrgloom commented May 10, 2015

Is it possible to replace the sigm activation function with ReLU in a CNN? I tried replacing sigm with relu in cnnff.m, but it doesn't work.

function X = relu(P)
    % Element-wise rectified linear unit: max(0, x).
    X = max(0, P);
end


I guess this also requires changes to the backprop derivatives?
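For reference, this is the derivative I would pair with it (relu_grad is my own name, not something the toolbox defines), assuming the backprop code evaluates the derivative from the layer's stored output, as it does for sigm:

function D = relu_grad(X)
    % Derivative of ReLU, computed from the stored activation
    % X = max(0, P): 1 where the unit was active, 0 elsewhere.
    D = double(X > 0);
end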


tambetm commented May 10, 2015

There is pull request #131 for adding ReLU to the regular feed-forward network (nnff() and nnbp()); maybe you can borrow some ideas from there. You will certainly need to add ReLU support to the backprop pass as well.
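To sketch the idea (untested, and the exact lines in cnnff.m/cnnbp.m may differ, so treat the variable names as assumptions): in the forward pass you swap sigm for relu, and in the backward pass you swap the sigmoid derivative a .* (1 - a) for the ReLU derivative:

% cnnff.m, convolution-layer activation (sketch):
net.layers{l}.a{j} = relu(z + net.layers{l}.b{j});   % was: sigm(z + ...)

% cnnbp.m, the matching delta: wherever the code multiplies by the
% sigmoid derivative a .* (1 - a), use the ReLU derivative instead:
deriv = double(net.layers{l}.a{j} > 0);              % was: a .* (1 - a)

One practical note: with unbounded ReLU activations you may also need smaller initial weights or a lower learning rate than the sigmoid defaults.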

My experience with the DeepLearnToolbox CNN code is that it is unbearably slow and rather limited; for example, it doesn't support fully connected layers at all. You may have better luck with MatConvNet, which seems to be quite full-featured, though admittedly more complex.
