RecurrentLayer bugfix: params still need backprop
jeffdonahue committed Mar 26, 2015
1 parent 80e9c41 commit d3ebf3e
Showing 1 changed file with 5 additions and 1 deletion.
src/caffe/layers/recurrent_layer.cpp (5 additions, 1 deletion)
@@ -208,8 +208,12 @@ template <typename Dtype>
 void RecurrentLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
     const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom) {
   CHECK(!propagate_down[1]) << "Cannot backpropagate to sequence indicators.";
-  if (!propagate_down[0] && !propagate_down[2]) { return; }

+  // TODO: skip backpropagation to inputs and parameters inside the unrolled
+  // net according to propagate_down[0] and propagate_down[2]. For now just
+  // backprop to inputs and parameters unconditionally, as either the inputs or
+  // the parameters do need backward (or Net would have set
+  // layer_needs_backward_[i] == false for this layer).
   unrolled_net_->Backward();
 }
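
The removed early return is the bug: it skipped Backward() whenever neither input blob needed gradients, but the layer's learnable parameters can still require gradients in that case. A minimal Python sketch of the decision logic (hypothetical names, not Caffe's actual API):

```python
def should_run_backward(propagate_down, params_need_backward):
    """Compare the buggy and fixed conditions for running Backward()."""
    # Buggy version: early-returns when no *input* needs gradients,
    # silently skipping parameter gradients as well.
    buggy = propagate_down[0] or propagate_down[2]
    # Fixed behavior: backward must also run when any parameter needs
    # gradients (Net only schedules this layer's backward pass at all
    # if the inputs or the parameters need it).
    fixed = buggy or params_need_backward
    return buggy, fixed

# Frozen inputs but learnable parameters: the buggy check skips backprop,
# the fixed behavior still computes parameter gradients.
buggy, fixed = should_run_backward([False, False, False], True)
```

Here `buggy` is False (parameter gradients would be lost) while `fixed` is True, which is why the commit drops the early return and backprops unconditionally.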

