I greatly appreciate your work and clearly written code, which gives incredible insight into the backpropagation technique. I've encountered a small bug that is easy to fix, but I don't want to open a pull request because I'm not sure what the default values should be.
It's at layers.py:400 (the last parameter at the end of the line):
ML-From-Scratch/mlfromscratch/deep_learning/layers.py
Line 400 in a2806c6
The last parameter is supposed to be a string-style enum naming the padding type, but a literal 0 is passed where self.padding should be. PoolingLayer also defaults self.padding to 0, which triggers the same error. If 0 is meant to be an acceptable default, then that value should also be accepted by the receiving function determine_padding, which is where the error is raised:
ML-From-Scratch/mlfromscratch/deep_learning/layers.py
Line 718 in a2806c6
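One possible fix, sketched below, is to have determine_padding treat 0 as an alias for "valid" (no padding), so the existing default in PoolingLayer stops raising. This is a hypothetical sketch, not the repository's actual code; the "same"-padding arithmetic shown assumes stride 1:

```python
import math

def determine_padding(filter_shape, output_shape="same"):
    """Return ((pad_h1, pad_h2), (pad_w1, pad_w2)) for a conv/pooling layer.

    Hypothetical sketch: 0 (or "valid") means no padding; "same" means
    pad so that, at stride 1, the output size equals the input size.
    """
    # Accept 0 as an alias for "valid" so PoolingLayer's default works.
    if output_shape == "valid" or output_shape == 0:
        return (0, 0), (0, 0)
    elif output_shape == "same":
        filter_height, filter_width = filter_shape
        # Split (filter_size - 1) between the two sides, putting the
        # extra row/column on the bottom/right for even-sized filters.
        pad_h1 = int(math.floor((filter_height - 1) / 2))
        pad_h2 = int(math.ceil((filter_height - 1) / 2))
        pad_w1 = int(math.floor((filter_width - 1) / 2))
        pad_w2 = int(math.ceil((filter_width - 1) / 2))
        return (pad_h1, pad_h2), (pad_w1, pad_w2)
    raise ValueError("Unknown padding type: %r" % (output_shape,))
```

With that change, both `determine_padding((3, 3), "same")` and the pooling layer's `determine_padding((2, 2), 0)` would return well-formed padding tuples instead of raising.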
Again, thank you for this repository. Amazing work.