Hello, thanks for sharing this project. I have a question I'd like to ask.
As far as I know, there are two ways to implement pruning: (1) physically removing the pruned channels, so the model actually shrinks and speeds up; (2) keeping the full model and zeroing out the pruned channels with a mask.
If I've misunderstood, please correct me. If not, I have two questions: 1. You mentioned that this project prunes residual connections using a mask — is that the second approach I described above? 2. If so, what is the point of mask pruning? Isn't the goal of pruning to reduce model size and speed up inference?
For reference, one of your earlier replies: #30
The point of the mask model is that no actual pruning takes place; the model size stays the same. At inference time, the masked model is equivalent to the pruned model.
1. Yes. 2. The value shows up in complex architectures like ResNet and DenseNet: sometimes every channel of a layer gets pruned, yet the network must still run inference correctly. Handling that case with a mask is very simple to implement.
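To make the point above concrete, here is a minimal NumPy sketch (not this project's actual code) of mask-style pruning in a residual block. A plain matrix multiply stands in for a conv layer, and `mask` is the hypothetical per-channel binary mask: masked channels are zeroed rather than removed, so shapes and model size are unchanged, and even a layer whose channels are all "pruned" still produces valid output thanks to the residual add.

```python
import numpy as np

def masked_block(x, weight, mask):
    # Mask-style pruning: zero out pruned channels instead of removing them.
    # The output shape (and hence model size) is unchanged.
    out = weight @ x           # stand-in for a conv layer, shape (C, N)
    out = mask[:, None] * out  # masked channels contribute nothing
    return x + out             # residual add still works, even with an all-zero mask

C, N = 4, 3
rng = np.random.default_rng(0)
x = rng.standard_normal((C, N))
w = rng.standard_normal((C, C))

# Every channel of this layer "pruned": the block degenerates to the identity,
# yet inference still runs — this is the ResNet/DenseNet case described above.
assert np.allclose(masked_block(x, w, np.zeros(C)), x)
```

At inference time this is numerically equivalent to a physically pruned model, but no tensors shrink, which is why masking alone gives no size or speed benefit.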
Thanks for the reply! Understood. So if I want an actual speedup without using masks, the simplest option is to avoid residual structures, right?