
forward() does not recognize own class attributes #7

Open

afonso-sousa opened this issue Nov 9, 2020 · 2 comments

@afonso-sousa
Hello. First of all, congratulations on your work. I would like to reproduce it, but I am facing a strange problem. When I try to use your DSQConv layer, I get the following error:
"torch.nn.modules.module.ModuleAttributeError: 'DSQConv' object has no attribute 'running_lw'"
Replacing register_buffer with a simple attribute assignment did not fix it. Can you reproduce the problem or help me in any way?
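For context, this is how registered buffers normally behave. The sketch below is not the authors' DSQConv code, just a minimal stand-in (class name, buffer defaults, and the clamp in forward() are illustrative): if `register_buffer("running_lw", ...)` actually runs in `__init__`, the attribute is visible in `forward()` and appears in the state_dict. The error in this thread therefore suggests the `__init__` that executed never reached those calls (e.g. a stale or truncated copy of the source file).

```python
import torch
import torch.nn as nn

class DSQConvSketch(nn.Conv2d):
    """Minimal stand-in for a quantized conv layer with running clip bounds."""
    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__(in_channels, out_channels, kernel_size)
        # Buffers are saved/loaded with the module's state_dict,
        # but are not trainable Parameters.
        self.register_buffer("running_lw", torch.tensor(-1.0))
        self.register_buffer("running_uw", torch.tensor(1.0))

    def forward(self, x):
        # forward() can only see attributes that __init__ actually created;
        # a ModuleAttributeError here means the register_buffer calls above
        # never ran in the __init__ that built this instance.
        lw, uw = self.running_lw, self.running_uw
        return super().forward(x.clamp(lw.item(), uw.item()))

layer = DSQConvSketch(3, 8, 3)
out = layer(torch.randn(1, 3, 16, 16))
print(sorted(name for name, _ in layer.named_buffers()))
```

If the attribute is still missing with code like this, the class definition being imported is not the one you think it is; checking `inspect.getsourcefile(DSQConv)` can confirm which file was loaded.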

I would also like to ask how I can store the models in an encoded way, to compare the storage savings of lower bit-width solutions.

Thank you in advance.
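On the storage question: one common way to measure the savings of low-bit quantization is to bit-pack the integer weight codes into a byte stream and compare its size against the float32 tensor. The helper below is a generic sketch (not part of this repository) for packing 1-, 2-, or 4-bit values:

```python
import numpy as np

def pack_low_bit(values, bits):
    """Pack small non-negative ints (< 2**bits) densely into bytes."""
    assert bits in (1, 2, 4)
    per_byte = 8 // bits
    values = np.asarray(values, dtype=np.uint8)
    pad = (-len(values)) % per_byte          # pad to a whole number of bytes
    values = np.concatenate([values, np.zeros(pad, dtype=np.uint8)])
    packed = np.zeros(len(values) // per_byte, dtype=np.uint8)
    for i in range(per_byte):
        packed |= values[i::per_byte] << (i * bits)
    return packed.tobytes()

def unpack_low_bit(blob, bits, count):
    """Inverse of pack_low_bit: recover the first `count` values."""
    per_byte = 8 // bits
    packed = np.frombuffer(blob, dtype=np.uint8)
    mask = (1 << bits) - 1
    out = np.zeros(len(packed) * per_byte, dtype=np.uint8)
    for i in range(per_byte):
        out[i::per_byte] = (packed >> (i * bits)) & mask
    return out[:count]

# 1000 2-bit weight codes: 250 bytes packed vs. 4000 bytes as float32.
codes = np.random.randint(0, 4, size=1000).astype(np.uint8)
blob = pack_low_bit(codes, bits=2)
print(len(blob), "bytes packed vs", 1000 * 4, "bytes as float32")
```

Comparing `len(blob)` per layer (plus a few floats for the quantizer's scale/offset) against `numel * 4` gives the storage ratio for a given bit width.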

@mkimhi commented Dec 12, 2021


Did you figure out a solution? I'm having the same issue.

@yyl-github-1896

Hi, I had the same problem as you, and I fixed it. The versioning of the "PyTransformer" repository seems to have a problem: if you clone the repository directly, the file "/PyTransformer/transformers/quantize.py" ends up with only 146 lines, whereas in the original repository it should have 217 lines. Download "PyTransformer" as a .zip archive instead and unzip it.
