RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation #6
Comments
Also got the same problem.
Same problem.
Which PyTorch version do you use?
I got the same error with PyTorch 0.4.1.
Update: it worked for me when I used PyTorch 0.3.0; the error no longer came up.
What is the checkpoint file's name, and where can I download it?
It worked when I used PyTorch 0.3.1, but I could not find what changed in PyTorch 0.4.x.
I also got the same problem in PyTorch 0.4.0.
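For the version questions above, the installed PyTorch version can be checked directly from Python:

```python
import torch

# Prints the installed PyTorch version string (e.g. "0.3.1" or "0.4.1");
# the comments in this thread suggest the code only trains cleanly on 0.3.x.
print(torch.__version__)
```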
Change your dropout or ReLU6 layers to out-of-place, or use the model from https://github.com/tonylins/pytorch-mobilenet-v2.
Thanks! When I did this, the problem was solved.
Search for the keyword "inplace" in train.py and delete arguments like inplace=True (or change them to inplace=False). That solved the problem for me. It's a PyTorch version problem: the author used version 0.3, and yours is version 0.4.
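The fix suggested above can be sketched roughly like this. The block below is a hypothetical stand-in for a MobileNet-V2-style layer (the channel sizes are invented for illustration); the actual fix is just passing inplace=False so the activation and dropout don't overwrite tensors that autograd saved for the backward pass:

```python
import torch
import torch.nn as nn

# Hypothetical MobileNet-V2-style block; channel sizes are made up.
# inplace=False keeps ReLU6/Dropout from mutating saved activations.
block = nn.Sequential(
    nn.Conv2d(32, 64, kernel_size=3, padding=1, bias=False),
    nn.BatchNorm2d(64),
    nn.ReLU6(inplace=False),          # was inplace=True
    nn.Dropout(p=0.2, inplace=False), # was inplace=True
)

x = torch.randn(2, 32, 8, 8, requires_grad=True)
block(x).sum().backward()  # runs without the in-place RuntimeError
```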
I tried the code and still get the runtime error below. Any idea?

```
Training...
Epoch-0-:   0%|          | 0/391 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "main.py", line 46, in <module>
    main()
  File "main.py", line 34, in main
    trainer.train()
  File "/home/qiaokang/MobileNet-V2-master/train.py", line 63, in train
    cur_loss.backward()
  File "/usr/local/lib/python3.5/dist-packages/torch/tensor.py", line 93, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/usr/local/lib/python3.5/dist-packages/torch/autograd/__init__.py", line 89, in backward
    allow_unreachable=True)  # allow_unreachable flag
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
```
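For anyone wondering what triggers this message in general: autograd saves certain tensors during the forward pass for use in backward, and mutating one of them in place invalidates the graph. A minimal reproduction (unrelated to this repo's code, just illustrating the mechanism) looks like:

```python
import torch

x = torch.randn(3, requires_grad=True)
y = torch.sigmoid(x)  # sigmoid's backward formula needs its *output* y
y.add_(1)             # in-place edit bumps y's version counter

try:
    y.sum().backward()
except RuntimeError as e:
    # Message mentions "modified by an inplace operation"
    print("caught:", e)
```

Replacing `y.add_(1)` with the out-of-place `y = y + 1` makes the backward pass succeed, which is exactly what switching `inplace=True` to `inplace=False` does for `nn.ReLU6` and `nn.Dropout`.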