An error occurs when training on Refer-YouTube-VOS #18

Open

hoyeYang opened this issue Jun 15, 2022 · 0 comments

I followed your Environment Installation instructions, but when executing the following line

MTTR/trainer.py

Line 175 in c383c5b

self.lr_scheduler.step(total_epoch_loss) # note that this loss is synced across all processes

the following error occurs:

Traceback (most recent call last):
  File "/media/ssd/users/xxx/software/anaconda3/envs/mttr0/lib/python3.9/site-packages/torch/multiprocessing/spawn.py", line 59, in _wrap
    fn(i, *args)
  File "/media/ssd/users/xxx/projects/MTTR/main.py", line 20, in run
    trainer.train()
  File "/media/ssd/users/xxx/projects/MTTR/trainer.py", line 175, in train
    self.lr_scheduler.step(total_epoch_loss)  # note that this loss is synced across all processes
  File "/media/ssd/users/xxx/software/anaconda3/envs/mttr0/lib/python3.9/site-packages/torch/optim/lr_scheduler.py", line 164, in step
    self.print_lr(self.verbose, i, lr, epoch)
  File "/media/ssd/users/xxx/software/anaconda3/envs/mttr0/lib/python3.9/site-packages/torch/optim/lr_scheduler.py", line 113, in print_lr
    print('Epoch {:5d}: adjusting learning rate'
ValueError: Unknown format code 'd' for object of type 'float'
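
The failure seems reproducible outside the trainer. Judging from the traceback, step() here is the base _LRScheduler.step(), which treats a positional argument as the (deprecated) epoch index; with a verbose scheduler, print_lr then formats the float loss with '{:5d}'. A minimal sketch, assuming PyTorch ~1.10 and any non-ReduceLROnPlateau scheduler built with verbose=True (the parameter and optimizer below are placeholders):

    import torch
    from torch.optim.lr_scheduler import StepLR

    # Placeholder parameter/optimizer, just enough to construct a scheduler.
    params = [torch.nn.Parameter(torch.zeros(1))]
    optimizer = torch.optim.SGD(params, lr=1e-4)

    # verbose=True makes step() call print_lr, which formats epoch with '{:5d}'.
    scheduler = StepLR(optimizer, step_size=1, verbose=True)

    total_epoch_loss = 0.1234  # a float, as in trainer.py
    # The float is interpreted as the epoch number, so print_lr raises:
    # ValueError: Unknown format code 'd' for object of type 'float'
    scheduler.step(total_epoch_loss)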

Did you encounter this bug when training on Refer-YouTube-VOS?

If I change this line to

            self.lr_scheduler.step()

will it affect the result?
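
A possibly safer workaround than dropping the argument unconditionally (just a sketch; which scheduler trainer.py actually builds depends on the config, so the isinstance dispatch is an assumption on my part) would be to pass the loss only to schedulers that actually consume a metric:

    from torch.optim.lr_scheduler import ReduceLROnPlateau

    def scheduler_step(lr_scheduler, total_epoch_loss):
        # ReduceLROnPlateau is the only built-in scheduler whose step()
        # consumes a metric; every other scheduler interprets a positional
        # argument as the deprecated epoch index, which is what triggers
        # the '{:5d}' formatting error above.
        if isinstance(lr_scheduler, ReduceLROnPlateau):
            lr_scheduler.step(total_epoch_loss)  # loss synced across processes
        else:
            lr_scheduler.step()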
