Because you have a checkpoint in output_dir, which will be loaded by utils.restart_from_checkpoint. Another hint: you use the arch resnet50, which also gets a DINO head and a MultiCropWrapper wrapped around it; look at line 133.
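One common workaround (a sketch only, not the repo's supported flow; the key names are taken from the error message, and in real code the dict would come from torch.load and be passed to model.load_state_dict(..., strict=False)) is to strip the projection-head entries from the state dict so only the backbone weights are restored:

```python
# Plain-dict sketch of stripping the DINO head before loading.
# Placeholder strings stand in for the real tensors.
state_dict = {
    "module.backbone.layer4.1.conv2.weight": "tensor...",
    "module.head.mlp.0.weight": "tensor...",
    "module.head.mlp.0.bias": "tensor...",
}

# Keep everything except the projection-head parameters, whose shapes
# no longer match the newly built model's head.
backbone_only = {k: v for k, v in state_dict.items() if ".head." not in k}
print(sorted(backbone_only))  # only backbone keys remain
```

With strict=False, the head parameters simply stay at their fresh random initialization, which is usually what you want when fine-tuning on a new dataset. Alternatively, deleting the stale checkpoint from output_dir stops restart_from_checkpoint from picking it up at all.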
When I try to fine-tune the pre-trained model (using the full checkpoint) on a different dataset with the following command:
!python "main_dino.py" --arch resnet50 --batch_size_per_gpu 32 --epochs=20 --data_path "/content/dataset/train" --output_dir "/content/model_checkpts"
I get this error:
size mismatch for module.head.mlp.0.weight: copying a param with shape torch.Size([4096, 2048]) from checkpoint, the shape in current model is torch.Size([2048, 2048]).
size mismatch for module.head.mlp.0.bias: copying a param with shape torch.Size([4096]) from checkpoint, the shape in current model is torch.Size([2048]).
Can someone explain the problem to me?
I think the full_ckpt file does not contain the full weights of the MLP head and is somehow incomplete.
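For context, "module.head.mlp.0" is the first Linear layer of the DINO projection head, and a Linear layer stores its weight as (out_features, in_features). So the shapes in the error mean the checkpoint's head has a hidden width of 4096 while the freshly built model's head has 2048; the checkpoint is not incomplete, the two heads are just built with different sizes. A plain-Python sketch (shapes copied from the error above, other key names hypothetical):

```python
# state_dicts modeled as {parameter name: shape} mappings
checkpoint = {
    "module.backbone.conv1.weight": (64, 3, 7, 7),
    "module.head.mlp.0.weight": (4096, 2048),  # hidden width 4096 in the checkpoint
    "module.head.mlp.0.bias": (4096,),
}
current_model = {
    "module.backbone.conv1.weight": (64, 3, 7, 7),
    "module.head.mlp.0.weight": (2048, 2048),  # hidden width 2048 in the new run
    "module.head.mlp.0.bias": (2048,),
}

# Copying fails exactly where the shapes disagree: the two head parameters.
mismatched = [k for k in checkpoint
              if k in current_model and checkpoint[k] != current_model[k]]
loadable = {k: v for k, v in checkpoint.items() if k not in mismatched}

print(mismatched)        # the two head parameters from the error message
print(sorted(loadable))  # the backbone weights load fine
```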