-
Apologies if this has been asked before. I was training a DPRNNTasNet model when training stopped due to an error. I wasn't able to determine the cause, but the training run did manage to save these files:

I've been trying to follow these steps to reload the model; however, the checkpoint is not saved in a way that matches what PyTorch expects:

What's the correct way to reload a saved checkpoint after training? Is this a model-specific issue, or a general PyTorch Lightning issue? Here is the code I use to load the saved checkpoint into memory:
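(What follows is a representative sketch rather than the exact snippet; the path and model arguments are placeholders.) A direct load of a Lightning `.ckpt` into the bare model fails with missing/unexpected key errors, because the weights are stored under prefixed keys:

```python
import torch
from asteroid import DPRNNTasNet

ckpt_path = '/path/to/checkpoint.ckpt'  # placeholder path
checkpoint = torch.load(ckpt_path, map_location='cpu')

# A Lightning checkpoint is a plain dict: trainer bookkeeping (epoch, optimizer
# states, ...) plus the model weights under 'state_dict'.
print(checkpoint.keys())

model = DPRNNTasNet(n_src=2)  # placeholder model arguments
# This raises missing/unexpected key errors: the state_dict keys carry the
# prefix added by the LightningModule wrapper, so they don't match the bare model.
model.load_state_dict(checkpoint['state_dict'])
```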
-
Found the answer I was looking for here.
-
Hello! I have the same question, and the link provided doesn't exist anymore. I managed to load the model with the (not optimal) code below:

```python
import torch
from asteroid import ConvTasNet

ckpt_path = '/path/to/checkpoint.ckpt'
checkpoint = torch.load(ckpt_path, map_location=torch.device('cpu'))

# The Lightning wrapper stores the network as its `model` attribute, so every key
# in the checkpoint's state_dict carries a "model." prefix (6 characters).
# Strip that prefix before loading the weights into a bare ConvTasNet.
state_dict = {key[6:]: checkpoint['state_dict'][key] for key in checkpoint['state_dict']}

model = ConvTasNet(n_src=2)
model.load_state_dict(state_dict)
```
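A slightly less brittle variant of the same idea (a sketch that reuses the `checkpoint` and `model` objects from the snippet above) strips the prefix by name rather than by character count, so any key that doesn't start with `model.` is passed through unchanged:

```python
# Same prefix-stripping trick, keyed on the prefix itself instead of key[6:].
state_dict = {
    (key[len('model.'):] if key.startswith('model.') else key): value
    for key, value in checkpoint['state_dict'].items()
}
model.load_state_dict(state_dict)
```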
And I load the trainer with the code below:

```python
import pytorch_lightning as pl
from torch.optim import Adam

from asteroid import ConvTasNet
from asteroid.data import LibriMix
from asteroid.engine.system import System
from asteroid.losses import PITLossWrapper, pairwise_neg_sisdr

# Rebuild the same System (model + loss + optimizer + data) used for training.
model = ConvTasNet(n_src=2)
optimizer = Adam(model.parameters(), lr=1e-3)
loss_func = PITLossWrapper(pairwise_neg_sisdr, pit_from='pw_mtx')
train_loader, val_loader = LibriMix.loaders_from_mini(task="sep_clean", batch_size=16)
system = System(model=model, loss_func=loss_func, optimizer=optimizer,
                train_loader=train_loader, val_loader=val_loader)

# Resume training from the saved checkpoint.
trainer = pl.Trainer()
ckpt_path = '/path/to/checkpoint.ckpt'
trainer.fit(system, ckpt_path=ckpt_path)
```

Is there a more elegant way to only load the model from the checkpoint? Thanks! 🙏
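One way to tidy this up is to wrap the prefix handling in a small helper (a sketch; the function name `load_model_from_lightning_ckpt` and its signature are just for illustration, not an Asteroid or Lightning API):

```python
import torch


def load_model_from_lightning_ckpt(model_cls, ckpt_path, prefix='model.', **model_kwargs):
    """Build `model_cls(**model_kwargs)` and load the weights stored in a
    PyTorch Lightning checkpoint, stripping the wrapper's key prefix."""
    checkpoint = torch.load(ckpt_path, map_location='cpu')
    state_dict = {
        (key[len(prefix):] if key.startswith(prefix) else key): value
        for key, value in checkpoint['state_dict'].items()
    }
    model = model_cls(**model_kwargs)
    model.load_state_dict(state_dict)
    return model


# Example usage (placeholder path):
# from asteroid import ConvTasNet
# model = load_model_from_lightning_ckpt(ConvTasNet, '/path/to/checkpoint.ckpt', n_src=2)
```

Asteroid models saved with `model.serialize()` can also be reloaded via `from_pretrained(...)`, but that targets serialized model files rather than raw Lightning `.ckpt` files, so it is worth double-checking against the Asteroid version you are using.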