
Question Regarding Meta-train of PyTorch Implementation #40

Open
khaghanijavad opened this issue Aug 23, 2020 · 2 comments

Comments

@khaghanijavad

Hi, thanks for your interesting work. I have a question regarding the meta-training step in the PyTorch implementation. In lines 167-169 of 'meta.py', the parameters of the classifier are updated according to Equation 3 in the paper, using the loss on the training samples. The loss on the validation samples (Equations 4 and 5 in the paper) should then be back-propagated to perform the second loop of meta-learning. However, in the released PyTorch implementation, I can only see the forward pass on the validation samples; the loss does not appear to be back-propagated to optimise the SS parameters and theta (Equations 4 and 5). I would really appreciate it if you could clarify this for me. Thank you very much.
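For concreteness, here is a minimal sketch of the two-loop update I am referring to. The names (encoder, base_learner, inner_lr) and the functional base_learner(features, weights) call are illustrative placeholders, not the repository's actual interface:

import torch
import torch.nn.functional as F

def meta_train_step(encoder, base_learner, optimizer,
                    support, query, inner_lr, inner_steps):
    x_s, y_s = support  # training (support) samples
    x_q, y_q = query    # validation (query) samples

    # Inner loop (Eq. 3): adapt a differentiable copy of the classifier
    # weights theta on the support loss; the encoder (SS parameters) is
    # held fixed during this adaptation.
    fast_weights = [w.clone() for w in base_learner.parameters()]
    for _ in range(inner_steps):
        loss = F.cross_entropy(base_learner(encoder(x_s), fast_weights), y_s)
        grads = torch.autograd.grad(loss, fast_weights, create_graph=True)
        fast_weights = [w - inner_lr * g for w, g in zip(fast_weights, grads)]

    # Outer loop (Eqs. 4-5): back-propagate the validation loss through
    # the inner-loop adaptation to update the SS parameters and theta.
    meta_loss = F.cross_entropy(base_learner(encoder(x_q), fast_weights), y_q)
    optimizer.zero_grad()
    meta_loss.backward()
    optimizer.step()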

@yaoyao-liu
Owner

Hi,

Thanks for your interest in our work.

The back-propagation is conducted as follows:

self.optimizer.step()

The optimizer is defined as follows:

self.optimizer = torch.optim.Adam(
    [{'params': filter(lambda p: p.requires_grad, self.model.encoder.parameters())},
     {'params': self.model.base_learner.parameters(), 'lr': self.args.meta_lr2}],
    lr=self.args.meta_lr1)

where self.model.encoder.parameters() contains the SS parameters and self.model.base_learner.parameters() contains theta.

So the SS parameters and theta are optimized by the meta loss.
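As a self-contained illustration (the modules, tensors, and learning rates below are placeholders standing in for meta_lr1 and meta_lr2, not the repository's actual training loop), a single backward pass on the meta loss populates gradients for both parameter groups, and optimizer.step() then applies each group's own learning rate:

import torch
import torch.nn.functional as F

encoder = torch.nn.Linear(4, 4)       # stands in for the SS parameters
base_learner = torch.nn.Linear(4, 2)  # stands in for theta

optimizer = torch.optim.Adam(
    [{'params': filter(lambda p: p.requires_grad, encoder.parameters())},
     {'params': base_learner.parameters(), 'lr': 1e-4}],  # role of meta_lr2
    lr=1e-3)                                              # role of meta_lr1

x, y = torch.randn(8, 4), torch.randint(0, 2, (8,))
meta_loss = F.cross_entropy(base_learner(encoder(x)), y)

optimizer.zero_grad()
meta_loss.backward()   # gradients reach both the encoder and the base learner
optimizer.step()       # Adam updates each group with its own learning rate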
If you have any further questions, feel free to add additional comments.

Best,
Yaoyao

@khaghanijavad
Author

Thank you very much for your prompt response.
