
Using a multi-network in the distillation procedure #16

Open
lukeju opened this issue Dec 7, 2021 · 2 comments

lukeju commented Dec 7, 2021

Hi! First, I would like to thank you for your outstanding work on speeding up NeRF.

I have trained a multi-network in the pretraining procedure (with 27 middle-sized MLPs). Can I use this model as the pretrained model in your code during the distillation procedure?

@lukeju lukeju closed this as completed Dec 7, 2021
@lukeju lukeju reopened this Dec 7, 2021
creiser (Owner) commented Dec 7, 2021

Hi :) You trained a multi-network consisting of 27 middle-sized MLPs from scratch and want to use it to bootstrap a multi-network with a larger number of small MLPs? Sounds like a cool idea. The current code only supports distillation from a single network into a multi-network, but it should not be hard to adapt it to your use case: multi-network -> multi-network.
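
For readers landing here: below is a minimal sketch of the multi-network -> multi-network idea in PyTorch, not the repository's actual distillation code. All names (`TinyMLP`, `MultiNetwork`, `distill_step`, the 4-channel RGB+density output, the uniform grid lookup) are illustrative assumptions. The only change versus single-network distillation is that the teacher is itself a grid of MLPs, so each sample point must be routed through the teacher's own grid lookup before its output is used as the regression target.

```python
# Hypothetical sketch of multi-network -> multi-network distillation.
# Names and structure are illustrative, not KiloNeRF's actual API.
import torch
import torch.nn as nn


class TinyMLP(nn.Module):
    """One small MLP responsible for a single spatial cell."""
    def __init__(self, in_dim=3, hidden=32, out_dim=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)


class MultiNetwork(nn.Module):
    """A uniform grid of independent MLPs covering the scene bounding box."""
    def __init__(self, resolution, bbox_min, bbox_max):
        super().__init__()
        self.resolution = resolution  # e.g. 3 -> 3**3 = 27 networks
        self.register_buffer("bbox_min", torch.tensor(bbox_min, dtype=torch.float32))
        self.register_buffer("bbox_max", torch.tensor(bbox_max, dtype=torch.float32))
        self.mlps = nn.ModuleList(TinyMLP() for _ in range(resolution ** 3))

    def cell_index(self, points):
        # Map each 3D point to the flat index of the MLP that owns its grid cell.
        rel = (points - self.bbox_min) / (self.bbox_max - self.bbox_min)
        ijk = (rel * self.resolution).long().clamp_(0, self.resolution - 1)
        return (ijk[:, 0] * self.resolution + ijk[:, 1]) * self.resolution + ijk[:, 2]

    def forward(self, points):
        idx = self.cell_index(points)
        out = points.new_zeros(points.shape[0], 4)  # RGB + density per point
        for i, mlp in enumerate(self.mlps):  # plain loop; real code batches this
            mask = idx == i
            if mask.any():
                out[mask] = mlp(points[mask])
        return out


def distill_step(teacher, student, points, optimizer):
    # The only change vs. single-network distillation: the teacher is itself
    # a MultiNetwork, so its grid lookup picks the right teacher MLP per point.
    with torch.no_grad():
        target = teacher(points)  # e.g. the pretrained 27-MLP teacher
    pred = student(points)        # student grid with more, smaller MLPs
    loss = torch.mean((pred - target) ** 2)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


teacher = MultiNetwork(resolution=3, bbox_min=[-1.0] * 3, bbox_max=[1.0] * 3)
student = MultiNetwork(resolution=4, bbox_min=[-1.0] * 3, bbox_max=[1.0] * 3)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
points = torch.rand(1024, 3) * 2.0 - 1.0  # random samples inside the bbox
print(distill_step(teacher, student, points, optimizer))
```

In practice the student's grid would be much finer (KiloNeRF uses thousands of tiny MLPs), and the Python loop over MLPs is exactly the part the repository replaces with batched/fused evaluation.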

lukeju (Author) commented Dec 8, 2021

Okay, thank you for your reply. I'm adapting the code now.

@lukeju lukeju closed this as completed Dec 8, 2021
@lukeju lukeju reopened this Dec 12, 2021