
about test #83

Open
maoshen5 opened this issue Jul 17, 2022 · 10 comments

Comments

maoshen5 commented Jul 17, 2022

I have now experimented with a framework in which the generator is TransGAN's and the discriminator is AutoGAN's, but it doesn't seem to converge. After 320 epochs the FID is 130. What tricks did you use in your experiments? Thank you.


yifanjiang19 (Contributor)

Yeah, training a GAN is comparatively tricky. I would suggest starting from AutoGAN's repo and running its experiments. Make sure you get reasonable FIDs, and then replace AutoGAN's generator with the Transformer. Then you will get the expected output.
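The workflow suggested above can be sketched as follows. This is a minimal illustration of the idea (baseline first, then swap only the generator); the class and function names are placeholders, not the actual APIs of the AutoGAN or TransGAN repositories:

```python
# Hedged sketch of the suggested debugging workflow. The classes below
# are illustrative stand-ins, not the real modules from the AutoGAN or
# TransGAN repositories.

class AutoGANGenerator:
    """Stand-in for AutoGAN's CNN generator."""

class AutoGANDiscriminator:
    """Stand-in for AutoGAN's discriminator."""

class TransformerGenerator:
    """Stand-in for TransGAN's Transformer-based generator."""

def build_gan(generator_cls, discriminator_cls):
    """Assemble a (generator, discriminator) pair from the given classes."""
    return generator_cls(), discriminator_cls()

# Step 1: reproduce the AutoGAN baseline and confirm a reasonable FID.
baseline_g, baseline_d = build_gan(AutoGANGenerator, AutoGANDiscriminator)

# Step 2: only after the baseline works, swap in the Transformer
# generator while keeping the discriminator and training loop unchanged.
hybrid_g, hybrid_d = build_gan(TransformerGenerator, AutoGANDiscriminator)
```

The point of the two-step order is isolation: if the hybrid run diverges but the baseline converged, the Transformer generator (or its hyperparameters) is the only changed variable.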

maoshen5 (Author)

OK. But aren't the pretrained model's weights laid out the same way? Can we still use AutoGAN's pretrained model after this change?

maoshen5 (Author) commented Aug 7, 2022

I used AutoGAN's generator and discriminator and trained for 180 epochs; the IS reached 8.3 and the FID reached 15. However, when I then made the generator a Transformer while keeping the AutoGAN discriminator, after training from epoch 180 to 200 the IS suddenly dropped to 1.3 and the FID rose above 320. Is this normal?

yifanjiang19 (Contributor)

This is not normal. I've tried it before and the IS should be around 8.7-8.8. @maoshen5
I think you should try tuning the hyperparameters and the learning rate.
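On the learning-rate side, AutoGAN-style training commonly pairs Adam with a linear learning-rate decay over training steps. The sketch below shows such a schedule; the default `start_lr=2e-4` is a common GAN choice and an assumption here, not necessarily the exact value used in these experiments:

```python
def linear_lr_decay(step, max_step, start_lr=2e-4, end_lr=0.0):
    """Linearly anneal the learning rate from start_lr to end_lr.

    start_lr=2e-4 is a common GAN default and is an assumption here,
    not a value confirmed by the thread. The rate is clamped at end_lr
    once step >= max_step.
    """
    frac = min(step, max_step) / float(max_step)
    return start_lr + (end_lr - start_lr) * frac
```

In practice this would be wired into the optimizer each iteration (e.g. by overwriting the optimizer's learning rate with `linear_lr_decay(step, max_step)` before each update).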

maoshen5 (Author)

OK. Then, do I need to freeze the AutoGAN discriminator after a certain number of epochs and train only the Transformer generator?

yifanjiang19 (Contributor)

No, just follow AutoGAN's standard training procedure.
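"Standard training" here means the usual alternating scheme: the discriminator keeps being updated on every iteration and is never frozen. A minimal structural sketch (the update functions are stubs standing in for real gradient steps, not the repo's code):

```python
# Hedged sketch of standard alternating GAN training. The discriminator
# is updated every iteration alongside the generator; nothing is frozen.
# Counters stand in for the actual gradient updates.

def train(num_iters, d_steps_per_g_step=1):
    """Alternate discriminator and generator updates.

    Returns (d_updates, g_updates) so the schedule can be inspected.
    d_steps_per_g_step > 1 models the common trick of updating the
    discriminator several times per generator step.
    """
    d_updates = g_updates = 0
    for _ in range(num_iters):
        for _ in range(d_steps_per_g_step):
            d_updates += 1   # discriminator step on a real + fake batch
        g_updates += 1       # generator step through the live discriminator
    return d_updates, g_updates
```

With the default schedule, both networks receive the same number of updates; freezing the discriminator (as asked above) would break this balance, which is why the advice is not to do it.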

maoshen5 (Author) commented Sep 6, 2022

I adjusted the learning rate, but now I can only get to about 100 epochs at most: the IS is 7 and the FID is 30. After 110 epochs, the generator's loss oscillates around 0, the discriminator's loss stops dropping, and the FID starts rising again. I suspect the generator's ability completely exceeds the discriminator's after 100 epochs. When you trained to 320 epochs, were the GAN's metrics still improving? Thank you.

yifanjiang19 (Contributor)

Yeah, I had the same observation: IS/FID will get worse after longer training. But the IS was > 8.0 in my case. Sorry, I don't have the script to run this now.
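Given that IS/FID can degrade with longer training, a common mitigation is to checkpoint on the best FID seen so far rather than the final epoch. A minimal sketch of such a tracker (purely illustrative, not part of either repo):

```python
class BestCheckpointTracker:
    """Track the epoch with the lowest (best) FID so far.

    Illustrative only: in a real training loop, update() returning True
    is the signal to save model weights for that epoch.
    """

    def __init__(self):
        self.best_fid = float("inf")
        self.best_epoch = None

    def update(self, epoch, fid):
        """Record this epoch if its FID is the best so far."""
        if fid < self.best_fid:
            self.best_fid = fid
            self.best_epoch = epoch
            return True   # caller would save a checkpoint here
        return False
```

This way, even if metrics worsen after epoch ~100 as described above, the weights from the best epoch are preserved for evaluation.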

dipth11 commented Oct 27, 2023

> I adjusted the learning rate, but now I can only get to about 100 epochs at most: the IS is 7 and the FID is 30. After 110 epochs, the generator's loss oscillates around 0, the discriminator's loss stops dropping, and the FID starts rising again. I suspect the generator's ability completely exceeds the discriminator's after 100 epochs. When you trained to 320 epochs, were the GAN's metrics still improving? Thank you.

Hello, did you solve this problem?
