
Generator ADAM X Discriminator SGD #48

Open
nuneslu opened this issue Oct 2, 2018 · 3 comments

Comments

@nuneslu

nuneslu commented Oct 2, 2018

It's not an issue but a question: why is it better to train the discriminator with SGD than to train both networks with Adam? I've been trying to improve the results of my GAN, and before testing this trick I would like to understand why it works.

@makeyourownalgorithmicart

I would like to know this too.

@ianzen

ianzen commented May 29, 2019

I think the idea is that discrimination is an easier task than generation. SGD slows the discriminator's optimization, allowing the generator, which uses the faster Adam, to "catch up". In my experience SGD's updates are also very uniform, since the learning rate is essentially fixed, whereas Adam adapts the step size.
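The "uniform SGD vs. adaptive Adam" point can be made concrete with a toy implementation of both update rules for a single scalar weight (names and hyperparameter values below are illustrative, not from this thread). When the gradient is tiny, vanilla SGD takes a proportionally tiny step, while Adam's per-parameter normalization keeps its step near the nominal learning rate:

```python
import math

def sgd_step(w, grad, lr=0.01):
    # Vanilla SGD: step size is lr * |grad|, so it shrinks with the gradient.
    return w - lr * grad

class Adam:
    """Minimal scalar Adam (Kingma & Ba update rule) for illustration."""
    def __init__(self, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
        self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
        self.m, self.v, self.t = 0.0, 0.0, 0

    def step(self, w, grad):
        self.t += 1
        self.m = self.b1 * self.m + (1 - self.b1) * grad        # 1st-moment EMA
        self.v = self.b2 * self.v + (1 - self.b2) * grad * grad  # 2nd-moment EMA
        m_hat = self.m / (1 - self.b1 ** self.t)                 # bias correction
        v_hat = self.v / (1 - self.b2 ** self.t)
        # Normalized step: roughly lr * sign(grad) regardless of |grad|.
        return w - self.lr * m_hat / (math.sqrt(v_hat) + self.eps)

# With a tiny gradient of 1e-4, SGD moves by 1e-6 (lr * grad),
# while Adam's first step is close to its full lr of 1e-3.
w_sgd = sgd_step(1.0, 1e-4)
w_adam = Adam().step(1.0, 1e-4)
```

This is one way to read "faster Adam": on flat regions of the loss, Adam keeps making near-constant-size progress, while SGD crawls, which effectively handicaps whichever network you give SGD to.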

@DAYceng

DAYceng commented Oct 5, 2022

> I think the idea is that discrimination is an easier task than generation. SGD limits discriminator optimization allowing the generator using the faster Adam to "catch up". From my experience SGD optimization is very uniform since lr is mostly fixed compared to Adam.

I'm thinking the same thing. I used SGD on the discriminator while training a WGAN and got better results (compared to using Adam for both). I would like to know the rationale behind this trick; is there any paper you can recommend?
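For reference, the two-optimizer setup discussed in this thread could be sketched in PyTorch as follows (the tiny models, learning rates, and betas are illustrative placeholders, not values anyone in the thread reported):

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for a real generator and discriminator/critic.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
D = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1))

# The trick: adaptive Adam for the (harder) generation task,
# plain fixed-lr SGD to deliberately slow the discriminator.
opt_G = torch.optim.Adam(G.parameters(), lr=1e-4, betas=(0.5, 0.9))
opt_D = torch.optim.SGD(D.parameters(), lr=5e-5)
```

Each network then calls its own optimizer's `zero_grad()`/`step()` in its half of the training loop; nothing else about the usual GAN loop changes.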
