should I add set_requires_grad(net_d, True/False) for discriminator during training? #15

Open
vince2003 opened this issue Sep 11, 2020 · 0 comments


vince2003 commented Sep 11, 2020

Hello,
Thank you for your great work. However, I think set_requires_grad(net_d, True/False) should be added for the discriminator during training. Is that correct?

Modified code:

    # (1) Update D network
    ######################
    set_requires_grad(net_d, True)  # add it here: unfreeze D for its own update
    optimizer_d.zero_grad()

    # train with fake
    fake_ab = torch.cat((real_a, fake_b), 1)
    pred_fake = net_d(fake_ab.detach())  # detach so no gradients flow into G
    loss_d_fake = criterionGAN(pred_fake, False)

    # train with real
    real_ab = torch.cat((real_a, real_b), 1)
    pred_real = net_d(real_ab)
    loss_d_real = criterionGAN(pred_real, True)

    # combined D loss
    loss_d = (loss_d_fake + loss_d_real) * 0.5
    loss_d.backward()
    optimizer_d.step()

    set_requires_grad(net_d, False)  # add it here: freeze D before the G update
    ######################
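
For reference, here is a minimal sketch of the set_requires_grad helper I have in mind (modeled on the one in the pytorch-CycleGAN-and-pix2pix codebase; the exact definition in this repository may differ):

    def set_requires_grad(nets, requires_grad=False):
        """Enable or disable gradient computation for one or more networks."""
        if not isinstance(nets, list):
            nets = [nets]
        for net in nets:
            if net is not None:
                for param in net.parameters():
                    param.requires_grad = requires_grad

As I understand it, the fake_ab.detach() call already stops the D loss from back-propagating into the generator, so freezing D with set_requires_grad(net_d, False) would mainly avoid computing unnecessary gradients for D's parameters during the subsequent G update.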

I am looking forward to hearing from you. Thank you in advance!
