
PrimalDual Program issue in the main branch #304

Open
hfaghihi15 opened this issue Oct 13, 2021 · 3 comments
Labels: document missing (something needs clearer documentation), Possible Bug

Comments

@hfaghihi15 (Collaborator)

@guoquan Could you please clarify an issue with the primal-dual loss call?

@auszok claims that the current implementation of PrimalDualProgram doesn't call the loss function that he has provided for computing the soft logic transformation. Is that correct?

hfaghihi15 added the "document missing" and "Possible Bug" labels on Oct 13, 2021
@hfaghihi15 (Collaborator, Author)

As discussed, the PrimalDual loss is called only after a specific number of steps, known as the warmup iterations. @auszok is going to check the loss call once the warmup iterations have passed, and then we can see whether the problem still exists.

@guoquan (Collaborator) commented Oct 23, 2021

> @auszok claims that the current implementation of PrimalDualProgram doesn't call the loss function that he has provided for computing the soft logic transformation. Is that correct?

PrimalDual uses the following method call from Andrzej to retrieve the constraint loss. I am not sure whether he includes the soft constraint in this method.

# call the loss calculation;
# returns a dictionary whose keys match the constraints
constr_loss = datanode.calculateLcLoss()
lmbd_loss = []
for key, loss in constr_loss.items():
    ...  # loop body truncated in the quoted snippet
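For context, a primal-dual objective typically folds the per-constraint losses into a single scalar weighted by Lagrange multipliers. The following is a minimal sketch of what such an aggregation could look like, assuming each dictionary value is a loss tensor and that `lmbd` is a hypothetical per-constraint multiplier table; neither assumption is confirmed by the snippet above.

import torch

# Hypothetical sketch, not the repository's actual code: sum the
# per-constraint losses, weighting each constraint's violation by its
# Lagrange multiplier.
def aggregate_constraint_loss(constr_loss, lmbd):
    total = torch.tensor(0.0)
    for key, loss in constr_loss.items():
        violation = loss.clamp(min=0)  # negative values mean the constraint is satisfied
        total = total + lmbd[key] * violation.sum()
    return total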

> As discussed, the PrimalDual loss is called only after a specific number of steps, known as the warmup iterations. @auszok is going to check the loss call once the warmup iterations have passed, and then we can see whether the problem still exists.

In PDProgram, I already skip calling the constraint loss during the warmup iterations:

if iter < c_warmup_iters:
    loss = mloss                        # warmup: model loss only
else:
    closs, *_ = self.cmodel(output[1])  # constraint loss from the constraint model
    loss = mloss + self.beta * closs    # combined primal-dual objective
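To make the warmup gating concrete, here is a hypothetical end-to-end training step built around the same pattern; `model`, `cmodel`, `beta`, and the output layout are stand-ins assumed for illustration, not PDProgram's actual internals.

import torch

def training_step(model, cmodel, batch, optimizer, it, c_warmup_iters, beta=1.0):
    # assumed layout: output[0] is the task loss, output[1] feeds the constraint model
    output = model(batch)
    mloss = output[0]
    if it < c_warmup_iters:
        loss = mloss                    # warmup: train on the task loss alone
    else:
        closs, *_ = cmodel(output[1])   # soft-logic constraint loss
        loss = mloss + beta * closs     # combined primal-dual objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss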

You can check whether there is any side effect I did not expect that causes this issue in PDProgram.

@hfaghihi15 (Collaborator, Author)

It seems this problem has been resolved. Am I right, @auszok?
