
uncontrolled parameter domain #160

Open
AxelBreuer opened this issue Oct 22, 2024 · 0 comments

Hi,

When specifying a CVXPY problem with parameters, we can declare a domain of validity for those parameters (e.g., pos=True to require positivity). It seems to me that cvxpylayers can silently ignore this domain (e.g., by accepting a negative value for a parameter that is supposed to be positive).

For example, if we modify the toy example on the main webpage of cvxpylayers as follows:

import cvxpy as cp
import torch
from cvxpylayers.torch import CvxpyLayer

n, m = 2, 3
x = cp.Variable(n)
A = cp.Parameter((m, n))
# b = cp.Parameter(m)
b = cp.Parameter(m, pos=True)  # b is now a vector of positive numbers
constraints = [x >= 0]
objective = cp.Minimize(0.5 * cp.pnorm(A @ x - b, p=1))
problem = cp.Problem(objective, constraints)
assert problem.is_dpp()

cvxpylayer = CvxpyLayer(problem, parameters=[A, b], variables=[x])
A_tch = torch.randn(m, n, requires_grad=True)
# b_tch = torch.randn(m, requires_grad=True)
b_tch = -abs(torch.randn(m, requires_grad=True))  # b_tch is now a tensor of negative numbers

solution, = cvxpylayer(A_tch, b_tch)

solution.sum().backward()

The above code runs without error, but shouldn't it raise an error saying that b_tch (and/or b) must be positive?
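
In the meantime, the only workaround I see is to validate the parameter values manually before calling the layer. A minimal sketch, assuming one just wants to enforce the pos=True attribute by hand (check_positive is a hypothetical helper, not part of cvxpylayers):

def check_positive(value, name):
    # Hypothetical helper, not part of cvxpylayers: reject tensors that
    # violate a Parameter declared with pos=True before calling the layer.
    if not torch.all(value > 0):
        raise ValueError(f"{name} must be strictly positive (Parameter has pos=True)")
    return value

solution, = cvxpylayer(A_tch, check_positive(b_tch, "b"))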
