Expected behaviour of re-runs after model changes #1617
Replies: 4 comments 13 replies
-
There's no documentation I can point you to because what's going on internally is far from trivial. Essentially, when you change the model, HiGHS will try to start simplex as efficiently as it can. I've got a guess as to what's happening, but need some more information.

- When you're just adding columns, what bounds do they have? It matters whether both are finite, and whether 0 is a feasible value for the variable.
- When you are rounding, what are the bounds on the helper column?
- When you're adding columns after rounding, what bounds do they have? Again, it matters whether both are finite, and whether 0 is a feasible value for the variable.

Another thing you can do that will give me more information is to set simplex_strategy to 1. This will ensure that the dual simplex algorithm is always used. It's possible that when you're adding columns the LP remains primal feasible, in which case HiGHS will choose to use the primal simplex algorithm to restore dual feasibility. This should be more efficient than using the dual simplex algorithm, which would have to regain dual feasibility (Phase 1) and then attain primal feasibility (Phase 2). However, the implementation of the dual simplex algorithm is more efficient.
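The bound cases asked about above can be spelled out mechanically. As a small illustrative sketch (this is not HiGHS internals, just the two distinctions the questions draw), a helper that classifies a new column's bounds:

```python
import math

def classify_bounds(lower, upper):
    """Classify a column's bounds along the two axes asked about above:
    are both bounds finite, and is 0 a feasible value for the variable?
    Illustrative only -- not HiGHS internals."""
    both_finite = math.isfinite(lower) and math.isfinite(upper)
    zero_feasible = lower <= 0.0 <= upper
    return both_finite, zero_feasible

# A [0, 1] column: both bounds finite, and 0 is feasible,
# so the new column can sit at zero in a hot-started basis.
print(classify_bounds(0.0, 1.0))              # (True, True)
# A column fixed at 1 (e.g. after rounding): 0 is not feasible.
print(classify_bounds(1.0, 1.0))              # (True, False)
# A free column: not both finite, but 0 is feasible.
print(classify_bounds(-math.inf, math.inf))   # (False, True)
```

If I recall the highspy API correctly, the option mentioned above can be set with `h.setOptionValue("simplex_strategy", 1)` on the `Highs` object before calling `h.run()`.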
-
I understand! To answer your questions:

- Every column is bounded between 0 and 1, inclusive.
- Each column belongs to a single "rounding group"; each rounding group contains one or more columns. An "entity group" in turn contains multiple rounding groups.
- From the start of the problem, the sum of all columns belonging to an entity group (so a nested relation) is constrained to equal 1. This means that the sum of the columns in a rounding group is bounded between 0 and 1.
- In the rounding step we select a rounding group and add a constraint that specifically the sum of those columns must also equal 1, forcing the other rounding groups in that entity group to 0.
- The helper variable is another column in the entity group that ensures a first feasible solution is possible (by not violating other constraints). It has a high penalty, and subsequent column generation will ensure it is not used in the final solution.

Note: right now this helper variable is not part of the rounding group that is fixed, so when we do a rounding step we start from an infeasible point. Is that relevant?

Will try with the other simplex method! But the default value is already 1, right? If I set it explicitly, will that ensure the strategy stays the same throughout all iterations?
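The nesting described above has a simple consequence worth making explicit: because all columns are nonnegative and the whole entity group sums to 1, fixing one rounding group's sum to 1 forces every other column in that entity group (including the helper) to 0. A tiny sketch with hypothetical names:

```python
# Hypothetical representation of the structure described above:
# an entity group maps rounding-group names to column indices,
# with every column bounded in [0, 1].
entity_group = {
    "A": [0, 1],    # rounding group "A": columns 0 and 1
    "B": [2],       # rounding group "B": column 2
    "helper": [3],  # high-penalty helper column
}

def forced_to_zero(entity_group, fixed_group):
    """If the sum over the whole entity group equals 1 and the sum over
    `fixed_group` is constrained to equal 1, every other (nonnegative)
    column in the entity group is forced to 0."""
    return [c for g, cols in entity_group.items()
            if g != fixed_group for c in cols]

print(forced_to_zero(entity_group, "A"))  # [2, 3]
```

Note that the helper column (index 3 here) lands in the forced-to-zero set, which matches the remark above that the rounding step starts from an infeasible point when the incumbent used the helper.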
-
No. If settings are changed internally, as is necessary algorithmically, the user's settings are restored afterwards.
-
@jajhall, as a follow-up question: some of my regular column costs occasionally approach 1e12. This now sometimes results in the big-M columns (cost 1e12) not producing the expected behavior. Would you recommend I scale the costs of my regular columns but keep the ghost columns as is? Or would the relative size difference again cause the previous problems?
-
Hi, I was wondering if there is more documentation on how resolving with model changes works. My use case is column generation with three steps:
In the rounding heuristic a single constraint is added, the bounds of some columns are set to zero, and a helper column is added with a high penalty to prevent infeasibilities.
When I add columns the run time increases slightly which makes sense.
However, the moment I do the first rounding there is a big jump in run time. Is it expected that adding and updating rows is a lot more time-consuming for a consecutive run? In later rounding steps there is no such jump.
What surprises me most, however, is that even after rounding, when we go back to only solving and adding columns, the run time remains high. Yet at that point the problem is not much larger than before. It is almost as if, the first time we add and update rows, the warm-start functionality gets "disabled".
Below you find some metrics I tracked; "solving" is just the timing of the call to run.
(I am using highspy)
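For reference, metrics like those mentioned above can be collected with a minimal timing wrapper; `solve` below is a hypothetical stand-in for whatever callable invokes the solver (e.g. a bound `Highs.run`):

```python
import time

def timed_run(solve, label, log):
    """Time one call to `solve` and append (label, seconds) to `log`.
    `solve` is a hypothetical stand-in for the actual solver call."""
    t0 = time.perf_counter()
    solve()
    log.append((label, time.perf_counter() - t0))

log = []
timed_run(lambda: sum(range(10**5)), "initial solve", log)
timed_run(lambda: sum(range(10**5)), "after adding columns", log)
print([label for label, _ in log])  # ['initial solve', 'after adding columns']
```

Tagging each solve with the phase it follows (column addition vs. rounding) makes the jump described above easy to see in the log.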