
VRPY column generation for a branch-bound-and-price algorithm #124

Open
the-soomin-woo opened this issue Mar 2, 2022 · 10 comments

@the-soomin-woo

Hi @torressa!

I am Soomin; I have asked you a few questions in the past, and I really appreciate what you have done so far.
I have applied cspy and vrpy to develop a branch-bound-and-price algorithm for our custom vehicle routing problem with electric vehicles.
I know branch and bound is not yet implemented by you but I wish to ask you some questions still, if that's okay. Your feedback will be very valuable to me!

What I have so far: the current column generation runs the heuristic solvers of your choice, then the exact labeling algorithm (cspy), and ends when the latter finds no route with a negative reduced cost. Currently, I am only using the forward direction, as some features of my problem make it difficult to model the backward direction. Also, I have modified cspy so that the edge weight is calculated dynamically as a function of the resource state at the edge's tail node.
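To illustrate what I mean by a state-dependent edge weight, here is a simplified sketch (the function name, the 20% threshold, and the penalty factor are made up for this example, not my actual model):

```python
def edge_weight(base_cost, state_of_charge, low_soc_penalty=2.0):
    """State-dependent edge weight: the cost of traversing an edge
    depends on the resource state (here, battery state of charge)
    at the edge's tail node, not just on a fixed constant.

    Below 20% charge (an assumed threshold for this sketch), the
    traversal is penalised, e.g. because expensive fuel must be used.
    """
    if state_of_charge < 0.2:
        return base_cost * low_soc_penalty
    return base_cost
```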

Current issue: when I branch and bound on the edge flows (for example, parent node: no restriction on the edge flow; left child: edge flow of (Source, 1) >= 1; right child: edge flow of (Source, 1) = 0), I sometimes get odd results. The column generation result with the bounding constraints is sometimes better than the result without them (i.e., the parent node is worse than its child). Note that this compares the column generation results with linear relaxation.
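Concretely, I enforce the branching decisions on the subproblem graph roughly like this (a simplified sketch over a plain edge set; `apply_branch` is a hypothetical helper, not vrpy API, and forcing by deleting competing edges is just one simple way to do it when branching on edges out of the Source):

```python
def apply_branch(edges, branch_edge, force):
    """Apply an arc-branching decision to a set of (tail, head) edges.

    force=False (right child, flow = 0): ban the edge by removing it,
    so the pricing problem can never generate a route using it.
    force=True (left child, flow >= 1): remove every other edge leaving
    the same tail node, so any route departing that node must take
    branch_edge.  The master problem still needs matching constraints.
    """
    tail, _head = branch_edge
    if force:
        return {e for e in edges if e[0] != tail or e == branch_edge}
    return {e for e in edges if e != branch_edge}
```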
Example
In the image attached, node 4 has LB = 3.244 (the column generation result with linear relaxation) and UB = 4.045 (the result without linear relaxation). However, node 2 (the parent of node 4) has LB = 3.2624 > 3.244: the parent's relaxed objective is higher than its child's, even though adding constraints should only be able to increase it.

This means adding more constraints to the RMP actually improved the RMP's objective function value. It makes me question whether I am solving the column generation to optimality, which is strange, because the labeling algorithm via cspy should be exact. I am currently solving the subproblems using the cspy forward direction without any other heuristics, and I am really stuck on this issue. Are there any parameters in cspy/vrpy that I should pay attention to, which may determine the optimality criteria or the termination of CG? I have checked the threshold used to decide whether the reduced costs are indeed negative. I have also checked that it is not a time-out or the maximum iteration count being reached. Neither seems to be the problem.
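For reference, the termination check I am describing amounts to the following (a sketch with hypothetical helper names; `tol` is the threshold I mentioned):

```python
def reduced_cost(route_cost, route_nodes, duals):
    """Reduced cost of a column: route cost minus the duals of the
    customers it visits (Source and Sink carry no dual)."""
    return route_cost - sum(duals.get(n, 0.0) for n in route_nodes
                            if n not in ("Source", "Sink"))

def cg_should_stop(candidates, duals, tol=1e-9):
    """Column generation terminates only when no candidate route
    prices out strictly below -tol."""
    return all(reduced_cost(cost, nodes, duals) >= -tol
               for cost, nodes in candidates)
```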

In addition, I noticed that different initialization methods actually result in different objective function values, not just different computation times. Do you know anything about this? I implemented a modified version of the greedy algorithm to add an extensive set of initial routes.

I'd really appreciate your feedback. Thank you so much!

-Soomin

@the-soomin-woo
Author

I am also using Best_Edges1 and remove_edges1, with alpha in [0.3, 0.5, 0.7, 0.9].

@Kuifje02
Owner

Kuifje02 commented Mar 3, 2022

Hi there !

This looks interesting. Please consider contributing to vrpy with your branch-and-bound framework and/or your modified greedy algorithm :)

I am not surprised that the initialization method leads to different objective function values (with integer variables). I guess different columns can have the same relaxed objective function value but different integer objective function values once the variables are integer. On the other hand, if you are referring to objective function values for the relaxed problem, then I am a bit lost, to be honest.

I am not sure why this is happening in your branch and bound tree. This is odd indeed.

Perhaps we should first get the theory right, and then figure out where the implementation fails. A good place to ask such theoretical questions is over here.

@the-soomin-woo
Author

Hi, I will definitely contribute to vrpy/cspy for my work! And thank you for the link, I will ask around there.

And yes, I meant that different initialization results in different values for the objective function of the relaxed problem, not the integer problem.

One thing I wonder about: I defined my own forward labeling algorithm with many more resources.
The edge weight is also a function of the current state (at the last label of the current path before extension), so it is not fixed.
Should I have customized the dominance rules? I will ask around on the cspy page as well.
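For context, the standard dominance rule I am (implicitly) relying on looks like this; my worry is that it may no longer be valid with a state-dependent edge cost (a sketch, not cspy code):

```python
def dominates(label_a, label_b):
    """Classical forward dominance: label A dominates label B if A's
    cost is no worse and each accumulated resource of A is no greater
    than B's.  This is only safe if a dominated label can never be
    extended into a better complete path -- which can fail when edge
    costs depend on the resource state, since a currently cheaper
    label may become more expensive after extension.
    """
    cost_a, res_a = label_a
    cost_b, res_b = label_b
    return cost_a <= cost_b and all(a <= b for a, b in zip(res_a, res_b))
```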

Thank you!!

@torressa
Collaborator

torressa commented Mar 4, 2022

Hi @soomin-woo!
Nice work!
I'm not sure, but maybe the issue is with the branching. How did you enforce the branching decisions?
I'm asking because one has to be careful when branching on arcs, as the structure of the subproblem might change when adding constraints to the master problem. There are tricks for this, but I don't know that much about it; the OR-SE is a very good place to ask.

edit

Please link the question on OR-SE if you do ask; you might get some super cool insight from the masters (Ruslan Sadykov et al.).

Fleshed out some RCSP logic in torressa/cspy#98

@torressa
Collaborator

torressa commented Mar 6, 2022

Just had another thought. As the weight of each edge is altered on the fly (dependent on the resources), you may get a route whose final cost is falsely positive (or negative), which means that your linear relaxation may not be solved to optimality. The weight of each edge is set using the duals (see the pricing problem).
As a start, you could express the charging resource separately and then pick the best routes (per whatever charging criterion you want). Or you could try adding the constraints to the master problem (if it doesn't break the subproblem structure) and updating the dual components of the pricing objective appropriately.
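To spell out how the pricing weights come from the duals, here is a sketch (`pricing_weights` is illustrative, not the vrpy internals):

```python
def pricing_weights(cost, duals):
    """Subproblem edge weights from the RMP duals: the weight of edge
    (i, j) is c_ij minus the dual of its tail node i, so the total
    weight of a Source-Sink path equals the column's reduced cost."""
    return {(i, j): c - duals.get(i, 0.0) for (i, j), c in cost.items()}
```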

@the-soomin-woo
Author

@torressa, I'm sorry I took a while to answer. I was absorbing a lot of information everywhere, thanks to you!
Also, I realize that you are answering me on the cspy page. I will continue my reply there!

@torressa
Collaborator

Just for reference, the question has been asked on OR-SE

@the-soomin-woo
Author

Hi @torressa, that question was me 👍 :)

@the-soomin-woo
Author

the-soomin-woo commented Apr 6, 2022

Hi @torressa, I found that the bug was in the pricing problem.
Below are the steps I followed to show this, following the last comment by Ruslan Sadykov here.

  1. At the root of branch-and-bound, I run the column generation until no negative reduced cost is found. Call the objective function value at the root the 'lower bound at root'.
  2. At the root, I save the dual values at the last iteration of the column generation (with linear relaxation). Call these the 'root duals'. For these dual values, the pricing problem fails to find a negative column, and the column generation terminates.
  3. I branch on the root, e.g., a left branch that forces ['Source', 1] and a right branch that bans ['Source', 1], and solve the column generation for each branch.
  4. I found that in one branch, one of the columns generated actually has a negative reduced cost with respect to the root duals. I calculate the reduced cost as the cost of the route minus the sum of the 'root duals' over all nodes visited by the route. This is a bug, because that column has a negative reduced cost and yet was not found at the root. Also, this branch has an objective function value better than the lower bound at root. Note that the negative column in question was not selected in the fractional solution of that branch.
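The check in step 4 amounts to the following (a sketch with a hypothetical helper name; node labels and the tolerance are illustrative):

```python
def prices_out_at_root(route_cost, route_nodes, root_duals, tol=1e-6):
    """A column found in a child branch should NOT have a negative
    reduced cost with respect to the root duals if the root pricing
    problem was solved exactly; if this returns True, the root
    pricing is buggy (or terminated early)."""
    rc = route_cost - sum(root_duals.get(n, 0.0) for n in route_nodes
                          if n not in ("Source", "Sink"))
    return rc < -tol
```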

Currently, I am looking at the root node to figure out why the negative column found after branching was not found at the root.
For example, that negative column was ['Source', 14, 10, 12, 6, 1, 8, 2, 3, 9, 13, 5, 4, 7, 11, 'Sink']. (When it was found after branching, the duals were different from the root duals. Still, this route had a negative reduced cost w.r.t. the root duals.)
At the root, cspy only searches up to ['Source', 14, 10, 12, 6, 1, 8, 2] and does not explore further.

I am trying to trace the bug to the exact place in cspy, but it is a bit difficult since cspy's core lives in C++.
I am calling 'logger.info' on many variables at each REF callback, but I feel I could use a better debugging method.
Do you have any suggestions?

We talked before about whether it could be because the edge cost is a function of the resource state at the last label.
But I actually don't think so, because that negative column was found for a battery electric vehicle type, where the edge cost is the same regardless of the resource state.
(Edge costs change only for a hybrid EV type, because the energy cost depends on whether it uses expensive fuel or the cheaper battery.)

I know this is a lot to unpack, but I'd love to hear your opinion.

Thank you as always!

Soomin

@torressa
Collaborator

torressa commented Apr 8, 2022

Hi again @soomin-woo! Thanks for checking this.
I will reply in the cspy repo (torressa/cspy#98).
