Decision Optimization

Delivers prescriptive analytics capabilities and decision intelligence to improve decision-making.

  • 1.  Tightening the lower bound in MIP?

    Posted Fri November 30, 2012 01:50 PM

    Originally posted by: SystemAdmin


    Hi,
    I am solving a large MIP problem in minimization form.

    I would like to impose a given lower bound during the B&B process (the value shown in the best-bound column of the log file). That is, I do not want nodes to be explored further if their objective values are less than x. This is because I roughly know where the solution lies.

    I know that a cutoff can be used, but the cutoff is computed only from the objective value of an incumbent solution, which is integer feasible. Instead, I want to work with the best node and put a lower bound on that.

    I have added obj >= x as a constraint, but I can see that after, for example, two days my best node is still x without improvement. I think there should be a way to avoid extra branching and prune a node if its objective value (not the objective value of an incumbent integer solution) is less than x.

    I have also tried MIP emphasis setting 3 to move the bound, but with no improvement.

    I would be glad if someone could shed some light on this, please.

    Thanks.
            Nodes                                       Cuts/
       Node Left  Objective  IInf  Best Integer    Best Bound  ItCnt     Gap

         0+    0                      4514.1307             x   7479  70.01%   <- this value has not changed a lot for about two days
    #CPLEXOptimizers
    #DecisionOptimization


  • 2.  Re: Tightening the lower bound in MIP?

    Posted Fri November 30, 2012 05:28 PM

    Originally posted by: SystemAdmin


    If I understand you correctly, you know (or are at least pretty sure) that there cannot be any solution of your minimization problem that is better (i.e., smaller) than some value B. You want to provide this value to CPLEX so that it can make good use of it to speed up the search process.

    Unfortunately, a lower bound will not help much. The global dual bound is really not of much use during the search process. Its only use is to trigger a gap-based abort criterion.

    Your hope that one can prune nodes with a dual bound smaller than B is not justified: it could very well be that the optimal solution is in the subtree rooted at such a node. In particular, the root node of the search tree will be such a node; why would you hope that you can prune the root node?

    The only two places where the dual bounds of the nodes are used in the algorithm are for pruning a node (when the node's dual bound exceeds the cutoff obtained from the primal bound) and for node selection. The former does not apply in your case. So, the only somewhat reasonable thing to do with user-provided global dual bound information (besides the abort criterion) would be to prefer selecting nodes whose dual bound is close to B, instead of searching in areas of the search tree where the dual bound is smaller. But I doubt that this would help much.
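
    To see why pruning on a dual bound below B is invalid, here is a tiny toy (my own illustration, not from the thread): minimize x1 + x2 + x3 over binaries subject to x1 + x2 + x3 >= 1.5. The integer optimum is B = 2, but the root LP bound is only 1.5, so a rule "prune any node whose dual bound is below B" would prune the root itself, and with it the optimal solution:

```python
# Toy minimization MIP (illustration only): min x1+x2+x3, x binary,
# subject to x1 + x2 + x3 >= 1.5.  The integer optimum is 2, but the
# LP relaxation bound at the root node is only 1.5.

def lp_bound(fixed):
    """LP relaxation bound of a node; fixed[i] is 0, 1, or None (free)."""
    fixed_sum = sum(v for v in fixed if v is not None)
    n_free = sum(1 for v in fixed if v is None)
    if fixed_sum + n_free < 1.5:      # constraint cannot be satisfied
        return None                   # node is infeasible
    return max(fixed_sum, 1.5)        # free variables fill up fractionally

def branch_and_bound():
    best = float("inf")               # primal (incumbent) bound
    root = [None, None, None]
    root_bound = lp_bound(root)
    stack = [root]
    while stack:
        fixed = stack.pop()
        bound = lp_bound(fixed)
        if bound is None or bound >= best:
            continue                  # pruned: infeasible, or cut off by incumbent
        if None not in fixed:         # all variables integer: new incumbent
            best = sum(fixed)
            continue
        i = fixed.index(None)         # branch on the first free variable
        for v in (0, 1):
            stack.append(fixed[:i] + [v] + fixed[i + 1:])
    return root_bound, best

print(branch_and_bound())             # (1.5, 2): root dual bound < optimum
```

    Note that the only dual-bound pruning in the sketch is against the incumbent cutoff (`bound >= best`), which is exactly the pruning CPLEX does; a cutoff against B would discard the root.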

    You already tried the obvious approach, namely adding c*x >= B as a constraint. This has been discussed multiple times in this forum. The summary is that such a constraint will typically not help at all; on the contrary, it usually degrades performance significantly. The reason is that all nodes will then have the same dual bound, and CPLEX can no longer measure the impact of branching decisions by tracking the objective change they trigger (so-called "pseudo-cost branching"). Hence, branching becomes essentially random, and the search tree usually gets much bigger due to poor branching decisions.
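
    The flattening effect can be seen on a tiny two-variable toy (my own illustration with a hand-rolled LP bound, not CPLEX itself): minimize 2*x1 + 3*x2 over binaries subject to x1 + x2 >= 0.5; the integer optimum is B = 2. Without the objective constraint, branching on x1 moves the child LP bounds apart (3/2 vs. 2), which is exactly the signal pseudo costs need. After adding 2*x1 + 3*x2 >= 2, the root and both children all have bound exactly 2:

```python
from fractions import Fraction as F

def lp_bound(cons, x1_fixed=None):
    """Minimize 2*x1 + 3*x2 over [0,1]^2 subject to a*x1 + b*x2 >= r for
    each ((a, b), r) in cons.  Brute force over a rational grid for x1; for
    a given x1 the best x2 is its smallest feasible value.  Exact for these
    tiny instances because all LP optima lie on the grid (denominator 6)."""
    x1s = [F(x1_fixed)] if x1_fixed is not None else [F(k, 6) for k in range(7)]
    best = None
    for x1 in x1s:
        x2 = max([F(0)] + [(r - a * x1) / b for (a, b), r in cons])
        if x2 > 1:
            continue                          # node infeasible for this x1
        val = 2 * x1 + 3 * x2
        best = val if best is None or val < best else best
    return best

base = [((1, 1), F(1, 2))]                    # x1 + x2 >= 0.5
with_obj = base + [((2, 3), F(2))]            # added "c*x >= B" with B = 2

for label, cons in (("without", base), ("with obj >= B", with_obj)):
    print(label, [str(lp_bound(cons, f)) for f in (None, 0, 1)])
# without:        root 1, children 3/2 and 2  -> branching signal
# with obj >= B:  root 2, children 2 and 2    -> flat bounds, no signal
```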

    Tobias
    #CPLEXOptimizers
    #DecisionOptimization


  • 3.  Re: Tightening the lower bound in MIP?

    Posted Fri November 30, 2012 05:31 PM

    Originally posted by: SystemAdmin


    A more promising approach for incorporating your problem knowledge would be to derive cutting planes that drive the dual bound closer to the value that you believe is valid. One way to identify such cutting planes is to add a constraint c*x <= B - eps, so that the model becomes infeasible. Then call the conflict refiner to extract a small subproblem that already proves that c*x must be at least B. By looking at this subproblem, you may get an idea of how to derive useful cutting planes.
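
    Conceptually, the conflict refiner shrinks an infeasible model to a small subsystem that is still infeasible. A minimal sketch of that idea (a "deletion filter" over toy one-variable bound constraints; CPLEX's actual refiner works on arbitrary models and is far more sophisticated):

```python
def feasible(cons):
    """A toy system of bounds on a single variable x is feasible iff the
    largest lower bound does not exceed the smallest upper bound."""
    lo = max([v for s, v in cons if s == '>='], default=float('-inf'))
    hi = min([v for s, v in cons if s == '<='], default=float('inf'))
    return lo <= hi

def deletion_filter(cons):
    """Shrink an infeasible constraint set to an irreducible infeasible
    subsystem: tentatively drop each constraint; if the rest is still
    infeasible, the constraint is redundant for the conflict and stays out."""
    core = list(cons)
    i = 0
    while i < len(core):
        trial = core[:i] + core[i + 1:]
        if not feasible(trial):
            core = trial          # not needed to prove infeasibility
        else:
            i += 1                # essential for the conflict, keep it
    return core

cons = [('>=', 3), ('<=', 10), ('>=', 0), ('<=', 2), ('>=', -5)]
print(deletion_filter(cons))      # [('>=', 3), ('<=', 2)]
```

    The two surviving constraints are the minimal explanation of infeasibility, just as the small subproblem extracted by the refiner explains why c*x <= B - eps cannot hold.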

    But note that this approach will only work for models of small or medium size; otherwise, proving that the model with the cutoff is infeasible will take too long, and the conflict refiner will typically take even longer. But maybe you have a small version of your model that exhibits the same issue (a poor LP relaxation), and you can work with that one to identify cuts.
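
    As a minimal illustration of a cut repairing a poor LP relaxation (my own toy, not from the thread): in min x1+x2+x3 over binaries with 2x1 + 2x2 + 2x3 >= 3, the LP bound is 3/2, but dividing the constraint by 2 and rounding the right-hand side up (a Chvátal-Gomory cut) gives x1 + x2 + x3 >= 2, lifting the bound to the integer optimum:

```python
from fractions import Fraction as F
from itertools import product
from math import ceil

coeffs, rhs = [2, 2, 2], 3           # constraint: 2x1 + 2x2 + 2x3 >= 3

# LP bound on the objective sum(x): the constraint says sum(x) >= 3/2.
lp = F(rhs, 2)                       # 3/2

# Chvatal-Gomory rounding: divide the constraint by 2 and round the
# right-hand side up -- valid for every integer point, cuts off fractions.
cut_rhs = ceil(F(rhs, 2))            # sum(x) >= 2

# Integer optimum by enumerating the 8 binary points.
opt = min(sum(x) for x in product((0, 1), repeat=3)
          if sum(c * v for c, v in zip(coeffs, x)) >= rhs)

print(lp, cut_rhs, opt)              # 3/2 2 2
```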

    Tobias
    #CPLEXOptimizers
    #DecisionOptimization


  • 4.  Re: Tightening the lower bound in MIP?

    Posted Mon December 03, 2012 06:39 AM

    Originally posted by: SystemAdmin


    Hi,
    Thanks for your suggestion. I will try to implement what you have suggested.
    But is there any option or other way to help this slow convergence (about 5 days so far)? In one day my lower bound (minimization problem) moved by only 100 (!).

    Thanks again
    #CPLEXOptimizers
    #DecisionOptimization


  • 5.  Re: Tightening the lower bound in MIP?

    Posted Mon December 03, 2012 06:45 AM

    Originally posted by: SystemAdmin


    Of course, there is always the option of trying to find better parameter settings. You could use the tuning tool to automate this process (you probably want to run it on a set of easier instances of the same problem class).

    But typically, by far the most significant improvements in solving time come from revising the model. Maybe you can use a different set of variables and constraints to model your problem, yielding smaller problem instances or instances with a tighter LP bound. Maybe you can simply ignore certain constraints, or model them in some aggregate or imprecise way, if that still yields reasonable solutions for the practical problem.

    There is no general "how to improve your model" recipe that one can provide. Modeling, and finding the right trade-off between correctness of the model and solvability of the problem instances, really is an art.
    Tobias
    #CPLEXOptimizers
    #DecisionOptimization


  • 6.  Re: Tightening the lower bound in MIP?

    Posted Mon December 03, 2012 07:16 AM

    Originally posted by: SystemAdmin


    Thanks, Tobias.
    But I was wondering whether, for example, some special cuts would be useful, or whether turning off cuts and switching to traditional branch and bound instead of branch and cut might be advisable in these situations. Or perhaps solving the problem in opportunistic parallel mode might bring some luck, and that sort of thing.
    #CPLEXOptimizers
    #DecisionOptimization


  • 7.  Re: Tightening the lower bound in MIP?

    Posted Wed December 05, 2012 05:16 PM

    Originally posted by: SystemAdmin


    This is hard to say. There are different reasons why CPLEX can have trouble solving a given MIP model. For example, it could be that the LP relaxation is hard to solve, even from a warm-start basis. This leads to slow node throughput, and turning off cuts could help. Or it could be that the LP relaxation is weak (it provides only a poor dual bound); then it could help to use cuts more aggressively.

    If you post (part of) your log file, I could potentially give more useful hints. Please make sure to enclose your log file in "code" tags so that it is printed verbatim (see the "Plain Text Help" at the right of the editing page).

    Tobias
    #CPLEXOptimizers
    #DecisionOptimization