Thank you Paul for your reply.

While supplying the corresponding vector x is a very good idea, it requires some technical manipulation: the x vectors of the two partitions come from two different problems, so each must be mapped to its corresponding x in the partition of cardinality 3.

While this remains an option (together with the lazy or user-constraint callback, or simply adding it as a constraint before starting the optimization), I found a CPLEX parameter called upper cutoff. According to the documentation, this parameter:

"cuts off or discards any solutions that are greater than the specified upper cutoff value. If the model has no solution with an objective value less than or equal to the cutoff value, CPLEX declares the model infeasible. In other words, setting an upper cutoff value c for a minimization problem is similar to adding this constraint to the objective function of the model: `obj <= c`."

This is exactly what I want, but no further information is given on how or when it is used.

I assume this bound is used during presolve, and at every node to prune any node whose lower bound equals or exceeds it?
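For concreteness, here is a minimal sketch of setting that parameter through the CPLEX Python API (the `cplex` package and the helper name are assumptions; the import is guarded so the snippet loads even without the library installed):

```python
# Sketch: applying the upper cutoff (CutUp) before solving a
# minimization MIP. Assumes the CPLEX Python API ("cplex" package).
try:
    import cplex
except ImportError:
    cplex = None  # snippet still loads without CPLEX installed


def solve_with_cutoff(cpx, b):
    """Discard any solution with objective value greater than b.

    If no solution has objective <= b, CPLEX reports the model
    infeasible rather than returning an incumbent.
    """
    cpx.parameters.mip.tolerances.uppercutoff.set(b)
    cpx.solve()
    return cpx.solution
```

(For a maximization problem, the symmetric parameter is `mip.tolerances.lowercutoff`.)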

Thank you.

------------------------------

ISSA

------------------------------

Original Message:

Sent: Tue January 03, 2023 05:16 PM

From: Paul Rubin

Subject: Infeasible MIP models

The effect of either an explicit constraint or lazy constraints is difficult to predict. In your first example (partitions of a set of cardinality 3), suppose that your estimate b turns out to be the cost of the partition {{1,3},{2}}. Why not just supply the corresponding vector x as an initial feasible solution? That lets CPLEX prune any node whose lower bound equals or exceeds b, lets it make any possible use of that bound during presolve, and does not expand the size of the model.
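The suggestion above could be sketched with the CPLEX Python API roughly as follows (the `cplex` package, the helper name, and the argument names are assumptions, not part of the original advice):

```python
# Sketch: supplying a known feasible x as a MIP start so CPLEX has an
# incumbent with objective b from the outset. Assumes the "cplex" package.
try:
    import cplex
except ImportError:
    cplex = None  # snippet still loads without CPLEX installed


def add_warm_start(cpx, var_names, var_values):
    """Register x as an initial incumbent.

    CPLEX can then prune any node whose lower bound meets or exceeds
    the incumbent's objective value, and may exploit it in presolve.
    """
    cpx.MIP_starts.add(
        cplex.SparsePair(ind=var_names, val=var_values),
        cpx.MIP_starts.effort_level.auto,
    )
```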

------------------------------

Paul Rubin

Professor Emeritus

Michigan State University

Original Message:

Sent: Tue January 03, 2023 06:41 AM

From: ISSA bou zeid

Subject: Infeasible MIP models

Thank you Paul for your reply.

For a given b, we have three cases :

1. b guessed too low: the modified problem is infeasible. However, we might miss the global optimal solution.

2. b guessed right: if P is feasible then we keep P; otherwise P is ignored.

3. b guessed too high: P is always feasible, but not worth keeping.

For instance, suppose we have a set A = {1,2,3}. Given all partitions of G_{A} = {{{1}, {2}, {3}}, {{1, 2}, {3}}, {{1, 3}, {2}}, {{1}, {2, 3}}}, each with a cost c_{i}, we can define a valid upper bound on the set {1,2,3} as b = min(c_{{{1}, {2}, {3}}}, c_{{{1, 2}, {3}}}, c_{{{1, 3}, {2}}}, c_{{{1}, {2, 3}}}). Similarly, we can guess b for sets of size two; for example, the cost of {1,2} <= c_{{1}} + c_{{2}}. Given this, b is always guessed right for problem P.
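The bound construction above can be illustrated in plain Python with made-up partition costs (the numbers here are purely illustrative):

```python
# Tiny illustration of the bound b for A = {1, 2, 3}:
# b is the minimum cost over all partitions of A, which is a valid
# upper bound because the optimum is itself one of these partitions.

# Hypothetical cost of each partition of A (made-up values).
partition_costs = {
    (frozenset({1}), frozenset({2}), frozenset({3})): 7.0,
    (frozenset({1, 2}), frozenset({3})): 5.0,
    (frozenset({1, 3}), frozenset({2})): 6.0,
    (frozenset({1}), frozenset({2, 3})): 4.0,
}

b = min(partition_costs.values())  # valid upper bound, here 4.0
```

With these made-up costs, b comes out to 4.0, the cost of {{1},{2,3}}; the analogous subadditive bound for a pair, e.g. cost({1,2}) <= c({1}) + c({2}), follows the same pattern.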

How about adding this constraint as a lazy constraint (or user cut) through the cut callback, or setting it up in a cut pool before the MIP optimization begins? Or do you think there is a more suitable way to use this information?
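One way to set this up before optimization starts is the lazy-constraint pool of the CPLEX Python API; a rough sketch, assuming the `cplex` package (the helper name and arguments are hypothetical):

```python
# Sketch: placing cx <= b in the lazy-constraint pool before solving,
# so CPLEX only enforces it when a candidate incumbent would violate it.
try:
    import cplex
except ImportError:
    cplex = None  # snippet still loads without CPLEX installed


def add_lazy_bound(cpx, var_names, c, b):
    """Add cx <= b as a lazy constraint on variables var_names."""
    cpx.linear_constraints.advanced.add_lazy_constraints(
        lin_expr=[cplex.SparsePair(ind=var_names, val=list(c))],
        senses=["L"],
        rhs=[b],
    )
```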

I am interested in using this information I have on the cost of P in the most efficient way to improve the solution time of larger instances.

Thank you, and Happy New Year.

------------------------------

ISSA bou zeid

Original Message:

Sent: Tue December 27, 2022 06:12 PM

From: Paul Rubin

Subject: Infeasible MIP models

Just to be clear, b is not necessarily a valid upper bound (meaning you might guess too low, in which case the modified problem would be infeasible)?

In the case where P is infeasible, CPLEX might prove it in the presolve stage or might prove it by exhausting the tree. It would most likely not have to go through all nodes of the full binary search tree; some nodes above the bottom level of the tree would likely be detected as infeasible. Whether infeasibility would be detected during presolve would be an empirical question (meaning the only way to know would be to try), and I would not be at all surprised if some problem instances were found infeasible in presolve and some were found infeasible during branching.

As for feasible instances, adding the constraint is very unlikely to help and could hurt solution time. When a constraint and the objective function have the same coefficient vector, I believe the LP relaxations tend to suffer from dual degeneracy, which can slow down the dual simplex algorithm.

------------------------------

Paul Rubin

Professor Emeritus

Michigan State University

Original Message:

Sent: Tue December 27, 2022 04:34 PM

From: ISSA bou zeid

Subject: Infeasible MIP models

Hello everyone,

Let's suppose we have a minimization problem P defined as min{ cx | Ax = e, x_{i} ∈ {0,1} ∀ i ∈ N}, where A is an m × n matrix of zeros and ones, c is an arbitrary cost n-vector, e is an m-vector, and N={1, ..., n}.

I have an idea of an upper bound on the cost of P. That is, suppose b is this upper bound; I can add the constraint cx ≤ b. With this added constraint, P can become infeasible.
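A minimal sketch of building P with this extra constraint in the CPLEX Python API (the `cplex` package is assumed; A, c, e, b are the problem data from above):

```python
# Sketch: min { cx | Ax = e, x binary } plus the extra bound cx <= b,
# which may render the model infeasible. Assumes the "cplex" package.
try:
    import cplex
except ImportError:
    cplex = None  # snippet still loads without CPLEX installed


def build_bounded_model(A, c, e, b):
    """Build P with the added constraint cx <= b; A is a list of rows."""
    cpx = cplex.Cplex()
    n = len(c)
    names = ["x%d" % i for i in range(n)]
    cpx.variables.add(obj=list(c), types=["B"] * n, names=names)
    # Ax = e
    cpx.linear_constraints.add(
        lin_expr=[cplex.SparsePair(ind=names, val=list(row)) for row in A],
        senses=["E"] * len(A),
        rhs=list(e),
    )
    # The extra bound cx <= b.
    cpx.linear_constraints.add(
        lin_expr=[cplex.SparsePair(ind=names, val=list(c))],
        senses=["L"],
        rhs=[b],
    )
    return cpx
```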

Let's say I have thousands of problems to solve, and I am interested in identifying the infeasible problems immediately and stopping the resolution right away to save time.

So first, I am interested in knowing how CPLEX would prove P to be infeasible: is it in presolve, or does it go through all nodes of the tree?

Second, is there any benefit to adding such a constraint? And by adding it, does the resolution time become faster? If yes, how can we prove it?

Thank you for your help.

------------------------------

ISSA bou zeid

------------------------------