Note that if your tolerance for suboptimality is 0, i.e., you are really looking for alternate optimal solutions, then you can have a much faster algorithm.

In theory, for LPs you can replace step 2 with fixing all variables that have nonzero reduced cost at their current bounds (a variable with nonzero reduced cost must be nonbasic, i.e., sitting at one of its bounds). That ensures you stay on the optimal face, so you can then go straight to step 3. However, that might not be numerically stable enough, so it might be best to combine the two approaches:

1) solve the original LP

2) fix variables whose reduced cost is well away from 0

3) add a constraint of the form objective >= z* (if maximizing) or objective <= z* (if minimizing). (I have left off the epsilon Paul suggested because the assumption is that your tolerance for suboptimality is 0; besides, there is a feasibility tolerance by default.)

4) repeatedly change the objective function to something new and reoptimize.
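The loop above can be sketched in a few lines. This is only an illustration: it uses scipy.optimize.linprog as a stand-in LP solver (the CPLEX Python API would work the same way), the toy LP is my own, and step 2 (reduced-cost fixing) is omitted for brevity, relying on the objective constraint alone.

```python
import numpy as np
from scipy.optimize import linprog

# Toy LP with alternate optima: minimize x0 + x1 s.t. x0 + x1 >= 1,
# 0 <= x <= 1.  Every point on the segment x0 + x1 = 1 is optimal.
c = np.array([1.0, 1.0])
A_ub = np.array([[-1.0, -1.0]])   # -(x0 + x1) <= -1, i.e., x0 + x1 >= 1
b_ub = np.array([-1.0])
bounds = [(0, 1), (0, 1)]

# Step 1: solve the original LP to get z*.
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
z_star = res.fun

# Step 3: add the constraint "objective <= z*" (we are minimizing)
# as an extra row of the inequality system.
A2 = np.vstack([A_ub, c])
b2 = np.append(b_ub, z_star)

# Step 4: change the objective and reoptimize; here we minimize and
# then maximize x0 over the optimal face.
lo = linprog([1.0, 0.0], A_ub=A2, b_ub=b2, bounds=bounds, method="highs")
hi = linprog([-1.0, 0.0], A_ub=A2, b_ub=b2, bounds=bounds, method="highs")
print(lo.x, hi.x)  # two distinct optimal solutions of the original LP
```

With CPLEX itself you would additionally do step 2, fixing each variable whose reduced cost is well away from 0 at the bound it currently sits on, before adding the objective constraint.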

--Laci

------------------------------

Laszlo Ladanyi

------------------------------

Original Message:

Sent: Thu May 07, 2020 04:55 PM

From: Paul Rubin

Subject: Finding alternative solutions to a MILP

There is no built-in way to do this for LPs. If you just want to find some (as opposed to all) optimal or near-optimal solutions, you can do the following:

- solve the original LP and get the optimal solution and objective value z*;
- add a constraint of the form objective >= z* - epsilon (if maximizing) or objective <= z* + epsilon (if minimizing), where epsilon is your tolerance for suboptimality;
- repeatedly change the objective function to something new and reoptimize.

Step 3 is intentionally a bit vague. You can maximize and then minimize each individual variable. You can generate random objective functions and maximize and minimize them. You can mess around with dual variables and constraint ranges to adjust the objective coefficients. With all these approaches, you may generate the same solution more than once, so you need to check for duplicates.

Paul

------------------------------

Paul Rubin

Original Message:

Sent: Thu May 07, 2020 03:07 PM

From: O. A.

Subject: Finding alternative solutions to a MILP

Dear Daniel,

Thanks for your guidance. I will try to use the populate feature.

I was also wondering whether there is a similar feature for LPs, as opposed to MIPs.

Many thanks.

------------------------------

O. A.

Original Message:

Sent: Thu May 07, 2020 06:11 AM

From: Daniel Junglas

Subject: Finding alternative solutions to a MILP

Are you aware that CPLEX has a feature to do exactly this? It is called populate, and you can use it, for example, to enumerate all optimal solutions. You can find instructions here: https://www.ibm.com/support/knowledgecenter/SSSA5P_12.10.0/ilog.odms.cplex.help/CPLEX/UsrMan/topics/discr_optim/soln_pool/18_howTo.html

Other than that, callbacks (the legacy lazy constraint callback or the generic callback with the CANDIDATE context) would be the correct option, but things may not be as easy as you might think: once you have accepted a solution with objective value V, CPLEX will no longer look for solutions that are not better than V. So you will have to artificially reject each solution from the callback and keep track of it yourself.

------------------------------

Daniel Junglas

Original Message:

Sent: Wed May 06, 2020 10:18 PM

From: O. A.

Subject: Finding alternative solutions to a MILP

Hi,

I am looking into finding alternative optimal (or near-optimal) solutions to a MILP by adding combinatorial Benders-type cuts that eliminate an optimal solution once it is found.

I was wondering which method is the most efficient for achieving this using the Python API. Should I be using callbacks to add these cuts, or is there some other method I should be using for this purpose?

Many thanks for your help in advance.

------------------------------

O. A.

------------------------------