Decision Optimization

Delivers prescriptive analytics capabilities and decision intelligence to improve decision-making.

  • 1.  reduced cost of basic optimal variables in a minimization problem

    Posted Tue April 03, 2018 01:50 PM

    Originally posted by: lbr33


    Why does a basic optimal variable have a nonzero reduced cost in a minimization problem?

    The LP model is attached.

    I'm concerned about the lambda variables.

    When I solve this model directly in the CPLEX interactive terminal, I get this information:

     

    Display which part of the solution: reduced

    Display reduced costs for which variable(s): lambda(1)

    Variable Name           Reduced Cost
    lambda(1)            -4199562.000000

     

    CPLEX> display solution reduced

    Display reduced costs for which variable(s): lambda(2)

    The reduced cost 'lambda(2)' is 0.

     

    CPLEX> display solution reduced

    Display reduced costs for which variable(s): lambda(3)

    Variable Name           Reduced Cost
    lambda(3)            12599562.000000

     

    CPLEX> display solution variables

    Display values of which variable(s): lambda(1)

    Variable Name          Solution Value
    lambda(1)                    1.000000

     

    CPLEX> display solution variables

    Display values of which variable(s): lambda(2)

    The variable 'lambda(2)' is 0.

    CPLEX> display solution variables

    Display values of which variable(s): lambda(3)

    The variable 'lambda(3)' is 0.

     

    Is that ok?

    Since lambda(1) equals 1, I thought this variable had to have a reduced cost of zero.

     


    #CPLEXOptimizers
    #DecisionOptimization


  • 2.  Re: reduced cost of basic optimal variables in a minimization problem

    Posted Tue April 03, 2018 03:34 PM

    A basic variable will have reduced cost 0. A variable can have a positive (or negative, if appropriate) value without being basic, however, if it is at its lower or upper bound. In your case, lambda(1) has an upper bound of 1, which it attains in the solution. It is nonetheless nonbasic, allowing a nonzero reduced cost (which functions the way the dual value would if the upper bound were entered as an ordinary constraint).
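
    If it helps to see this programmatically, here is a minimal sketch using the CPLEX Python API (the file name "model.lp" and the lambda(...) names are placeholders for your own model and variables). It prints each lambda variable's value, reduced cost, and whether it sits at its upper bound, so you can confirm that lambda(1) is at its bound of 1 with the nonzero reduced cost you saw in the terminal.

        # Minimal sketch with the CPLEX Python API; "model.lp" and the lambda(...)
        # names are assumptions standing in for your own model file and variables.
        import cplex

        c = cplex.Cplex("model.lp")   # read the LP file
        c.solve()

        names = c.variables.get_names()
        values = c.solution.get_values()
        redcosts = c.solution.get_reduced_costs()
        ubs = c.variables.get_upper_bounds()

        for name, val, rc, ub in zip(names, values, redcosts, ubs):
            if name.startswith("lambda"):
                at_ub = abs(val - ub) <= 1e-6
                print(f"{name}: value={val:g}, reduced cost={rc:g}, at upper bound={at_ub}")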


    #CPLEXOptimizers
    #DecisionOptimization


  • 3.  Re: reduced cost of basic optimal variables in a minimization problem

    Posted Tue April 03, 2018 04:07 PM

    Originally posted by: lbr33


    Thank you for your answer.

     

    But I didn't get one thing.

    What is the stopping criterion of the simplex method? In a minimization problem, I thought it was that all nonbasic variables have nonnegative reduced costs (meaning we cannot change the basic and nonbasic variables to improve the objective function).

     

    In this case, I can clearly see that this does not hold: in the optimal solution, we have nonbasic variables with negative reduced costs.


    #CPLEXOptimizers
    #DecisionOptimization


  • 4.  Re: reduced cost of basic optimal variables in a minimization problem

    Posted Tue April 03, 2018 04:28 PM

    The stopping criterion you mentioned is for the original simplex method, in which variable bounds had to be added as functional constraints (new rows in the tableau). The original algorithm was extended (a long time ago) to deal with bounds more efficiently. Originally, a variable could only be nonbasic at its lower bound (required to be zero). Now it can be nonbasic at either its lower or upper bound (provided the upper bound is finite).

    The negative reduced cost of lambda(1) means that the objective function would improve if the value of lambda(1) were increased. It's already at its upper bound, though, and cannot increase any further. So that does not stop the algorithm from terminating.

    Any modern text on linear programming should include treatment of the bound adaptation to the simplex method. You can also google "simplex algorithm bounded variable". (I'm not aware of any universally used name, like "modified simplex algorithm", for the adaptation.)
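
    For what it's worth, with bounds the optimality test for a minimization LP becomes: reduced cost >= 0 for a nonbasic variable at its lower bound, reduced cost <= 0 for a nonbasic variable at its upper bound, and (numerically) 0 for a basic variable. Here is a rough sketch of that check with the CPLEX Python API; the file name and the tolerance are assumptions:

        # Rough sketch of the bounded-variable optimality conditions for a
        # minimization LP. "model.lp" and the 1e-6 tolerance are assumptions.
        import cplex

        TOL = 1e-6
        c = cplex.Cplex("model.lp")
        c.solve()

        names = c.variables.get_names()
        values = c.solution.get_values()
        redcosts = c.solution.get_reduced_costs()
        lbs = c.variables.get_lower_bounds()
        ubs = c.variables.get_upper_bounds()

        for name, val, rc, lb, ub in zip(names, values, redcosts, lbs, ubs):
            at_lb = abs(val - lb) <= TOL
            at_ub = abs(val - ub) <= TOL
            if at_lb and rc < -TOL:
                print(f"{name} violates optimality: could improve by increasing")
            elif at_ub and rc > TOL:
                print(f"{name} violates optimality: could improve by decreasing")
            # otherwise the variable satisfies the condition

    In your output this is exactly what you see: lambda(1) is nonbasic at its upper bound with a negative reduced cost, and lambda(3) is nonbasic at its lower bound with a positive one.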


    #CPLEXOptimizers
    #DecisionOptimization


  • 5.  Re: reduced cost of basic optimal variables in a minimization problem

    Posted Tue April 03, 2018 04:49 PM

    Originally posted by: lbr33


    Thank you again for your clear explanation.

     

    Speaking of column generation, is this the justification for even an active column (one whose corresponding variable has a value greater than zero, in fact equal to its upper bound) being found again by a pricing subproblem?

    What I mean is:

    a) Suppose I have a pricing subproblem that finds all columns with negative reduced cost in a given iteration.
    b) Suppose a specific column is found in iteration i.
    c) In iteration i+1, the variable corresponding to that column takes a value equal to its upper bound.
    d) Then, a priori, I thought this column couldn't be found again in iteration i+2. In other words, the column wouldn't have a negative reduced cost in the iteration after reaching its largest value.
    But now I understand that a negative reduced cost does not necessarily mean the solution can be improved (we have to respect the bounds of the variables).

    Does that make sense?
     


    #CPLEXOptimizers
    #DecisionOptimization


  • 6.  Re: reduced cost of basic optimal variables in a minimization problem

    Posted Fri April 06, 2018 05:37 PM

    I don't know that I've ever actually used column generation, but that sounds correct to me.
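
    Roughly, though, the practical implication for a pricing loop is just that a negative reduced cost alone does not guarantee an improving column: a column whose variable is already at its upper bound in the restricted master can price out negative again, so the loop should filter out columns it already has. Below is a small self-contained sketch of that idea; the Column class and the coefficient-pattern key are hypothetical illustrations, not any particular CPLEX API.

        from dataclasses import dataclass

        # Hypothetical sketch (not a CPLEX API): deduplicate columns returned by the
        # pricing subproblem, since a column whose variable already sits at its upper
        # bound in the restricted master can price out with negative reduced cost again.

        @dataclass(frozen=True)
        class Column:
            key: tuple           # e.g. the column's coefficient pattern
            reduced_cost: float

        def filter_new_columns(candidates, known_keys, tol=1e-6):
            """Keep only genuinely new columns with sufficiently negative reduced cost."""
            return [c for c in candidates
                    if c.reduced_cost < -tol and c.key not in known_keys]

        # Example: the pricing step regenerates a column already in the master
        # (its variable is at its upper bound there) plus one truly new column.
        known = {(1, 0, 1)}
        candidates = [Column((1, 0, 1), -4199562.0),   # already in the master: skip
                      Column((0, 1, 1), -7.5)]         # new improving column: keep
        print(filter_new_columns(candidates, known))   # only the (0, 1, 1) column remains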


    #CPLEXOptimizers
    #DecisionOptimization