Decision Optimization


Delivers prescriptive analytics capabilities and decision intelligence to improve decision-making.


Using hard-drive memory with CPLEX

  • 1.  Using hard-drive memory with CPLEX

    Posted Mon May 20, 2013 10:06 AM

    Originally posted by: 5NK7_Thomas_Cibils


    Hi everyone

    I need your help again. I'm still very much a beginner, and now that I've fixed all the coding errors in my model, CPLEX tells me: "Error 1001: Out of memory". I currently have 1,500 decision variables, which, from what I've found on the internet, still seems quite small.

    Maybe it's because CPLEX only uses the computer's RAM? Is there a way to make it use the hard drive? I've heard it can be done in Java, but I've never learned Java. Is it possible to set this through CPLEX's API?

    I currently have 4 GB of RAM, Windows 7 64-bit, a 2.7 GHz Intel Core CPU, and a 150 GB SSD.

    Thanks for your help


    #CPLEXOptimizers
    #DecisionOptimization


  • 2.  Re: Using hard-drive memory with CPLEX

    Posted Mon May 20, 2013 05:46 PM

    A variety of factors go into memory consumption, including number of constraints, number of nonzeros in the constraint matrix, whether the problem is continuous or a MILP, ... If you are using C or C++, it's possible that your code leaks memory.

    Does CPLEX manage to generate the problem and start solving it before you run out of memory? If so, what does the log say about the number of variables, constraints and nonzeros?

    Paul


    #CPLEXOptimizers
    #DecisionOptimization


  • 3.  Re: Using hard-drive memory with CPLEX

    Posted Tue May 21, 2013 04:52 AM

    Originally posted by: 5NK7_Thomas_Cibils


    Hello

    Thanks for your help. I've found a temporary solution. It is possible to create a "parameter file" (right-click on the run configuration, then New, Parameter). In this file, under "Mathematical Programming / General", one can set the available working memory (128 MB by default). Under "MILP / Strategy", I've set OPL to store the nodes compressed on the hard drive instead of in RAM. It helps.

    Now I have 2,573 constraints, 1,569 variables and 6,119 non-zero coefficients. The problem is that the optimization seems to stop at 68%, after 200,000 iterations... The gap for the objective function in the engine log gets down to 3.80%, but then it stops. Do you have any idea how I could fix this?

    Thanks


    #CPLEXOptimizers
    #DecisionOptimization


  • 4.  Re: Using hard-drive memory with CPLEX

    Posted Tue May 21, 2013 03:59 PM

    68 of what?

    When you say "stopped", do you mean that the solver has stopped running, or that it is running but no further progress is being made?

     

    Paul


    #CPLEXOptimizers
    #DecisionOptimization


  • 5.  Re: Using hard-drive memory with CPLEX

    Posted Tue May 21, 2013 05:03 PM

    Originally posted by: 5NK7_Thomas_Cibils


    Hi,

    On the progress bar at the bottom right of the window. See the picture: http://imageshack.us/a/img40/2353/sanstitrefjm.png (here at 66%)

    "Stopped" means that CPLEX stops running and gives me the error "out of memory".

    Thanks for your help


    #CPLEXOptimizers
    #DecisionOptimization


  • 6.  Re: Using hard-drive memory with CPLEX

    Posted Fri May 24, 2013 01:41 PM

    Originally posted by: 5NK7_Thomas_Cibils


    Hi,

    Now I have this problem quite often, with several models. I've found some information:

    http://pic.dhe.ibm.com/infocenter/cosinfoc/v12r3/index.jsp?topic=%2Filog.odms.cplex.help%2FContent%2FOptimization%2FDocumentation%2FOptimization_Studio%2F_pubskel%2Fps_usrmancplex1862.html

    http://pic.dhe.ibm.com/infocenter/cosinfoc/v12r3/index.jsp?topic=%2Filog.odms.cplex.help%2FContent%2FOptimization%2FDocumentation%2FOptimization_Studio%2F_pubskel%2Fps_usrmancplex1708.html

    but it didn't help me to fix my problem.

     

    My model is not that big: 480,000 constraints and 78,000 variables. My computer is reasonably good: 2 cores at 2.7 GHz, 4 GB RAM, Windows 7 64-bit, a 150 GB SSD. I think it should be able to handle this problem, and I don't know why it doesn't.

    As mentioned, I've given 163,840 MB of available memory for working storage, and I've set the node storage file switch to "on disk and compressed". Does anyone have any idea how to give CPLEX access to more memory?

    Now, the engine log runs, and the gap goes down. But suddenly, it stops.

     

    Any idea, someone ?


    #CPLEXOptimizers
    #DecisionOptimization


  • 7.  Re: Using hard-drive memory with CPLEX

    Posted Sun May 26, 2013 08:20 PM

    The first thing I would do is get out of OPL Studio. It's a very nice IDE for designing/prototyping models, but it's not the best place to run production models. (I seem to remember a discussion along these lines, involving someone from IBM, in a session at an INFORMS conference a year or so ago.) The IDE slows execution down (in part because it "instruments" the model so that it can monitor progress) and in part because the IDE soaks up memory and CPU cycles itself.

    You should be able to export the problem to an LP or SAV file from the IDE. Run that in the CPLEX interactive optimizer and see whether you still run out of memory.
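    A minimal interactive-optimizer session along those lines might look roughly like this (a sketch assuming the model was exported as `model.lp`; exact command syntax can vary between CPLEX versions — see `help set` in the interactive optimizer):

    ```
    cplex> read model.lp
    cplex> set workmem 128
    cplex> set mip strategy file 3
    cplex> optimize
    ```

    Running outside the IDE this way also rules out the IDE's own memory and CPU overhead as the culprit.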

    Paul


    #CPLEXOptimizers
    #DecisionOptimization


  • 8.  Re: Using hard-drive memory with CPLEX

    Posted Thu May 30, 2013 10:28 AM

    Originally posted by: 5NK7_Thomas_Cibils


    Hi,

    Thanks for the advice, but I don't know how to use any other interface at all :S Do you really think it could significantly affect the running time or memory use of the program?

    Thanks for your help anyway !


    #CPLEXOptimizers
    #DecisionOptimization


  • 9.  Re: Using hard-drive memory with CPLEX

    Posted Wed May 29, 2013 03:10 AM

    The issue in your case is not the size of the model but the size of the search tree (which is completely independent of the size of the model).

    I am unclear why you give 163840.0Mb (160 GB?) for working storage when you only have 4 GB of RAM. The amount of working storage should be smaller than the amount of RAM. Try using 3GB and set the node file switch to "on disk and compressed". Does this get you any further? If not, what is the error message you get?
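    In the interactive optimizer, those two suggestions would look roughly like this (a sketch; 3 GB = 3072 MB, and node file value 3 means "on disk and compressed" — parameter paths are from the CPLEX 12.x interactive optimizer and may vary by version):

    ```
    cplex> set workmem 3072
    cplex> set mip strategy file 3
    cplex> read model.lp
    cplex> optimize
    ```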


    #CPLEXOptimizers
    #DecisionOptimization


  • 10.  Re: Using hard-drive memory with CPLEX

    Posted Thu May 30, 2013 10:26 AM

    Originally posted by: 5NK7_Thomas_Cibils


    Hi,

     

    I thought this value was hard-drive memory allowed in megabits, so the point was to allow 163,840 Mb = 20,480 MB = 20 GB. I've set it to 24,000 Mb = 3,000 MB = 3 GB, as you suggested. The node file switch is "on disk and compressed".
    By the way: I've read that this parameter is in megabytes and not megabits, is that right?

     

    With 24,000 for the parameter (the gap only gets down to 97%):

    Implied bound cuts applied:  71
    Flow cuts applied:  10639
    Mixed integer rounding cuts applied:  8
    Zero-half cuts applied:  11
    Gomory fractional cuts applied:  7
     
    Root node processing (before b&c):
      Real time             =  197.93
    Parallel b&c, 4 threads:
      Real time             = 1608.75
      Sync time (average)   =  255.96
      Wait time (average)   =  347.74
                              -------
    Total (root+branch&cut) = 1806.68 sec.
    Warning: MIP starts not constructed because of out-of-memory status.

     

    With 3,000 for the parameter (the gap gets down to 54%!):

    Implied bound cuts applied:  71
    Flow cuts applied:  10639
    Mixed integer rounding cuts applied:  8
    Zero-half cuts applied:  11
    Gomory fractional cuts applied:  7
     
    Root node processing (before b&c):
      Real time             =  182.04
    Parallel b&c, 4 threads:
      Real time             = 1683.60
      Sync time (average)   =  281.06
      Wait time (average)   =  418.32
                              -------
    Total (root+branch&cut) = 1865.63 sec.
    Warning: MIP starts not constructed because of out-of-memory status.
     

     

    Do you have any idea how I could reduce the size of the tree, given that it's independent of the size of the model?


    #CPLEXOptimizers
    #DecisionOptimization


  • 11.  Re: Using hard-drive memory with CPLEX

    Posted Thu May 30, 2013 10:32 AM

    Originally posted by: 5NK7_Thomas_Cibils


    I've found more information here : http://www-01.ibm.com/support/docview.wss?uid=swg21400023

    Now I'm trying to find out how to change the value of the "probing" setting (all my decision variables are integers). Do you think that's a good idea?
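    For reference, probing is controlled by CPX_PARAM_PROBE, which takes values from -1 (off) through 0 (automatic) up to 3 (very aggressive). In the interactive optimizer it would be set roughly like this (a sketch; syntax may vary by version):

    ```
    cplex> set mip strategy probe 3
    ```

    More aggressive probing spends extra time at the root node in exchange for (possibly) a smaller search tree.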


    #CPLEXOptimizers
    #DecisionOptimization


  • 12.  Re: Using hard-drive memory with CPLEX

    Posted Mon June 03, 2013 03:52 AM

    Originally posted by: TobiasAchterberg


    Let me first explain the "work memory" parameter. This parameter triggers when CPLEX should store nodes in a "node file". What exactly the node file is depends on the "node file" parameter. It could be uncompressed or compressed, and it could be in memory or on disk.

    The "work memory" parameter does not say how much memory CPLEX is allowed to use. It just defines the switching point at which CPLEX starts using a more memory-efficient way to store search tree nodes. This means that it does not make sense to set the work memory to a value larger than or equal to the physical memory installed in your machine. Typically, the default value is fine. The only reason to increase the work memory is to improve run-time performance a little (storing nodes in the "node file" and bringing them back into the regular node data structures involves some overhead).

    You can limit the total size of the search tree with a different parameter: the "tree memory limit" parameter (TreeMemory, CPX_PARAM_TRELIM). If the total size of the tree (regular nodes plus nodes stored in the node file) exceeds this limit, CPLEX will stop. For example, if you are storing the node file on disk and want to keep it from exceeding 20 GB, set the TreeMemory parameter to 20480.
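    Following that example, the tree limit would be set like this in the interactive optimizer (a sketch; exact parameter path per the CPLEX 12.x interactive optimizer, and the value is in MB):

    ```
    cplex> set mip limits treememory 20480
    ```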

    The out-of-memory errors that you are seeing come from the fact that CPLEX is trying to allocate memory, and the operating system is not able to provide this memory. For such a relatively small model, this is rather odd, because most of the consumed memory will be search tree nodes, and those can be written to disk (if you set the "node file" parameter to 2 or 3). Of course, as explained above, you should set the "work memory" parameter back to a much smaller value (like the default 128 MB) so that CPLEX moves nodes to the node file much earlier.

    But as far as I understand, your initial try was with default settings, and that ran out of memory as well. So I am a bit puzzled about what is going on. The only strange thing in your log output above is the excessive number of flow cover cuts applied. Do you also run out of memory when you disable flow cover cuts, or when you disable all cutting planes?
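    To test that suggestion, each cut family has its own parameter, and a value of -1 disables that family. A sketch in the interactive optimizer (syntax may vary by version; the other families seen in the log — Gomory, MIR, zero-half, implied bound — have analogous parameters under `set mip cuts`):

    ```
    cplex> set mip cuts flowcovers -1
    ```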


    #CPLEXOptimizers
    #DecisionOptimization


  • 13.  Re: Using hard-drive memory with CPLEX

    Posted Mon June 03, 2013 11:17 AM

    Originally posted by: 5NK7_Thomas_Cibils


    Thanks for your explanations and your help! I've set the memory available for working storage back to the default 128 MB, and set the TreeMemory parameter to 20480. The node file is on disk and compressed. But it still runs for about 1h10, and then I get "Error 1013: Bad parameter number to CPLEX parameter routine". Here is the end of the engine log:

     

    ------------------------------

     48000 47308  1697655.9907  1212  3613725.0000  1658101.8299  1307499   54.12%
    Elapsed real time = 4180.14 sec. (tree size = 1855.09 MB, solutions = 21)
    Nodefile size = 1725.89 MB (703.22 MB after compression)
      48100 47406  1697656.7012  1177  3613725.0000  1658101.8299  1311773   54.12%
      48200 47498  1697657.2879  1140  3613725.0000  1658101.8299  1320226   54.12%
      48300 47590  1690375.0869  7869  3613725.0000  1658101.8299  1324679   54.12%
      48400 47664  1697659.1664  1064  3613725.0000  1658101.8299  1333187   54.12%
      48500 47744  1697660.0519  1026  3613725.0000  1658101.8299  1337529   54.12%
      48600 47832  2852147.3044   604  3613725.0000  1658101.8299  1345988   54.12%
     
    There may be further error information in the clone logs.
     
    Implied bound cuts applied:  71
    Flow cuts applied:  10639
    Mixed integer rounding cuts applied:  8
    Zero-half cuts applied:  11
    Gomory fractional cuts applied:  7
     
    Root node processing (before b&c):
      Real time             =  190.35
    Parallel b&c, 4 threads:
      Real time             = 4043.89
      Sync time (average)   =  975.28
      Wait time (average)   = 1407.81
                              -------
    Total (root+branch&cut) = 4234.24 sec.
    Warning: MIP starts not constructed because of out-of-memory status.
     --------------

     

    These parameters didn't change anything... I've found another parameter (cplex.memoryemphasis, CPX_PARAM_MEMORYEMPHASIS). The help file says: "[This parameter] Directs CPLEX® that it should conserve memory where possible. When you set this parameter to its nondefault value, CPLEX® will choose tactics, such as data compression or disk storage, for some of the data computed by the simplex, barrier, and MIP optimizers."

    I'll try setting it to "true"; maybe it will change something... Right after that, I'll try disabling flow cover cuts and/or all cutting planes.
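    As a sketch, memory emphasis is a yes/no switch in the interactive optimizer and would be turned on roughly like this (syntax may vary by version):

    ```
    cplex> set emphasis memory y
    ```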

     


    #CPLEXOptimizers
    #DecisionOptimization