Decision Optimization


Delivers prescriptive analytics capabilities and decision intelligence to improve decision-making.


Segmentation fault (core dumped) (For large data)


  • 1.  Segmentation fault (core dumped) (For large data)

    Posted Thu December 18, 2014 06:20 AM

    Originally posted by: AMurshed


    Hello,

    I am trying to solve a cost function for Ethernet using CPLEX MIP.

    The program works and gives correct results for small data (10 nodes, 4 jobs, and 4 messages).

    The problem is that when I increase the number of jobs, it crashes with 'Segmentation fault (core dumped)'.

    Is there a way to fix the program, or is it that CPLEX cannot handle large data?

     

    Thanks in advance,

    Ayman Murshed


    #CPLEXOptimizers
    #DecisionOptimization


  • 2.  Re: Segmentation fault (core dumped) (For large data)

    Posted Thu December 18, 2014 04:31 PM

    Hi,

    can you attach a .sav file that you get the seg fault with?

    regards


    #CPLEXOptimizers
    #DecisionOptimization
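For readers wondering how such a file is produced: a model can be written out from a Concert C++ program with IloCplex::exportModel. A minimal sketch follows; it assumes a CPLEX Concert installation and omits the model-building code, and the file names are arbitrary examples.

```cpp
#include "ilcplex/ilocplex.h"
ILOSTLBEGIN

int main() {
   IloEnv env;
   IloModel model(env);
   // ... build variables, constraints, and the objective here ...
   IloCplex cplex(model);           // extracts the model into the solver
   cplex.exportModel("model.sav");  // binary SAV format (exact)
   cplex.exportModel("model.lp");   // human-readable LP format
   env.end();
   return 0;
}
```

Either file can then be attached to a post or loaded into the interactive optimizer with the `read` command.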


  • 3.  Re: Segmentation fault (core dumped) (For large data)

    Posted Fri December 19, 2014 04:46 AM

    Originally posted by: AMurshed


    Thanks Mr. Alex for your reply.

    Attached is the cumulative.lp file for the small network, with the following input:

     

    Jobs, messages, nodes[4,4,7]

    network topology
    [[0,1,1,1,0,0,1],
     [1,0,0,0,0,0,0],
     [1,0,0,0,0,0,0],
     [1,0,0,0,1,1,1],
     [0,0,0,1,0,0,0],
     [0,0,0,1,0,0,0],
     [1,0,0,1,0,0,0]]

    sending jobs

     [0,0,1,2]     

    receiving jobs
    [[0, 1, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1],
     [0, 1, 0, 0]]

    Minimum arrival time
    [50,50,50,50]

    duration of hop-to-hop transmission
    [3,3,3,3]

    execution time for job
    [2,2,2,2]

    allocatability of nodes
    [0,1,1,0,1,1,1]

     

     

    The problem is that when the number of jobs, messages, or nodes is increased, it gives a segmentation fault.

     

    Thanks in advance,

    Best regards,


    #CPLEXOptimizers
    #DecisionOptimization


  • 4.  Re: Segmentation fault (core dumped) (For large data)

    Posted Mon December 22, 2014 02:38 PM

    Originally posted by: EdKlotz


    Ayman,

    I'm not clear on whether you attached the LP file that crashed or the smaller model that ran without problems. Regardless, I just ran your attached model with CPLEX 12.6.1 and default settings, and it finished with no problems. Can you provide the CPLEX version you are using, and also confirm that the attached file is indeed the one on which you encountered the crash? Also, when you encounter the crash, how are you using CPLEX? Are you just using the interactive optimizer (i.e. the cplex executable), or are you running it from your own program? If the latter, which API are you using?

    Ed


    #CPLEXOptimizers
    #DecisionOptimization


  • 5.  Re: Segmentation fault (core dumped) (For large data)

    Posted Thu December 25, 2014 07:47 AM

    Originally posted by: A_Murshed


    Thanks EdKlotz for your reply.

    Actually, the attached .lp file is for the small network, and it works fine with an execution time of 340 seconds.

    The problem occurs when I input a large network, for example:

    Jobs, messages, nodes[9,9,14]

    network topology
    [[0,1,1,1,1,1,1,0,0,0,1,0,0,0],
     [1,0,0,0,0,0,0,0,0,0,0,0,0,0],
     [1,0,0,0,0,0,0,0,0,0,0,0,0,0],
     [1,0,0,0,0,0,0,0,0,0,0,0,0,0],
     [1,0,0,0,0,0,0,0,0,0,0,0,0,0],
     [1,0,0,0,0,0,0,0,0,0,0,0,0,0],
     [1,0,0,0,0,0,0,1,1,1,1,0,0,0],
     [0,0,0,0,0,0,1,0,0,0,0,0,0,0],
     [0,0,0,0,0,0,1,0,0,0,0,0,0,0],
     [0,0,0,0,0,0,1,0,0,0,0,0,0,0],
     [1,0,0,0,0,0,1,0,0,0,0,1,1,1],
     [0,0,0,0,0,0,0,0,0,0,1,0,0,0],
     [0,0,0,0,0,0,0,0,0,0,1,0,0,0],
     [0,0,0,0,0,0,0,0,0,0,1,0,0,0]]

    sending jobs

    [1,2,3,4,5,6,7,8,8]

    receiving jobs
    [[1,0,0,0,0,0,0,0,0],
     [1,0,0,0,0,0,0,0,0],
     [0,1,0,0,0,0,0,0,0],
     [0,0,0,1,0,0,0,0,0],
     [1,0,0,0,0,0,0,0,0],
     [0,0,0,1,0,0,0,0,0],
     [0,0,0,0,0,0,1,0,0],
     [0,0,0,0,0,0,1,0,0],
     [0,0,0,0,0,0,0,1,0]]

    Minimum arrival time
    [80,80,80,80,80,80,80,80,80]

    duration of hop-to-hop transmission
    [2,2,2,2,2,2,2,2,2]

    execution time for job
    [1,1,1,1,1,1,1,1,1]

    allocatability of nodes
    [0,1,1,1,1,1,0,1,1,1,0,1,1,1]

    It crashes before solving the model and doesn't create an .lp file.

     

    I am using CPLEX Enterprise Server 12.6 on my university's (Siegen University) Unix server.

     

    Best regards,

    Ayman Murshed

     


    #CPLEXOptimizers
    #DecisionOptimization


  • 6.  Re: Segmentation fault (core dumped) (For large data)

    Posted Fri December 26, 2014 05:55 AM

    Hi,

    if you use CPLEX Enterprise Server, can you attach the .mod and .dat files you use to get this SIGSEGV?

    regards


    #CPLEXOptimizers
    #DecisionOptimization


  • 7.  Re: Segmentation fault (core dumped) (For large data)

    Posted Fri December 26, 2014 01:09 PM

    Originally posted by: A_Murshed


    Hello,

    I am sorry, I don't know what those files (.mod and .dat) are.

    I wrote the CPLEX program in C++.

    Is it necessary to attach the C++ code? The work in this code is not published yet, and I am reluctant to post it here.

    Is there another way to solve the problem?

     

    Regards,


    #CPLEXOptimizers
    #DecisionOptimization


  • 8.  Re: Segmentation fault (core dumped) (For large data)

    Posted Fri December 26, 2014 01:25 PM

    Originally posted by: T_O


    Are you sure that the segmentation fault comes from CPLEX and not from your C++ code? Did you try to compile with debugging symbols and run the whole thing inside a debugger?

    Best regards,
    Thomas


    #CPLEXOptimizers
    #DecisionOptimization


  • 9.  Re: Segmentation fault (core dumped) (For large data)

    Posted Fri December 26, 2014 07:12 PM

    Originally posted by: A_Murshed


    Thanks Thomas for your reply,

    Yes, the program works very well and produces correct results for the small network topology (I attached an .lp file in my second reply).

    The problem appears only when I increase the size of the network-topology input arrays.

    Regards,


    #CPLEXOptimizers
    #DecisionOptimization


  • 10.  Re: Segmentation fault (core dumped) (For large data)

    Posted Fri December 26, 2014 10:10 PM

    Originally posted by: T_O


    Unfortunately, this does not mean anything. If your memory allocation routine is erroneous, anything might happen. So please run your program through a debugger and check where the segmentation fault comes from. You might also try setting the DataCheck parameter to 1.

    Best regards,
    Thomas


    #CPLEXOptimizers
    #DecisionOptimization
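In the Concert C++ API, the data check suggested above is switched on before solving. A sketch, assuming an existing IloCplex object named `cplex` (parameter name as of CPLEX 12.6):

```cpp
// Turn on CPLEX's input data checking so that inconsistent data
// is reported instead of silently causing undefined behaviour.
cplex.setParam(IloCplex::DataCheck, 1);  // 0 = off (default), 1 = on
cplex.solve();
```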


  • 11.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sat December 27, 2014 05:44 AM

    Originally posted by: A_Murshed


    This is part of the code for Numerical arrays and variables arrays:

    ///////////////////////////////////////////////////////////////

    #include "ilcplex/ilocplex.h"
    ILOSTLBEGIN

    typedef IloArray<IloBoolVarArray>  BoolVarray2dim;
    typedef IloArray<IloNumVarArray>   NumVarray2dim;
    typedef IloArray<IloNumArray>   NumArray2dim;


    int
    main(int argc, char **argv)
    {
       IloEnv env;
       try {

         ///////////Constants//////////

    const char* filename;     

         IloNumArray JMR(env);  //Network Dimension (Jobs, Messages, Routers)
         NumArray2dim CONN(env);               //CONN = Connection    
         IloNumArray SENDER(env);  // Source Job of Messages         
         NumArray2dim DEST(env);               //DEST = Destination         
         IloNumArray MINT(env);  // Minimum Interarrival time of Sporadic Msg., period of periodic msg.
         IloNumArray DUR(env);    // Duration of message transmission (depending on message length).
         IloNumArray EXEC(env);   // Computational time of job
         IloBoolArray ALLOCABILITY(env);  //Allocability array

         filename = "Data/10N.dat";
          ifstream f(filename, ios::in);
          if (!f) {
             cerr << "No such file: " << filename << endl;
             throw(1);
          }

          f >> JMR;
          f >> CONN;
          f >> SENDER;
          f >> DEST;      
          f >> MINT;
          f >> DUR;
          f >> EXEC;
          f >> ALLOCABILITY;



         ////////Decision Variables///////
         IloNumVarArray ALLOC(env,JMR[0], 0,JMR[2],ILOINT);  // Allocation of jobs to cores (one core connected to one router)     
         IloNumVarArray   HOPS(env, JMR[1], 0, JMR[2], ILOINT);               // Number of hops for each message
         
         NumVarray2dim PATH(env,JMR[1]);         // Each row is the path for a message.
         for(IloInt i =0; i<JMR[1];++i)
             PATH[i]= IloNumVarArray(env,JMR[2], 0, JMR[2], ILOINT);
         
        BoolVarray2dim ALLOCMATRIX(env,JMR[0]);         // Allocation of jobs to cores.
         for(IloInt i =0; i<JMR[2];++i)
             ALLOCMATRIX[i]= IloBoolVarArray(env,JMR[2]);

        IloNumVarArray INJECTIME(env,JMR[1], 0,MINT[0],ILOINT);
       
         
         NumVarray2dim ONPATH(env,JMR[1]); //ONPATH = Message meets router     
         for(IloInt i=0;i<JMR[1];++i)
             ONPATH[i]= IloNumVarArray(env,JMR[2], 0,1, ILOINT);
     
         IloModel modl(env);
         IloCplex cplex(env);

    //Objective ////MINIMIZE cost////////
     IloExpr objExp(env,0);
          for(IloInt m=0;m<JMR[1];++m)
             objExp += INJECTIME[m] + HOPS[m]*DUR[m];      
          modl.add(IloMinimize(env,objExp));
          objExp.end();


    #CPLEXOptimizers
    #DecisionOptimization


  • 12.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sat December 27, 2014 06:52 AM

    Originally posted by: T_O


    Please use a debugger (e.g. gdb) to see where the segmentation fault comes from. It is really easy.

    g++ -g [...]
    gdb [executable]
    r [optional parameters of the executable]
    (wait for segmentation fault)
    bt

    Best regards,
    Thomas


    #CPLEXOptimizers
    #DecisionOptimization


  • 13.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sat December 27, 2014 09:58 AM

    Originally posted by: A_Murshed


    This is the output of the suggested debug:

     

    (gdb) r
    Starting program: /home/murshed/cplex_test/test
    [Thread debugging using libthread_db enabled]
    Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".

    Program received signal SIGSEGV, Segmentation fault.
    0x000000000040931e in main ()

    (gdb) bt
    #0  0x000000000040931e in main ()


    #CPLEXOptimizers
    #DecisionOptimization


  • 14.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sat December 27, 2014 11:51 PM

    Originally posted by: T_O


    Please recompile with debugging symbols (-g) and without optimization. Then try again.

    Best regards,
    Thomas


    #CPLEXOptimizers
    #DecisionOptimization


  • 15.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 05:05 AM

    Originally posted by: A_Murshed


    Hello ,

    I used the command:

    make --debug [executable]

    murshed@opnet:~/cplex_test$ gdb test
    GNU gdb (Ubuntu 7.7.1-0ubuntu5~14.04.2) 7.7.1
    Copyright (C) 2014 Free Software Foundation, Inc.
    License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
    This is free software: you are free to change and redistribute it.
    There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
    and "show warranty" for details.
    This GDB was configured as "x86_64-linux-gnu".
    Type "show configuration" for configuration details.
    For bug reporting instructions, please see:
    <http://www.gnu.org/software/gdb/bugs/>.
    Find the GDB manual and other documentation resources online at:
    <http://www.gnu.org/software/gdb/documentation/>.
    For help, type "help".
    Type "apropos word" to search for commands related to "word"...
    Reading symbols from test...(no debugging symbols found)...done.
    (gdb) r
    Starting program: /home/murshed/cplex_test/test
    [Thread debugging using libthread_db enabled]
    Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".

    Program received signal SIGSEGV, Segmentation fault.
    0x000000000040931e in main ()
    (gdb) bt
    #0  0x000000000040931e in main ()
     

     


    #CPLEXOptimizers
    #DecisionOptimization


  • 16.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 05:57 AM

    Originally posted by: T_O


    As far as I know, --debug does not have anything to do with debugging symbols. Can you try something like

    CFLAGS="-g -O0" CXXFLAGS="-g -O0" make

    Or even better: Post your makefile.

    Best regards,
    Thomas


    #CPLEXOptimizers
    #DecisionOptimization


  • 17.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 06:29 AM

    Originally posted by: A_Murshed


    Attached, makefile.


    #CPLEXOptimizers
    #DecisionOptimization


  • 18.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 06:36 AM

    Originally posted by: A_Murshed


    I also tried the parameters you suggested, but got the same results.


    #CPLEXOptimizers
    #DecisionOptimization


  • 19.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 07:08 AM

    Originally posted by: T_O


    Thanks. In your makefile, replace

    CCOPT =  -O -fPIC -fexceptions -DNDEBUG -DIL_STD

    by

    CCOPT =  -g -fPIC -fexceptions -DIL_STD

    and try again.

    Best regards,
    Thomas


    #CPLEXOptimizers
    #DecisionOptimization


  • 20.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 08:19 AM

    Originally posted by: A_Murshed


    Thanks Mr. Thomas for your reply,

    That's the output after modifying the makefile:

     

    murshed@opnet:~/cplex_test$ gdb test
    GNU gdb (Ubuntu 7.7.1-0ubuntu5~14.04.2) 7.7.1
    Copyright (C) 2014 Free Software Foundation, Inc.
    License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
    This is free software: you are free to change and redistribute it.
    There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
    and "show warranty" for details.
    This GDB was configured as "x86_64-linux-gnu".
    Type "show configuration" for configuration details.
    For bug reporting instructions, please see:
    <http://www.gnu.org/software/gdb/bugs/>.
    Find the GDB manual and other documentation resources online at:
    <http://www.gnu.org/software/gdb/documentation/>.
    For help, type "help".
    Type "apropos word" to search for commands related to "word"...
    Reading symbols from test...done.


    (gdb) r
    Starting program: /home/murshed/cplex_test/test
    [Thread debugging using libthread_db enabled]
    Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
    X& IloArray::operator[] (IloInt i) : Out of bounds operation: index superior to size of array
    test: include/ilconcert/iloenv.h:2219: X& IloArray<X>::operator[](IloInt) [with X = IloBoolVarArray; IloInt = long int]: Assertion `(i < _impl->getSize()) || (std:: cerr << "X& IloArray::operator[] (IloInt i) : Out of bounds operation: index superior to size of array" << std:: endl, ilo_stop_assert())' failed.

    Program received signal SIGABRT, Aborted.
    0x00007ffff700cbb9 in __GI_raise (sig=sig@entry=6)
        at ../nptl/sysdeps/unix/sysv/linux/raise.c:56
    56      ../nptl/sysdeps/unix/sysv/linux/raise.c: No such file or directory.
    (gdb) bt
    #0  0x00007ffff700cbb9 in __GI_raise (sig=sig@entry=6)
        at ../nptl/sysdeps/unix/sysv/linux/raise.c:56
    #1  0x00007ffff700ffc8 in __GI_abort () at abort.c:89
    #2  0x00007ffff7005a76 in __assert_fail_base (
        fmt=0x7ffff71572b0 "%s%s%s:%u: %s%sAssertion `%s' failed.\n%n",
        assertion=assertion@entry=0xfbf228 "(i < _impl->getSize()) || (std:: cerr << \"X& IloArray::operator[] (IloInt i) : Out of bounds operation: index superior to size of array\" << std:: endl, ilo_stop_assert())",
        file=file@entry=0xfbe589 "include/ilconcert/iloenv.h",
        line=line@entry=2219,
        function=function@entry=0xfbf920 <IloArray<IloBoolVarArray>::operator[](long)::__PRETTY_FUNCTION__> "X& IloArray<X>::operator[](IloInt) [with X = IloBoolVarArray; IloInt = long int]") at assert.c:92
    #3  0x00007ffff7005b22 in __GI___assert_fail (
        assertion=0xfbf228 "(i < _impl->getSize()) || (std:: cerr << \"X& IloArray::operator[] (IloInt i) : Out of bounds operation: index superior to size of array\" << std:: endl, ilo_stop_assert())",
        file=0xfbe589 "include/ilconcert/iloenv.h", line=2219,
        function=0xfbf920 <IloArray<IloBoolVarArray>::operator[](long)::__PRETTY_FUNCTION__> "X& IloArray<X>::operator[](IloInt) [with X = IloBoolVarArray; IloInt = long int]") at assert.c:101
    #4  0x000000000040f6fe in IloArray<IloBoolVarArray>::operator[] (
        this=0x7fffffffe020, i=8) at include/ilconcert/iloenv.h:2219
     

     

    Regards,

     


    #CPLEXOptimizers
    #DecisionOptimization


  • 21.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 08:23 AM

    Originally posted by: A_Murshed


    By the way, it gives me this output even for small input values, not only for large ones.


    #CPLEXOptimizers
    #DecisionOptimization


  • 22.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 08:40 AM

    Originally posted by: T_O


    This is what I guessed: some array dimensions don't match, and in some runs you just seem to be lucky enough not to run into trouble.

    Is there any more output, or did you copy down all the lines? What happens if you type

    where

    instead of

    bt

    ? We still have not found the root of your problem. Could you also check the dimensions in your input files?

    Best regards,
    Thomas


    #CPLEXOptimizers
    #DecisionOptimization


  • 23.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 09:01 AM

    Originally posted by: A_Murshed


    Thanks Mr. Thomas,

    I think all arrays are well defined:

         IloNumArray JMR(env);  //Network Dimension (Jobs, Messages, Routers)
         NumArray2dim CONN(env);               //CONN = Connection    
         IloNumArray SENDER(env);  // Source Job of Messages         
         NumArray2dim DEST(env);               //DEST = Destination         
         IloNumArray MINT(env);  // Minimum Interarrival time of Sporadic Msg., period of periodic msg.
         IloNumArray DUR(env);    // Duration of message transmission (depending on message length).
         IloNumArray EXEC(env);   // Computational time of job
         IloBoolArray ALLOCABILITY(env);  //Allocability array

         filename = "Data/10N.dat";
          ifstream f(filename, ios::in);
          if (!f) {
             cerr << "No such file: " << filename << endl;
             throw(1);
          }

          f >> JMR;
          f >> CONN;
          f >> SENDER;
          f >> DEST;      
          f >> MINT;
          f >> DUR;
          f >> EXEC;
          f >> ALLOCABILITY;

    and this is the variables:

     IloNumVarArray ALLOC(env,JMR[0], 0,JMR[2],ILOINT);  // Allocation of jobs to cores (one core connected to one router)     
         IloNumVarArray   HOPS(env, JMR[1], 0, JMR[2], ILOINT);               // Number of hops for each message
         
         NumVarray2dim PATH(env,JMR[1]);         // Each row is the path for a message.
         for(IloInt i =0; i<JMR[1];++i)
             PATH[i]= IloNumVarArray(env,JMR[2], 0, JMR[2], ILOINT);
         
        BoolVarray2dim ALLOCMATRIX(env,JMR[0]);         // Allocation of jobs to cores.
         for(IloInt i =0; i<JMR[2];++i)
             ALLOCMATRIX[i]= IloBoolVarArray(env,JMR[2]);

        IloNumVarArray INJECTIME(env,JMR[1], 0,MINT[0],ILOFLOAT);
        IloNumVarArray OBJECTI(env,JMR[1], 0.0,1000.0,ILOFLOAT);
         
         NumVarray2dim ONPATH(env,JMR[1]); //ONPATH = Message meets router     
         for(IloInt i=0;i<JMR[1];++i)
             ONPATH[i]= IloNumVarArray(env,JMR[2], 0,1, ILOINT);
     

    Attached the input file.

     

    Best regards,

     


    #CPLEXOptimizers
    #DecisionOptimization


  • 24.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 09:52 AM

    Originally posted by: A_Murshed


    This is the output after "where" command:

    (gdb) r
    Starting program: /home/murshed/cplex_test/test
    [Thread debugging using libthread_db enabled]
    Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
    X& IloArray::operator[] (IloInt i) : Out of bounds operation: index superior to size of array
    test: include/ilconcert/iloenv.h:2219: X& IloArray<X>::operator[](IloInt) [with X = IloNumVarArray; IloInt = long int]: Assertion `(i < _impl->getSize()) || (std:: cerr << "X& IloArray::operator[] (IloInt i) : Out of bounds operation: index superior to size of array" << std:: endl, ilo_stop_assert())' failed.

    Program received signal SIGABRT, Aborted.
    0x00007ffff700cbb9 in __GI_raise (sig=sig@entry=6)
        at ../nptl/sysdeps/unix/sysv/linux/raise.c:56
    56      ../nptl/sysdeps/unix/sysv/linux/raise.c: No such file or directory.
    (gdb) where
    #0  0x00007ffff700cbb9 in __GI_raise (sig=sig@entry=6)
        at ../nptl/sysdeps/unix/sysv/linux/raise.c:56
    #1  0x00007ffff700ffc8 in __GI_abort () at abort.c:89
    #2  0x00007ffff7005a76 in __assert_fail_base (
        fmt=0x7ffff71572b0 "%s%s%s:%u: %s%sAssertion `%s' failed.\n%n",
        assertion=assertion@entry=0xfbea98 "(i < _impl->getSize()) || (std:: cerr << \"X& IloArray::operator[] (IloInt i) : Out of bounds operation: index superior to size of array\" << std:: endl, ilo_stop_assert())",
        file=file@entry=0xfbde89 "include/ilconcert/iloenv.h",
        line=line@entry=2219,
        function=function@entry=0xfbf1a0 <IloArray<IloNumVarArray>::operator[](long)::__PRETTY_FUNCTION__> "X& IloArray<X>::operator[](IloInt) [with X = IloNumVarArray; IloInt = long int]") at assert.c:92
    #3  0x00007ffff7005b22 in __GI___assert_fail (
        assertion=0xfbea98 "(i < _impl->getSize()) || (std:: cerr << \"X& IloArray::operator[] (IloInt i) : Out of bounds operation: index superior to size of array\" << std:: endl, ilo_stop_assert())",
        file=0xfbde89 "include/ilconcert/iloenv.h", line=2219,
        function=0xfbf1a0 <IloArray<IloNumVarArray>::operator[](long)::__PRETTY_FUNCTION__> "X& IloArray<X>::operator[](IloInt) [with X = IloNumVarArray; IloInt = long int]") at assert.c:101
    #4  0x000000000040f016 in IloArray<IloNumVarArray>::operator[] (
        this=0x7fffffffe020, i=4) at include/ilconcert/iloenv.h:2219


     


    #CPLEXOptimizers
    #DecisionOptimization


  • 25.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun December 28, 2014 10:50 PM

    Originally posted by: T_O


    I have to admit that I am not sure whether your arrays contain what you intend them to contain after these commands (I typically use the C API, not Concert):

          f >> JMR;
          f >> CONN;
          f >> SENDER;
          f >> DEST;      
          f >> MINT;
          f >> DUR;
          f >> EXEC;
          f >> ALLOCABILITY;

    So, could you please look into these arrays and check whether they look as intended and have the correct size?


    #CPLEXOptimizers
    #DecisionOptimization


  • 26.  Re: Segmentation fault (core dumped) (For large data)

    Posted Mon December 29, 2014 07:54 AM

    Originally posted by: A_Murshed


    Yes, I printed out the input arrays and they are the same as they look in the input file.


    #CPLEXOptimizers
    #DecisionOptimization


  • 27.  Re: Segmentation fault (core dumped) (For large data)

    Posted Tue December 30, 2014 07:30 AM

    Originally posted by: T_O


    Hmm, I am still wondering why the backtrace does not end in the main method. Could you try

    thread apply all backtrace


    #CPLEXOptimizers
    #DecisionOptimization


  • 28.  Re: Segmentation fault (core dumped) (For large data)

    Posted Wed December 31, 2014 09:08 AM

    Originally posted by: A_Murshed


    Dear Mr. Thomas,

    I finally managed to find the line of code that caused the segmentation fault.

    In my sixth post, I wrote:

    BoolVarray2dim ALLOCMATRIX(env,JMR[0]);         // Allocation of jobs to cores.
         for(IloInt i =0; i<JMR[2];++i)
             ALLOCMATRIX[i]= IloBoolVarArray(env,JMR[2]);

    I defined the array with JMR[0] entries and then looped over JMR[2] entries.

    Now the program works, but it takes a lot of time. At least the SIGSEGV error has disappeared.

     

    Many thanks for your help, and apologies for my silly mistake.

    Regards,

    Ayman


    #CPLEXOptimizers
    #DecisionOptimization
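The mismatch described above is easy to reproduce outside of Concert. A minimal standard C++ sketch with hypothetical names: the fill loop must use the same bound the container was allocated with, otherwise the subscript runs past the end exactly as ALLOCMATRIX did.

```cpp
#include <cstddef>
#include <vector>

// Allocate a jobs x routers matrix. Looping to `routers` instead of
// `jobs` here would index past the end of `m` whenever routers > jobs,
// which is the bug found in the thread (JMR[0] rows, JMR[2] loop bound).
std::vector<std::vector<int>> makeAllocMatrix(std::size_t jobs,
                                              std::size_t routers) {
    std::vector<std::vector<int>> m(jobs);      // jobs rows
    for (std::size_t i = 0; i < jobs; ++i)      // bound matches the allocation
        m[i].assign(routers, 0);                // routers columns per row
    return m;
}
```

For the 9-job, 14-router instance from earlier in the thread, this yields a 9 x 14 matrix; the original code tried to write 14 rows into a 9-row array.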


  • 29.  Re: Segmentation fault (core dumped) (For large data)

    Posted Wed December 31, 2014 12:11 PM

    Originally posted by: T_O


    Remember to switch back to the original makefile; the optimization flags might speed it up a little again.

    Best regards,
    Thomas


    #CPLEXOptimizers
    #DecisionOptimization


  • 30.  Re: Segmentation fault (core dumped) (For large data)

    Posted Fri January 02, 2015 07:36 PM

    Originally posted by: EdKlotz


    Ayman,

    Since you appear to be on Linux, try running the program, compiled with debug options as Thomas has explained, under Valgrind, a memory checker. This has the advantage of identifying the problem right when it happens, whereas gdb tends to provide information only at the crash point. Sometimes the crash occurs very close to the problematic line of code, but in many other cases it occurs much later. For the latter cases, Valgrind can be much more effective than gdb. Valgrind is free and often comes with Linux distributions; try just typing the 'valgrind' command and see if you get something on your machine. Otherwise, go to http://www.valgrind.org/ for downloads as well as additional docs beyond what is available with the --help command line option.

    Another useful Linux debugging tool is the Undo debugger. It's not free, but it can save a lot of time, as it allows you to move both forward and backward in your gdb session. The URL is http://undo-software.com/.

    Overall, I agree with Thomas that the most likely cause of the trouble here is some inconsistency between the dimensioning of the arrays in your C++ program and either the input data or subsequent lines of code that go past the array bounds. However, if you export LP and SAV files of the problematic model and can reproduce the crash with both files in interactive CPLEX, then you should post those files to this thread.


    #CPLEXOptimizers
    #DecisionOptimization


  • 31.  Re: Segmentation fault (core dumped) (For large data)

    Posted Thu January 08, 2015 03:26 AM

    Originally posted by: AMurshed


    Thanks Mr. Thomas and Mr. Ed.

    The program now keeps searching for a solution, takes too much time to process (more than 10000 s), and then prints 'Killed'.

    I have upgraded CPLEX to 12.6.1, installed on a 4-processor server (Intel(R) Xeon(R) CPU E5-2430 0 @ 2.20GHz).

    Can anyone give me a hint on how to speed up the solution time?


    #CPLEXOptimizers
    #DecisionOptimization


  • 32.  Re: Segmentation fault (core dumped) (For large data)

    Posted Thu January 08, 2015 03:51 AM

    If you get the message 'Killed' for a program on a Linux/Unix machine, this usually means that your program ran out of memory and was killed by the operating system for that reason.

    To learn how to fight memory issues please read this chapter in the user manual.

    When trying to speed up the solution times, have you tried the tuning tool?


    #CPLEXOptimizers
    #DecisionOptimization
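The tuning tool mentioned above can be invoked directly from Concert C++. A hedged sketch, assuming an existing IloCplex object named `cplex` (method and parameter names per the CPLEX 12.6 reference manual; the time limit is an arbitrary example):

```cpp
// Ask CPLEX to search for parameter settings suited to this model.
// tuneParam() runs several trial solves, so cap the time per trial first.
cplex.setParam(IloCplex::TuningTiLim, 600);  // at most 600 s per tuning trial
if (cplex.tuneParam() == 0) {                // 0 indicates tuning completed
    cplex.writeParam("tuned.prm");           // save the suggested settings
}
cplex.solve();
```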


  • 33.  Re: Segmentation fault (core dumped) (For large data)

    Posted Thu January 08, 2015 04:58 AM

    Originally posted by: AMurshed


    Thanks Mr. Daniel,

    Could you please suggest any parameter settings to speed this up?

    I have tried to use :

           cplex.setParam(IloCplex::VarSel, 3);
           cplex.setParam(IloCplex::Probe, 3);
           cplex.setParam(IloCplex::NodeSel, 2) ;
           cplex.setParam(IloCplex::BBInterval, 0) ;
           cplex.setParam(IloCplex::BtTol, 0.0) ;


    But it still keeps running without finding a solution.


    #CPLEXOptimizers
    #DecisionOptimization


  • 34.  Re: Segmentation fault (core dumped) (For large data)

    Posted Thu January 08, 2015 05:14 PM

    Originally posted by: EdKlotz


    Ayman,

    >>>>>>>>

    I have tried to use :

           cplex.setParam(IloCplex::VarSel, 3);
           cplex.setParam(IloCplex::Probe, 3);
           cplex.setParam(IloCplex::NodeSel, 2) ;
           cplex.setParam(IloCplex::BBInterval, 0) ;
           cplex.setParam(IloCplex::BtTol, 0.0) ;

    >>>>>>>>

    What prompted you to try these non-default parameters? Did you try CPLEX with just default parameter settings and find the results unsatisfactory? Specifically, if CPLEX has trouble finding any feasible solutions at all, leave the backtrack tolerance (IloCplex::BtTol) at its default rather than setting it to 0 as you do above. I would also suggest a run with variable selection left at its default rather than set to the computationally expensive setting of 3 (strong branching).

    As Daniel pointed out, the manual has a chapter on dealing with out of memory issues.   Try setting the file parameter to 3 to help in this regard.

    If you still have trouble, please include a CPLEX node log of the slow run in your next post.

     

     

     

     


    #CPLEXOptimizers
    #DecisionOptimization
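The memory-related settings Ed refers to look roughly like this in Concert C++. A sketch, assuming an existing IloCplex object named `cplex`; parameter names are as in CPLEX 12.6, and the working-memory value is an arbitrary example:

```cpp
// Spill the branch-and-bound node file to disk, compressed, once the
// tree no longer fits in the working-memory budget.
cplex.setParam(IloCplex::NodeFileInd, 3);    // 3 = node file on disk, compressed
cplex.setParam(IloCplex::WorkMem, 2048);     // MB of tree memory before spilling
cplex.setParam(IloCplex::MemoryEmphasis, 1); // conserve memory where possible
```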


  • 35.  Re: Segmentation fault (core dumped) (For large data)

    Posted Fri January 09, 2015 04:00 AM

    Originally posted by: AMurshed


    Thanks Mr. Ed for the explanation,

    Now, I tried to change the GAP to 0.01:    cplex.setParam(IloCplex::EpGap, 0.01);

    The best bound is changing slowly.

    Attached an lp file.

     

    Thanks in advance.

     


    #CPLEXOptimizers
    #DecisionOptimization


  • 36.  Re: Segmentation fault (core dumped) (For large data)

    Posted Fri January 09, 2015 04:32 PM

    Originally posted by: EdKlotz


    Ayman,

    I ran CPLEX 12.6.0.1 on my laptop with default settings. While performance wasn't spectacular for a model of such modest size, CPLEX did solve it to optimality in under 14 minutes:

     

      16814   351    infeasible             16.0000       10.0000  1850440   37.50%
      17067   116    infeasible             16.0000       13.0000  1945950   18.75%
      17145    40       13.0000  4018       16.0000       13.0000  1980510   18.75%
    Elapsed time = 750.29 sec. (288770.36 ticks, tree = 0.01 MB, solutions = 4)
      17146    41       13.0000  5498       16.0000       13.0000  1986963   18.75%
      17180    12       13.0000  4104       16.0000       13.0000  2011550   18.75%
      17186    12       13.0000  5140       16.0000       13.0000  2026351   18.75%
      17195     5    infeasible             16.0000       13.0000  2041997   18.75%
      17202     4       14.0000  5812       16.0000       14.0000  2056216   12.50%

    GUB cover cuts applied:  285
    Clique cuts applied:  5307
    Cover cuts applied:  2963
    Implied bound cuts applied:  690
    Flow cuts applied:  273
    Mixed integer rounding cuts applied:  673
    Zero-half cuts applied:  675
    Gomory fractional cuts applied:  48

    Root node processing (before b&c):
      Real time             =   24.80 sec. (9493.49 ticks)
    Parallel b&c, 8 threads:
      Real time             =  795.81 sec. (317984.43 ticks)
      Sync time (average)   =  210.61 sec.
      Wait time (average)   =    0.00 sec.
                              ------------
    Total (root+branch&cut) =  820.61 sec. (327477.93 ticks)

    Solution pool: 4 solutions saved.

    MIP - Integer optimal solution:  Objective = 1.6000000000e+001
    Solution time =  820.61 sec.  Iterations = 2060514  Nodes = 17207
    Deterministic time = 327477.97 ticks  (399.07 ticks/sec)

    CPLEX>

     

    What version of CPLEX are you using, and how does this compare to when you run with default settings?


    #CPLEXOptimizers
    #DecisionOptimization


  • 37.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sat January 10, 2015 06:48 AM

    Originally posted by: AMurshed


    Thanks Ed for your reply,

    I use IBM ILOG CPLEX Optimization Studio V12.6.1 installed on a multiprocessor server (12 processors, Intel(R) Xeon(R) CPU E5-2430 0 @ 2.20GHz).

    The problem is that the best bound changes very slowly, and sometimes it stalls for no apparent reason on large problems.

     

    Can you please try the attached LP file and see if it executes and finishes quickly? I returned all parameters to their default values.

    The problem consists of small network: (4 Jobs, 4 Messages, and 13 nodes)

     

    Thanks in advance,


    #CPLEXOptimizers
    #DecisionOptimization


  • 38.  Re: Segmentation fault (core dumped) (For large data)

    Posted Mon January 26, 2015 03:59 AM

    Originally posted by: AMurshed


    Any suggestions?


    #CPLEXOptimizers
    #DecisionOptimization


  • 39.  Re: Segmentation fault (core dumped) (For large data)

    Posted Sun February 15, 2015 04:30 AM

    Originally posted by: AMurshed


    Hello,

    I am sorry for asking too many questions.

    I just want to know whether the order of the constraints plays a significant role, i.e., leads to different execution times?

     

    Thanks in advance.

     

    Ayman


    #CPLEXOptimizers
    #DecisionOptimization


  • 40.  Re: Segmentation fault (core dumped) (For large data)

    Posted Tue February 24, 2015 02:21 AM

    Yes, the order of constraints (or variables) can play a significant role in the execution time. This has been discussed many times on this Forum and you may find more details by searching the Forum.


    #CPLEXOptimizers
    #DecisionOptimization


  • 41.  Re: Segmentation fault (core dumped) (For large data)

    Posted Wed March 04, 2015 05:07 AM

    Originally posted by: AMurshed


    Thanks Daniel for your answer.

    I've finally managed to find the answer on how to decrease the execution time of my CPLEX model.

    I think CPLEX tries to explore all possible paths while searching for solution(s).

    The main drawback of CPLEX is that one needs to narrow the range of possible variable values so that CPLEX searches for the answer within this restricted area. Otherwise, CPLEX will look at all (possible and impossible) paths and only give you the solution after a couple of thousand seconds.
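The bound-tightening idea described above can be sketched in Concert as follows (the variable and the numeric bounds are hypothetical, purely for illustration):

```cpp
// Suppose a transmission-start time is known from the problem structure
// to lie within a much smaller window than the declared domain.
// Tightening the bounds shrinks the space CPLEX has to explore:
IloNumVar start(env, 0, 10000, ILOINT);  // original, loose bounds
start.setBounds(0, 500);                 // tightened using problem knowledge
```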

     

    I think that, compared to a SAT solver, this is the main drawback of the CPLEX solver.


    #CPLEXOptimizers
    #DecisionOptimization


  • 42.  Re: Segmentation fault (core dumped) (For large data)

    Posted Thu March 12, 2015 06:08 AM

    Originally posted by: AMurshed


    Is there a possibility to adjust the heuristic search in CPLEX?

    I would like to change the way CPLEX uses the heuristic search so that I can get a better execution time.


    #CPLEXOptimizers
    #DecisionOptimization


  • 43.  Re: Segmentation fault (core dumped) (For large data)

    Posted Fri March 20, 2015 03:42 AM

    In general, the frequency with which heuristics are applied is defined by CPX_PARAM_HEURFREQ.

    There are also parameters to explicitly control some heuristics; you can find them by looking through the list of CPLEX parameters.
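For example, a sketch with the Concert parameter names (assuming an existing IloCplex object named cplex; the frequency values are illustrative, not recommendations):

```cpp
// Run the periodic node heuristic at every node (-1 disables it, 0 is automatic):
cplex.setParam(IloCplex::HeurFreq, 1);
// Apply the RINS heuristic every 50 nodes:
cplex.setParam(IloCplex::RINSHeur, 50);
// Control the feasibility-pump heuristic (-1 off, 0 automatic):
cplex.setParam(IloCplex::FPHeur, 0);
```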


    #CPLEXOptimizers
    #DecisionOptimization