Decision Optimization


time optimization

  • 1.  time optimization

    Posted Wed July 01, 2020 04:02 AM
    I have written code for a scheduling problem in OPL (.mod and .dat files), and I use the doopl Python interface to call those two files and perform all the statistical operations.
    The code works fine and finds a solution in around 7 minutes, but after solving it takes another 20-25 minutes to write the data out to CSV files. I create the CSV files and write to them in the .mod file. I have also tried creating those CSV files in doopl Python instead of in the .mod, but generating them still takes a long time.
    How can I reduce that time?
    Please help.
    Thank you.

    ------------------------------
    Subhas Chatterjee
    ------------------------------


  • 2.  RE: time optimization

    Posted Wed July 01, 2020 04:18 AM
    Hi,

    can you check whether you rely on overly large loops in the scripting part?
    It would be more efficient to do the computation in the OPL postprocessing block rather than in large loops in the scripting part.

    regards


    ------------------------------
    ALEX FLEISCHER
    ------------------------------

    Attachment(s)

    PDF
    efficientmodeling.PDF   318 KB


  • 3.  RE: time optimization

    Posted Wed July 01, 2020 04:23 AM
    You will have to show us exactly what you are doing; otherwise we cannot tell how it could be improved.
    In general, it is probably faster to do things in Python than with OPL scripting in the .mod file. But maybe your Python code is still inefficient. As I said, please show the code to be improved.

    ------------------------------
    Daniel Junglas
    ------------------------------



  • 4.  RE: time optimization

    Posted Wed July 01, 2020 08:27 AM
    Edited by Subhas Chatterjee Thu July 02, 2020 02:41 AM


  • 5.  RE: time optimization

    Posted Wed July 01, 2020 09:23 AM
    Did you try to write the CSV file directly? It looks like unnecessary overhead to first create a huge dataframe only to then export it to CSV. Also, have you considered creating the dataframe with the right size from the start? If I read your code correctly, you keep adding new rows to the dataframe, and this repeated resizing may slow down the code. Maybe things get faster if you create the dataframe at its full size initially.
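    To illustrate the resizing point: growing a pandas DataFrame row by row (e.g. with pd.concat inside a loop) reallocates and copies the data repeatedly, while building it once from accumulated Python records needs a single allocation. A minimal sketch, assuming the solution values have already been pulled out of the model as plain tuples (the names `rows`, `columns`, and `solution_to_csv` are illustrative, not part of doopl's API):

    ```python
    import io
    import pandas as pd

    def solution_to_csv(rows, columns):
        """Accumulate plain records first, then build the DataFrame and
        serialize it in one shot, instead of appending row by row."""
        df = pd.DataFrame(rows, columns=columns)  # single allocation
        buf = io.StringIO()
        df.to_csv(buf, index=False)               # single bulk write
        return buf.getvalue()

    # Made-up scheduling output: (task, start, end) tuples.
    csv_text = solution_to_csv([("job1", 0, 5), ("job2", 5, 9)],
                               ["task", "start", "end"])
    ```
    
    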

    ------------------------------
    Daniel Junglas
    ------------------------------



  • 6.  RE: time optimization

    Posted Wed July 01, 2020 09:39 AM
    Edited by Subhas Chatterjee Thu July 02, 2020 02:42 AM


  • 7.  RE: time optimization

    Posted Wed July 01, 2020 09:42 AM
    As I said: it may be better to create the CSV files directly, without taking a detour through a dataframe. I suggest trying this and seeing whether it helps.
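    Skipping the dataframe entirely could look like this: stream the solution tuples straight into Python's csv module in one bulk call. A minimal sketch, with made-up data in place of the real solution values:

    ```python
    import csv
    import io

    # Illustrative solution tuples, e.g. (task, start, end) read back
    # from the model; replace with the real values.
    rows = [("job1", 0, 5), ("job2", 5, 9)]

    buf = io.StringIO()  # in a real run, open("schedule.csv", "w", newline="")
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(["task", "start", "end"])  # header row
    writer.writerows(rows)                     # one bulk call, no DataFrame
    csv_text = buf.getvalue()
    ```
    
    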

    ------------------------------
    Daniel Junglas
    ------------------------------



  • 8.  RE: time optimization

    Posted Wed July 01, 2020 09:58 AM
    Edited by Subhas Chatterjee Thu July 02, 2020 02:42 AM