EGL Development User Group


The EGL Development User Group is dedicated to sharing news, knowledge, and insights regarding the EGL language and Business Developer product. Consisting of IBMers, HCL, and users, this community collaborates to advance the EGL ecosystem.


datagrid big data

  • 1.  datagrid big data

    Posted Sun November 08, 2015 05:51 AM

    Hello,

    I'm trying to find an optimal way to fill a datagrid with many records, some from an SQL database and some from a DB2 database.

    But if the data set is too large I get a Java heap error, or it takes too long to fill the datagrid.

    Is there a solution for this?
     

    Thanks in advance

    michaeldefox


  • 2.  Re: datagrid big data

    Posted Mon November 09, 2015 04:37 AM

    Hi,

    Did you try the paging function of the datagrid?

    And what do you mean by 'big data'?

    Marcel-D


  • 3.  Re: datagrid big data

    Posted Mon November 09, 2015 04:55 AM

    No, I'm now trying the dataLoader function. I have a large amount of data to fill my datagrid with, and I want to divide it.

    michaeldefox


  • 4.  Re: datagrid big data

    Posted Mon November 09, 2015 07:47 AM

    My function looks like this: 

     

    function myDataLoader(startRow int in, endRow int in, sortFieldName string in,
                          sortDirection int in) returns(boolean)
        sortDirection = DatagridLib.SORT_NONE;
        // syslib.writestdout("in dataloader");
        startRow = 1;
        endRow = 50;
        return (true);
    end

     

    I want to divide the data as it appears in the datagrid. I keep the paging, but if the data set is too large it takes too long to appear on screen, even with showButtonBars.

    michaeldefox


  • 5.  Re: datagrid big data

    Posted Tue November 10, 2015 02:19 AM

    Hi Michael,

     

    Whether you're using DataGrid paging or not, the full set of records will be loaded into your DataGrid and will thus be present in your browser's memory.

    The pages in our ERP application (lots and lots of data!) never contain more than 100 records. If the data the user is searching for isn't in the first 100 hits, they should use more specific search arguments.

    This keeps our application performing well.

     

    When you have amounts of data large enough to cause Java heap issues (server side) or long browser wait times, it might be the right moment to consider writing your own paging mechanism.

    Creating the paging mechanism I'm suggesting shouldn't be too hard. 

    I would suggest fetching no more than 100 records at a time and creating your own paging buttons.

    When paging forward you fetch the next 100 records based on the last record in the datagrid (provided the datagrid's records contain the table's primary key).

    When paging backward you fetch the previous 100 records based on the first record in the datagrid.
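    The forward/backward paging Guus describes is keyset pagination: each page is anchored on the primary key of the last (or first) row currently shown, so the database never has to skip over earlier rows. A minimal Java sketch of building such queries; the table and column names (`CUSTOMER`, `CUST_ID`) are placeholders, not from the original posts:

    ```java
    // Sketch of keyset paging: anchor each page on the primary key of the
    // first or last row currently shown in the datagrid.
    class KeysetPager {
        static final int PAGE_SIZE = 100;

        // Query for the page after the given last key (paging forward).
        static String nextPage(String table, String keyCol, long lastKey) {
            return "SELECT * FROM " + table
                 + " WHERE " + keyCol + " > " + lastKey
                 + " ORDER BY " + keyCol
                 + " FETCH FIRST " + PAGE_SIZE + " ROWS ONLY";
        }

        // Query for the page before the given first key (paging backward).
        // The inner SELECT walks backward; the outer SELECT restores ascending order.
        static String previousPage(String table, String keyCol, long firstKey) {
            return "SELECT * FROM (SELECT * FROM " + table
                 + " WHERE " + keyCol + " < " + firstKey
                 + " ORDER BY " + keyCol + " DESC"
                 + " FETCH FIRST " + PAGE_SIZE + " ROWS ONLY) AS p"
                 + " ORDER BY " + keyCol;
        }
    }
    ```

    In a real service the key values would be bound as prepared-statement parameters rather than concatenated into the SQL string.
    
    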

     

    Regards,

    Guus

    gweis


  • 6.  Re: datagrid big data

    Posted Tue November 10, 2015 03:06 AM

    That is how we solved it as well: paging buttons and a backend function which has a minimum of two parameters:

    • startIndex (integer)
    • numberOfRecords (integer)

    We then use a prepare statement and a forEach to scroll through the result set, and only put records in the return array if the counter is in the correct range.

    This keeps the application fast and light in memory.
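    The skip-and-collect loop described above can be sketched in Java over a plain iterator (standing in for the open cursor); the method name and generic element type are illustrative:

    ```java
    import java.util.ArrayList;
    import java.util.Iterator;
    import java.util.List;

    // Sketch: scroll the result set and keep only rows whose 1-based position
    // falls in [startIndex, startIndex + numberOfRecords - 1].
    class PageFilter {
        static <T> List<T> page(Iterator<T> results, int startIndex, int numberOfRecords) {
            List<T> out = new ArrayList<>();
            int i = 0;
            while (results.hasNext()) {
                T row = results.next();
                i++;                       // 1-based position in the result set
                if (i < startIndex) {
                    continue;              // before the requested window: skip
                }
                out.add(row);
                if (out.size() == numberOfRecords) {
                    break;                 // window filled: stop reading rows
                }
            }
            return out;
        }
    }
    ```

    Breaking out of the loop as soon as the window is filled is what keeps memory use bounded: only `numberOfRecords` rows are ever held in the return array.
    
    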

     

    I also found that the datagrid can be slow for some applications because it supports a lot of features. For that reason we made a simple widget, very similar in use to a datagrid but without its advanced features (sorting, editor behavior, formatting, ...). Instead of floating divs we used a simple table. This works fine if you want to display something like 1000 records.

    To sum up: we have 3 methods of displaying an array of records:

    • < 200 records: datagrid
    • 200 to 1000: custom table-style widget
    • 1000+ : custom datagrid with custom backend

    If you are getting Java heap space problems (i.e. memory problems on the service side), only the third option will solve them.

     

    I hope this helps.

     

    Kind regards,

     

    Bram

    Bram_Callewaert


  • 7.  Re: datagrid big data

    Posted Wed November 11, 2015 04:15 AM

    Thank you all!

    I'm trying to limit my data with an SQL prepared statement for DB2, using this statement:

    select *from (select  field1,....fieldn,rownumber() over () as rn) as x  where rn between 0 and 1000

     

    along with that:

     

    try
        open DataTableC for NameRec with preparedStatement;
        // Fill array for data table
        forEach(NameRec)
            i = i + 1;
            syslib.writestdout("i: " + i);
            // add rows to array
            array.insertElement(NameRec, i);
            // syslib.writeStdout("size of array: " + array.getSize());
        end
    end
                

    but I only get ~80 records. Do I have to define the x alias, or is it something else? I don't get all 1000 records with this solution. Thanks

    michaeldefox


  • 8.  Re: datagrid big data

    Posted Wed November 11, 2015 10:32 AM

    What about trying FETCH FIRST 1000 ROWS ONLY instead of testing the rrn?

    Rick

    RickGentner


  • 9.  Re: datagrid big data

    Posted Wed November 11, 2015 12:35 PM

    The number 1000 would act like an offset:
    select * from (select field1, ..., fieldn, rownumber() over () as rn) as x where rn between startrow and endrow

    in order to divide the full set of my records into smaller pieces of 1000 records each.
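    As a side note, the inner SELECT in such a windowed query needs its own FROM clause for the ROWNUMBER() window function to apply to. A small Java sketch of computing the window boundaries per page and building the query (the table name `MYTABLE` is a placeholder):

    ```java
    // Sketch of page windowing with ROWNUMBER(): page n covers rows
    // (n - 1) * pageSize + 1 through n * pageSize.
    class RowWindow {
        static int startRow(int page, int pageSize) { return (page - 1) * pageSize + 1; }
        static int endRow(int page, int pageSize)   { return page * pageSize; }

        // Note the FROM clause inside the subquery: without it, the window
        // function has no rows to number.
        static String windowedQuery(String table, int page, int pageSize) {
            return "SELECT * FROM (SELECT t.*, ROWNUMBER() OVER () AS rn FROM "
                 + table + " t) AS x WHERE rn BETWEEN "
                 + startRow(page, pageSize) + " AND " + endRow(page, pageSize);
        }
    }
    ```

    Without an ORDER BY in the OVER () clause, the row numbering is not guaranteed to be stable between queries, so in practice the window should be ordered by a key column.
    
    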

    michaeldefox


  • 10.  Re: datagrid big data

    Posted Thu November 12, 2015 04:18 AM

    Is it really necessary to limit the selected records within the SQL statement itself?

     

    My performance issues seem to be resolved by using a prepare and an open statement, and then iterating through the resultset with a foreach.

     

    I don't know exactly what happens in the prepare and open statements.

    If you have a properly indexed table, is the next record looked up at the time of the foreach statement in the database, or are all the records required by the select statement already looked up when the open or prepare statement is run?

     

    Kind regards,

     

    Bram

    Bram_Callewaert


  • 11.  Re: datagrid big data

    Posted Thu January 07, 2016 02:18 PM

    The prepare statement creates the execution plan.

    The open statement executes the plan and looks up records. Whether every record is looked up immediately depends on commitment control parameters such as READ_COMMITTED, etc.

    We also faced similar issues and I am brainstorming here.

    1. If there is no cap on the record count

        a. If user closes the browser session, are you left with hanging locks?

        b. Since no data is sent to the browser during the "foreach" loop, how do you even show progress bar?

     

    2. Since browser sessions need to be stateless, do you implement "next" as "next from now" or "next from when I last issued the query"?

     

    Regards,

     

    Cool_Anand


  • 12.  Re: datagrid big data

    Posted Fri January 08, 2016 04:03 AM

    1.a: Our web services are stateless. So the memory consumption only lasts the duration of the service call-response.

    The server side does not keep locks outside of the call-response cycle. Commitment control ends after the service call is completed.

    For us this means record locks are released after the response is sent to the front-end (Rich UI).

     

    1.b: We don't have a progress bar, because it is not necessary if the response is very fast.

     

    2: We issue a get next with an integer specifying the start index within the resultset

     

    Downsides:

    No frontend searching or sorting

    If you want to be able to jump to the last page, you need an extra return value specifying the total number of records returned by the query. You could loop through all the results, but we chose to execute a separate query with a count, because this is much faster than retrieving all the results.
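    The separate count query and the page arithmetic mentioned above can be sketched as follows (Java; the names are illustrative, and the page count uses ceiling division):

    ```java
    // Sketch: a cheap COUNT(*) over the same search predicate avoids fetching
    // every row just to learn how many pages there are.
    class PageCount {
        static String countQuery(String table, String whereClause) {
            return "SELECT COUNT(*) FROM " + table + " WHERE " + whereClause;
        }

        // Number of pages needed for totalRecords rows (ceiling division).
        static int pageCount(int totalRecords, int pageSize) {
            return (totalRecords + pageSize - 1) / pageSize;
        }
    }
    ```

    With the total known, "jump to last page" is just a call with startIndex = (pageCount - 1) * pageSize + 1.
    
    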

    So, the process for searching in our application is:

    The user enters a search

    • The rui application calls a service with the following parameters: search parameters, startindex (1), numberOfRows (20)
    • The web service executes a prepare and an open statement. (At this point I believe nothing has been retrieved in our Java web service yet, so no load on the heap space.)
    • The web service executes a for statement, retrieving record by record with a get next.
    • Only when our for counter is at startindex (1 in this case), we start by copying the result from the get next into our result array.
    • If we are at startIndex + numberOfRows (21 in this case), we exit the for loop and return the results in the result array to the rich ui application. 
    • All locks on the database are released, because the logical unit of work (on the service side) has completed.

    When the user asks for the next page, the same logic is run again, but with startIndex 21 and numberOfRows 20.

    The first 20 records are looped through but not added to the result array; records 21 through 40 are added, and at row 41 we exit and return the results.

     

    Kind regards,

     

    Bram

     

    Bram_Callewaert