Planning Analytics

  • 1.  Writing the file to a SQL table

    Posted Wed September 25, 2024 09:48 AM
    Hi,
     
    I have a file in the model_upload folder in IBM Planning Analytics on Cloud.
    I want to write the data from this file to a SQL table. How can I write the file to a SQL table in a TI process? When I run the code, it cannot find the file.
    Thank you.
    My TI code is:



    ------------------------------
    Catalyst Test
    ------------------------------


  • 2.  RE: Writing the file to a SQL table

    Posted Thu September 26, 2024 03:28 AM

    Hi,

    for a bulk insert, the SQL Server instance (and the account it runs under) needs direct access to the folder and file. I assume that's not the case here, and it's probably not possible at all with PAoC.

    I seem to remember that in a recent IBM feature presentation, access via SFTP, SSH, or the like was announced. That might help you move the file to a folder that the SQL Server can access.

    Another solution would be to read the file with a TI process and use INSERT statements to load the content into the SQL Server, though this is less performant than a bulk insert.

    Kind regards

    Matthias



    ------------------------------
    Matthias Holthus
    ------------------------------



  • 3.  RE: Writing the file to a SQL table

    Posted Thu September 26, 2024 07:35 AM
    Edited by STUART KING Thu September 26, 2024 07:49 AM

    Hi Catalyst,

    You will need to set the cma file as the data source for the TurboIntegrator process.  The values in each column can be read into the process variables, and the variables can be referenced in the INSERT statement in your vSQL variable.

    There is no function in TurboIntegrator processes (at least not in TM1 v11) that can read a data source file in the middle of a process.

    Edit: You might not want to do one INSERT statement per line of the data source file.  You can create a single INSERT statement that inserts multiple records at a time.  This is done by building up a longer INSERT statement via string concatenation over multiple iterations of the Data section of the TI process.



    ------------------------------
    Stuart King
    Product Manager
    IBM Planning Analytics
    ------------------------------



  • 4.  RE: Writing the file to a SQL table

    Posted Fri September 27, 2024 02:09 PM

    Hi Stuart,

    could you give a short example of the iteration process? That would be very helpful.



    ------------------------------
    Norbert Jungk
    ------------------------------



  • 5.  RE: Writing the file to a SQL table

    Posted Fri September 27, 2024 04:51 PM

    The CSV file is the data source (but you could do the same with a cube view as the data source).  In this example the data source variables are v1, v2, v3.

    In the Prolog section, the TI code would look like this:

    vSQL = 'INSERT INTO dbo.Table1 ( col1, col2, col3 ) VALUES';
    rowCount = 1;

    In the Data section of the process, you would have something like this:

    vSQL = vSQL | ' ( ''' | v1 | ''', ''' | v2 | ''', ''' | v3 | ''' )';
    IF ( MOD ( rowCount, 25 ) = 0 );
       ODBCOpen ( 'Sample', 'sa', '***********' );
       #ASCIIOutput ( '..\sqldebug.txt', vSQL );
       ODBCOutput ( 'Sample', vSQL );
       ODBCClose ( 'Sample' );
       vSQL = 'INSERT INTO dbo.Table1 ( col1, col2, col3 ) VALUES';
    ELSE;
       vSQL = vSQL | ', ';
    ENDIF;
    rowCount = rowCount + 1;

    This will do a SQL INSERT every 25 lines of the data source (MOD is the modulo function in TM1).  You might want to experiment with different batch sizes to determine the best performance.

    Don't forget to do one more ODBCOutput call in the Epilog section of the process with the remainder of the lines from the data source.  You will probably need to remove the trailing comma.
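    That Epilog flush could be sketched like this (a sketch only, reusing the variable names, DSN, and credentials from the example above; note that after the last data row, vSQL either ends with ', ' or, if the row count was an exact multiple of 25, holds only the bare INSERT prefix with nothing left to output):

    # Epilog section: flush whatever is still accumulated in vSQL.
    nLen = LONG ( vSQL );
    IF ( nLen > 2 & SUBST ( vSQL, nLen - 1, 2 ) @= ', ' );
       # Strip the trailing ', ' that was appended after the last tuple
       vSQL = SUBST ( vSQL, 1, nLen - 2 );
       ODBCOpen ( 'Sample', 'sa', '***********' );
       ODBCOutput ( 'Sample', vSQL );
       ODBCClose ( 'Sample' );
    ENDIF;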



    ------------------------------
    Stuart King
    Product Manager
    IBM Planning Analytics
    ------------------------------