Planning Analytics


JSON as datasource in a TI in v12

  • 1.  JSON as datasource in a TI in v12

    Posted Thu October 10, 2024 03:52 AM

    Hello, 

    Is there already a fixed date for when it will be possible to use JSON as a datasource?

    According to the AHA overview it should arrive in Q4 2024, so I think it should be possible with PAW_100, which is due in November or at the start of December, am I right?

    Thanks in advance!

    Vitalij



    ------------------------------
    Vitalij Rusakovskij
    ------------------------------


  • 2.  RE: JSON as datasource in a TI in v12

    Posted Fri October 11, 2024 08:07 AM

    Hi Vitalij,

    JSON as a data source is already supported in the TM1 12.4 release.   In the upcoming 2.0.99 release we are adding a feature to the PAW modeling workbench to support the JSON data source type.  It's possible to test JSON as a data source before the PAW 2.0.99 release.  Try the following code in the prolog section of a TI process.

    DataSourceType                = 'JSON';

    DataSourceJsonRootPointer     = '/value';
    DataSourceJsonVariableMapping = JsonAdd( '{}', 'vProp', StringToJson( '/Details/Prop' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vMsg', StringToJson( '/Message' ));
    DatasourceNameForServer       = 'data.json';

    This code maps two variables in the process (vProp and vMsg) to the JSON properties. The actual data source is still a file (containing only JSON content) sitting in the file content area of the v12 database.
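
    For illustration, data.json could then look something like this (any object with a top-level value array whose entries carry the mapped properties would work):

    {
        "value":
        [
            { "Details": { "Prop": "First"  }, "Message": "First message"  },
            { "Details": { "Prop": "Second" }, "Message": "Second message" }
        ]
    }

    Each entry in the value array becomes one record: vProp receives the JSON value at /Details/Prop and vMsg the value at /Message.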

    Here is a sneak preview of what you will see in the TI editor in PAW 2.0.99:



    ------------------------------
    Stuart King
    Product Manager
    IBM Planning Analytics
    ------------------------------



  • 3.  RE: JSON as datasource in a TI in v12

    Posted Fri October 11, 2024 11:31 AM

    Thanks Stuart for your reply and also for the example.

    We will test it and are looking forward to the new PAW version!

    It will be great!



    ------------------------------
    Vitalij Rusakovskij
    ------------------------------



  • 4.  RE: JSON as datasource in a TI in v12

    Posted Tue October 15, 2024 11:04 AM
    Edited by Wim Gielis Tue October 15, 2024 11:06 AM

    Hello Stuart,

    I got this working, thanks also to Edward Stuart and his blog post.

    What I could not do is extract the Name and the /Titles/Selected information. Below is the JSON specification.

    What should the DataSourceJsonVariableMapping be, please? Key-value pairs seem easy; an array seems more difficult.

    {
        "@type":"NativeView",
        "Name":"Analysis_0f309cd0-4215-4482-8471-80ef000fcbb3_[}tp_tasks}0f309cd0-4215-4482-8471-80ef000fcbb3].[101]",
        "Columns":
        [
            {
                "Subset":
                {
                    "@id":"Dimensions('Month')/Hierarchies('Month')/Subsets('QY')"
                }
            }
        ],
        "Rows":
        [
            {
                "Subset":
                {
                    "@id":"Dimensions('Account')/Hierarchies('Account')/Subsets('Default')"
                }
            }
        ],
        "Titles":
        [
            {
                "Subset":
                {
                    "@id":"Dimensions('Year')/Hierarchies('Year')/Subsets('Default')"
                },
                "Selected":
                {
                    "@id":"Dimensions('Year')/Hierarchies('Year')/Elements('Y2')"
                }
            },
            {
                "Subset":
                {
                    "@id":"Dimensions('organization')/Hierarchies('organization')/Subsets('Workflow')"
                },
                "Selected":
                {
                    "@id":"Dimensions('organization')/Hierarchies('organization')/Elements('Total Company')"
                }
            }
        ],
        "SuppressEmptyColumns":false,
        "SuppressEmptyRows":false,
        "FormatString":"",
        "ViewExtractSkipConsolidatedValues": false
    }



    ------------------------------
    Wim Gielis
    ------------------------------



  • 5.  RE: JSON as datasource in a TI in v12

    Posted Tue October 15, 2024 11:54 AM

    I am fairly sure I got this working by using /value as the RootPointer, but I'm not clear on why I needed /value when the root array in the JSON was not named.

    DataSourceJsonRootPointer     = '/value';
    DataSourceJsonVariableMapping = JsonAdd( '{}', 'vName', StringToJson( '/Name' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vSelected', StringToJson( '/Titles/Selected' ));



    ------------------------------
    Edward Stuart
    Solutions Director
    Spitfire Analytics Limited
    Manchester
    ------------------------------



  • 6.  RE: JSON as datasource in a TI in v12

    Posted Tue October 15, 2024 12:04 PM

    With that last code, for me, the Titles/Selected part comes back empty in the variable.



    ------------------------------
    Wim Gielis
    ------------------------------



  • 7.  RE: JSON as datasource in a TI in v12

    Posted Wed October 16, 2024 05:07 AM

    Hoi Wim,

    First up, you can parse any JSON document/object with JSON functions. Using JSON as a data source allows you to have TM1 break a collection of JSON 'documents' (read: objects/arrays/values or even a mix thereof), either presented as an array in a JSON document or as a form of concatenated JSON, into individual documents which you can then further process.

    I would never have thought about processing the JSON specification for a TM1 view in a TI process, let alone as a data source, but we can work with that ;-)

    So first thing you need to understand is that the DataSourceJsonRootPointer represents a JSON Pointer identifying the root of the source. All samples I see in this thread use `/value`, likely because I used that in the samples I created when I implemented these JSON functions and support for the JSON Data Source. This works if, like in my examples, the response is a collection returned by an OData service where the collection is wrapped by an object which has a property named `value`. In your example, this makes no sense as there is no property named `value`.

    So, the first question you need to ask yourself is what you consider the collection to be. Given that you appear to want to do something with both the Name and the Titles, the collection must contain documents that have both. In other words, your source in this case is a collection of one document/record, the one and only document in this JSON file (I told you I would never have expected you to try this, but you can ;-). In this case you could set up the data source as:

    DataSourceType                = 'JSON';
    DataSourceJsonRootPointer     = '/';
    DataSourceJsonVariableMapping = JsonAdd( '{}', 'vName', StringToJson( '/Name' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vTitles', StringToJson( '/Titles' ));

    In your metadata or data code, which will be visited only once as there is only one document/record in this source, your vName variable would contain the JSON representation of the Name of the View and the vTitles variable would contain the JSON for the Titles array. You can then, using the new JSON functions, iterate that array and grab the `/Selected/@id` from each title individually and do what you'd want to do with those.

    ALTERNATIVELY, if we presume for a second that this is some random JSON document and the only thing you are interested in is that collection of Titles, then you could set up the data source so that each Title becomes a record, with a vElementId variable that already holds the Id of the selected element, as in:

    DataSourceType                = 'JSON';
    DataSourceJsonRootPointer     = '/Titles';
    DataSourceJsonVariableMapping = JsonAdd( '{}', 'vElementId', StringToJson( '/Selected/@id' ));

    Now in your metadata and data code, which would be called twice in this example, once for each of the Titles, you'd have the vElementId variable set to the selected element of that Title.

    The thing to keep in mind is that a data source is a collection of JSON documents (the records), which might or might not have the same 'shape', and all the variables you declare in the variable mapping end up containing the respective JSON values if defined in such a document/record (or an empty string if not defined).
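
    As a minimal sketch of that last point, building on the second set-up above: a Title that happened to have no Selected property would simply leave vElementId as an empty string, which you could test for in your data code:

    IF( vElementId @= '' );
        # this Title has no Selected member in the source document; skip it (or apply a default)
        ItemSkip;
    ENDIF;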

    Hope that helps,



    ------------------------------
    Hubert Heijkers
    STSM, Program Director TM1 Functional Database Technology and OData Evangelist
    ------------------------------



  • 8.  RE: JSON as datasource in a TI in v12

    Posted Wed October 16, 2024 05:57 PM

    Hello Hubert,

    Thanks! It is starting to make more sense now :-)

    As to why the view JSON, that was the first JSON file I had at hand, that is why I used that for testing.

    Your solution 1, I can get it to work, thanks!

    #Section Prolog
    DataSourceType                = 'JSON';
    DatasourceNameForServer       = 'data_wim.json';
    DataSourceJsonRootPointer     = '';
    DataSourceJsonVariableMapping = JsonAdd( '{}', 'vName', StringToJson( '/Name' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vTitles', StringToJson( '/Titles' ));
    #Section Metadata
    #Section Data
    ASCIIOutput( 'test_read.txt', vName, vTitles );
     
    s = JsonType( vTitles );
    ASCIIOutput( 'test_read.txt', s );
     
    s = NumberToString( JsonSize(  vTitles ));
    ASCIIOutput( 'test_read.txt', s );
     
    m = 0;
    while( m < JsonSize( vTitles ));
        vSubset = JsonGet( vTitles, m );
        vSubset_ID = JsonGet( vSubset, '/Subset/@id' );
        vSelected = JsonGet( vSubset, '/Selected/@id' );
        ASCIIOutput( 'test_read.txt', NumberToString( m ), vSubset_ID, vSelected );
     
        m = m + 1;
    end;
    #Section Epilog

    With respect to your solution 2, I can get it working, but only partially. I get the 'Selected' records:

    #Section Prolog
    DataSourceType                = 'JSON';
    DatasourceNameForServer       = 'data_wim.json';
    DataSourceJsonRootPointer     = '/Titles';
    DataSourceJsonVariableMapping = JsonAdd( '{}', 'vElementID', StringToJson( '/Selected' ));
    #Section Metadata
     
    #Section Data
    ASCIIOutput( 'test_read.txt', vElementID );
    #Section Epilog

    But asking for /Selected/@id in the variable returns empty.

    I can probably do a JsonGet() on vElementID to get the @id, but I think we should be able to retrieve it directly as in your sample.

    Any idea how to get solution 2 working in a direct way, without JsonGet?

    Thanks!


    ------------------------------
    Wim Gielis
    ------------------------------



  • 9.  RE: JSON as datasource in a TI in v12

    Posted 6 hours ago

    Hi,

    Thank you for this, it was very helpful. I have been able to create the below and it runs well. One issue I can see is if there are limits in the API; for example, we can only retrieve 100 records at a time and need to loop a number of times to retrieve all the records. Is there any way to merge 2 JSON files?

    #Section Prolog
    cClientId     = < ClientId >;
    cClientSecret = < ClientSecret >;
    cRefreshToken = < RefreshToken >;
    cPortalID     = < PortalID >;


    # Endpoint to refresh token
    sTokenEndpoint = 'https://accounts.zoho.com.au/oauth/v2/token?refresh_token=' | cRefreshToken | '&client_id=' | cClientId | '&client_secret=' | cClientSecret | '&grant_type=refresh_token';

    ExecuteHttpRequest(
        'POST',
        sTokenEndpoint,
        '-o dataToken.json'
    );

    responseCode = HttpResponseGetStatusCode(); 
    vResponseBody  = HttpResponseGetBody(); 
    vResponseBodyLength = LONG ( vResponseBody ); 
    vAccessTokenStr = '"access_token":"'; 
    vAccessTokenStart = SCAN( vAccessTokenStr, vResponseBody ); 
    vAccessTokenLength = LONG( vAccessTokenStr ); 
    vResponseBodyAccessTokenStrToEnd= SUBST( vResponseBody, vAccessTokenStart + vAccessTokenLength, vResponseBodyLength - vAccessTokenStart); 
    vAccessTokenEnd = SCAN( '"', vResponseBodyAccessTokenStrToEnd ) - 1; 
    vBearerToken = SUBST( vResponseBodyAccessTokenStrToEnd, 1, vAccessTokenEnd ); 
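
    # (Side note, as a sketch only: since the response body is itself JSON, the new JSON
    # functions could likely replace the SCAN/SUBST string handling above, for example:
    #   vBearerToken = JsonToString( JsonGet( vResponseBody, '/access_token' ) );
    # assuming JsonGet accepts any string containing JSON, as JsonAdd( '{}', ... ) does.)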

    sProjectsEndpoint = 'https://projectsapi.zoho.com.au/restapi/portal/' | cPortalID | '/projects/?&index=1&range=99';

    ExecuteHttpRequest(
        'GET',
        sProjectsEndpoint,
        '-h Authorization: Bearer ' | vBearerToken,
        '-h Content-Type: application/json',
        '-h Accept: application/json',
        '-o dataProject.json'
    );
    #Region Set Datasource
    DataSourceType = 'JSON';

    DataSourceJsonRootPointer     = '/projects';
    DataSourceJsonVariableMapping = JsonAdd( '{}', 'vId', StringToJson( '/id' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vName', StringToJson( '/name' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vStatus', StringToJson( '/status' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vbudget_value', StringToJson( '/budget_value' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vRate', StringToJson( '/rate' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vKey', StringToJson( '/key' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vCustomStatusName', StringToJson( '/custom_status_name' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vOwnerName', StringToJson( '/owner_name' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vStatus_Report', StringToJson( '/custom_fields/0/Status_Report' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vStatusDetails', StringToJson( '/custom_fields/1/Status_Details' ));
    DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vBillingType', StringToJson( '/billing_type' ));

    DatasourceNameForServer       = 'dataProject.json';
    #EndRegion

    #Section Data
    #Region Convert Values
    vId = JsonToString( vId );
    vName = JsonToString( vName );
    vStatus = JsonToString( vStatus );
    vbudget_value = JsonToString( vbudget_value );
    vRate = JsonToString( vRate );
    vKey = JsonToString( vKey );
    vCustomStatusName = JsonToString( vCustomStatusName );
    vOwnerName = JsonToString( vOwnerName );
    vStatus_Report = JsonToString( vStatus_Report );
    vStatusDetails = JsonToString( vStatusDetails );
    vBillingType = JsonToString( vBillingType );
    #EndRegion

    ASCIIOutput( 'test_read.txt', vId, vName, vStatus, vbudget_value, vRate, vKey, vCustomStatusName, vOwnerName, vStatus_Report, vStatusDetails, vBillingType );



    ------------------------------
    Campbell Claridge
    ------------------------------



  • 10.  RE: JSON as datasource in a TI in v12

    Posted 5 hours ago

    What happens when you change the sProjectsEndpoint index and range values?

    sProjectsEndpoint = 'https://projectsapi.zoho.com.au/restapi/portal/' | cPortalID | '/projects/?&index=1&range=99';



    ------------------------------
    Edward Stuart
    Solutions Director
    Spitfire Analytics Limited
    Manchester
    ------------------------------



  • 11.  RE: JSON as datasource in a TI in v12

    Posted 19 minutes ago

    Hi Campbell,

    Technically you definitely could merge 2 or more JSONs, but realistically, at least at this point in time, you should only do so for relatively small JSONs, not heaps of JSON.

    That said, I think you are on the right path already: you could create a TI process that iterates over the chunks of records you'd be retrieving, then, using a separate TI like yours above, pass in the index (and perhaps offset) variables to build your URL, and have every iteration of this process write to the same text file using append, building one file with all the data that you'd load in yet another TI once you have received all the chunks.
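
    Purely as a sketch of what that outer loop could look like (the process name and parameters are made up here; the inner process would be a parameterised version of your TI above that appends its records to one shared output file):

    nIndex     = 1;
    nRange     = 100;
    nMaxChunks = 20;
    nChunk     = 1;
    WHILE( nChunk <= nMaxChunks );
        # 'Load Projects Chunk' is a hypothetical TI that builds the URL from pIndex/pRange,
        # calls ExecuteHttpRequest and appends the returned records to one shared text file
        nResult = ExecuteProcess( 'Load Projects Chunk', 'pIndex', nIndex, 'pRange', nRange );
        IF( nResult <> ProcessExitNormal() );
            # stop early on an error; a real implementation would also stop once a chunk comes back short
            nChunk = nMaxChunks;
        ENDIF;
        nIndex = nIndex + nRange;
        nChunk = nChunk + 1;
    END;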

    If you want to build in some extra resilience you could even consider using TM1's REST API again and using the Multi-Part Upload support, to be able to recover from issues while loading separate chunks.

    In the example above you are creating a CSV file, and that's what I'd do too, as it's more efficient than JSON at the end of the day anyway. But if you wanted to prove that JSON works as well, you could create a concatenated JSON file instead; you could even map the whole object from the source into one variable and simply write that to the output. Keep in mind that you have to specify empty as your separator before writing to the file, as it needs to be valid concatenated JSON or, if you don't format it (which you wouldn't), JSON Lines.
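
    For example, sketching only (vProject is assumed here to be a variable mapped to the whole source record): since ASCIIOutput formats its output using the Datasource ASCII settings, clearing the delimiter and quote character first keeps the JSON untouched:

    DatasourceASCIIDelimiter      = '';
    DatasourceASCIIQuoteCharacter = '';
    ASCIIOutput( 'projects_concatenated.json', vProject );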

    Hope this helps, let me know if there is anything you'd like me to elaborate on.

    Cheers,



    ------------------------------
    Hubert Heijkers
    STSM, Program Director TM1 Functional Database Technology and OData Evangelist
    ------------------------------



  • 12.  RE: JSON as datasource in a TI in v12

    Posted 50 minutes ago

    Hoi Wim,

    Apologies for the late response. The issue with @id is related to me using our so-called reviver from our OData framework to handle the JSON stream parsing, which treats things that start with an @ as control information/annotations, and as such those don't show up. I'll have to fix that, which I was planning to do before letting you know, but haven't done thus far. (Note to self: issue #3735)

    TBC!



    ------------------------------
    Hubert Heijkers
    STSM, Program Director TM1 Functional Database Technology and OData Evangelist
    ------------------------------