Last week Stuart King confirmed that JSON as a datasource is now available in v12 / Planning Analytics Engine:
https://community.ibm.com/community/user/businessanalytics/discussion/json-as-datasource-in-a-ti-in-v12
JSON as a data source is already supported in the TM1 12.4 release. In the upcoming 2.0.99 release we are adding a feature to the PAW modeling workbench to support the JSON data source type. It's possible to test JSON as a data source before the PAW 2.0.99 release. Try the following code in the prolog section of a TI process.
DataSourceType = 'JSON';
DataSourceJsonRootPointer = '/value';
DataSourceJsonVariableMapping = JsonAdd( '{}', 'vProp', StringToJson( '/Details/Prop' ));
DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vMsg', StringToJson( '/Message' ));
DatasourceNameForServer = 'data.json';
This code maps two variables in the process (vProp and vMsg) to the corresponding JSON properties. The actual data source is still a file (containing only JSON content) sitting in the file content area of the v12 database.
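For reference, after those two JsonAdd calls I would expect the mapping variable to hold a JSON object along these lines (my own illustration, not output taken from the post):
{"vProp":"/Details/Prop","vMsg":"/Message"}
Each key is the TI variable name and each value is the path to the property it reads from every record.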
To test this, I am going to use the free online resource JSONPlaceholder (https://jsonplaceholder.typicode.com/) as the basis for a new TI process that pulls data from a remote datasource (in this case an unsecured web page).
As noted above, we can use .json files as the datasource; in this instance, though, I want to connect to a 'live' datasource.
This example will use the /users endpoint:
https://jsonplaceholder.typicode.com/users
This contains 10 records in the following JSON format:
{
"id": 1,
"name": "Leanne Graham",
"username": "Bret",
"email": "Sincere@april.biz",
"address": {
"street": "Kulas Light",
"suite": "Apt. 556",
"city": "Gwenborough",
"zipcode": "92998-3874",
"geo": {
"lat": "-37.3159",
"lng": "81.1496"
}
},
"phone": "1-770-736-8031 x56442",
"website": "hildegard.org",
"company": {
"name": "Romaguera-Crona",
"catchPhrase": "Multi-layered client-server neural-net",
"bs": "harness real-time e-markets"
}
}
As a simple test, I want to extract the name and email details to a text file via a TurboIntegrator process.
In the Prolog I 'Get' the data via ExecuteHttpRequest; note that I have used the -o option to write the response to a file:
#Region Get Data
sURL = 'https://jsonplaceholder.typicode.com/users';
ExecuteHttpRequest(
'Get',
sURL,
'-h Content-Type: application/json',
'-h Accept: application/json',
'-o data.json'
);
#EndRegion
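As an optional defensive step (my own sketch, not part of the original process), the HTTP status can be checked before trusting data.json. I believe the v12 engine exposes HttpResponseGetStatusCode() alongside ExecuteHttpRequest, but treat that function name as an assumption to verify against the TI reference:
#Region Check response (assumed helper: HttpResponseGetStatusCode)
# Assumed response-inspection function - verify the exact name in the v12 TI reference
nStatus = HttpResponseGetStatusCode();
# Abort the process if the request did not return 200 OK
IF( nStatus <> 200 );
   ProcessError;
ENDIF;
#EndRegion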
When I run the process, I can see a "data.json" file in my model_uploads folder.
When I preview the file within Workspace, I can see it contains the data requested:
- I found it useful to read up on JSON to understand how the data is structured and what key/value pairs and arrays are.
We can set this file as our datasource and use some new functions to define the variables:
DataSourceJsonRootPointer
This took a little testing, but I found that we can either exclude this line or pass a blank string to return all values. Initially I set it to /value as per the sample, but this returned a single row (indicating that /value was not the root of the array). Because the /users response is already a top-level array, there is no wrapping property to point at, so the pointer can be commented out or passed as an empty string.
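To make the root pointer behaviour concrete, here is a short sketch of the two response shapes (the wrapped example is hypothetical and is not what JSONPlaceholder returns):
#Region Root pointer sketch (illustrative, not from the original process)
# Response body is a bare array: [ { "name": ... }, { "name": ... } ]
DataSourceJsonRootPointer = '';
# Response body wraps the array, e.g. { "value": [ { "name": ... } ] } - point at the wrapping property
# DataSourceJsonRootPointer = '/value';
#EndRegion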
DataSourceJsonVariableMapping
Any variables that are defined here need to be manually added to the Datasource > Variables tab of the TI process:
Each variable needs to be mapped to an element in the array from the JSON data source. From the sample we saw that this is iterative: each new variable is added to DataSourceJsonVariableMapping via the JsonAdd function.
JsonAdd( «JSON», «Reference», «Value» );
DataSourceJsonVariableMapping is a JSON object: to it we add a reference (the variable name) and then the location of that value within each record, written as a pointer-style path.
We repeat the process for each variable we require, and it appears we can point into nested objects (different levels of the structure) per variable; see the nested-property sketch after the code below.
#Region Set Datasource
DataSourceType = 'JSON';
DataSourceJsonRootPointer = '';
DataSourceJsonVariableMapping = JsonAdd( '{}', 'vName', StringToJson( '/name' ));
DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vEmail', StringToJson( '/email' ));
DatasourceNameForServer = 'data.json';
#EndRegion
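Building on the sample /users payload above, nested properties appear to be reachable by extending the path through the parent object. The sketch below is my own illustration (the extra variables would also need to be added to the Variables tab, and I have not tested these exact lines):
#Region Nested property mapping (illustrative sketch)
DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vCity', StringToJson( '/address/city' ));
DataSourceJsonVariableMapping = JsonAdd( DataSourceJsonVariableMapping, 'vCompany', StringToJson( '/company/name' ));
#EndRegion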
This took some trial and error; exporting the DataSourceJsonVariableMapping value confirmed that it could also be crafted manually:
"{"vName":"/name","vEmail":"/email"}"
In the Data tab I exported the variables and found I was getting double quotes around each item. Converting the values with JsonToString gave me the export in the format I wanted.
#Region Convert Values
vName = JsonToString( vName );
vEmail = JsonToString( vEmail );
#EndRegion
#Region Output values to text file
ASCIIOutput( 'JSONDateExport.txt', vName, vEmail );
#EndRegion
And finally, the output:
"Leanne Graham","Sincere@april.biz"
"Ervin Howell","Shanna@melissa.tv"
"Clementine Bauch","Nathan@yesenia.net"
"Patricia Lebsack","Julianne.OConner@kory.org"
"Chelsey Dietrich","Lucio_Hettinger@annie.ca"
"Mrs. Dennis Schulist","Karley_Dach@jasper.info"
"Kurtis Weissnat","Telly.Hoeger@billy.biz"
"Nicholas Runolfsdottir V","Sherwood@rosamond.me"
"Glenna Reichert","Chaim_McDermott@dana.io"
"Clementina DuBuque","Rey.Padberg@karina.biz"
I could then interact with the vName and vEmail variables as I normally would in a TI process.
This is a significant step forward in my eyes: we can bring a lot of third-party scripts back into IBM Planning Analytics. Most REST API endpoints return JSON, which opens up a lot of opportunity.
I have two blog series coming up to showcase the potential of JSON as a datasource:
- To access a large and constantly moving JSON datasource (and hopefully improve my Fantasy Premier League Football team!)
- To push/pull objects between instances (Sync, Hot Promote, Backup), all within IBM Planning Analytics