When requesting an access token from IBM Cloud, the response tells us the token is valid for 60 minutes. I have built a process that checks whether the token exists and, if it does not, re-triggers the Get Access Token process. However, I also need to ensure that the token is cleared once it has expired.
A simple way to achieve this is to schedule a chore to run in 60 minutes that executes a process to clear the access token.
In v11 and earlier we would have to schedule this chore manually, or use the REST API via PowerShell, Python/TM1py, JavaScript and so on.
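As an illustration of that pre-v12 route, a chore can be switched on or off through the TM1 REST API's tm1.Activate and tm1.Deactivate actions. A minimal Python sketch using only the standard library; the server address and credentials in the usage comment are placeholders, not values from this model:

```python
import base64
import urllib.request

def chore_action_url(base_url: str, chore_name: str, action: str) -> str:
    """Build the REST URL for a chore action ('Activate' or 'Deactivate')."""
    return f"{base_url}/api/v1/Chores('{chore_name}')/tm1.{action}"

def toggle_chore(base_url: str, user: str, password: str,
                 chore_name: str, activate: bool) -> int:
    """POST the activate/deactivate action using HTTP basic auth (v11-style)."""
    url = chore_action_url(base_url, chore_name,
                           "Activate" if activate else "Deactivate")
    credentials = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        url,
        data=b"",  # the action takes an empty POST body
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {credentials}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Usage (placeholder server and credentials -- adjust to your environment):
#   toggle_chore("https://tm1server:8010", "admin", "secret",
#                "Clear.PlanningAnalytics.Access.Token", activate=True)
```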
With ExecuteHttpRequest we can enable and disable chore schedules directly inside TI:
#Region Header
# Get.PlanningAnalytics.Access.Token Process
# Call IBM Cloud API to get Access Token for PA and Schedule Chore to clear token on expiry
#EndRegion
#Region Declare Variables
sCubesysAPI = 'sys.API';
sCubesysAPIToken = 'sys.API.Token';
sPlanningAnalyticsAPI = CellGetS( sCubesysAPI, 'PlanningAnalytics', 'String' );
sPlanningAnalyticsTokenUrl = 'https://iam.cloud.ibm.com/identity/token?apikey=' | sPlanningAnalyticsAPI | '&grant_type=urn:ibm:params:oauth:grant-type:apikey';
#EndRegion
#Region Check if Access_Token already exists
if(CellGetS( sCubesysAPIToken, 'PlanningAnalyticsToken', 'String' ) @= '');
#Region Make Post Request to get Access_Token
ExecuteHttpRequest(
'POST',
sPlanningAnalyticsTokenUrl
);
#EndRegion
#Region Extract Access Token from Response
responseCode = HttpResponseGetStatusCode;
vResponseBody = HttpResponseGetBody;
vResponseBodyLength = LONG( vResponseBody );
vAccessTokenStr = '"access_token":"';
vAccessTokenStart = SCAN( vAccessTokenStr, vResponseBody);
vAccessTokenLength = LONG( vAccessTokenStr);
vResponseBodyAccessTokenStrToEnd = SUBST( vResponseBody, vAccessTokenStart + vAccessTokenLength, vResponseBodyLength - ( vAccessTokenStart + vAccessTokenLength ) + 1 );
vAccessTokenEnd = SCAN( '"', vResponseBodyAccessTokenStrToEnd ) - 1;
vBearerToken = SUBST( vResponseBodyAccessTokenStrToEnd, 1, vAccessTokenEnd );
#EndRegion
#Region Update Token Cube
CellPutS( vBearerToken, sCubesysAPIToken, 'PlanningAnalyticsToken', 'String' );
#EndRegion
#Region Set Clear Access_Token Chore
sTenantId = CellGetS( sCubesysAPI, 'PATenantId', 'String' );
sTerritory = CellGetS( sCubesysAPI, 'PATerritory', 'String' );
sModelName = '<MyModelName>';
sBaseUrl = 'https://' | sTerritory | '.aws.planninganalytics.ibm.com/api/' | sTenantId | '/v0/tm1/' | sModelName | '/api/v1/';
sObject = 'Chores';
sObjectName = 'Clear.PlanningAnalytics.Access.Token';
sSuffix = '(''' | sObjectName | ''')';
sAction = '/tm1.Activate';
sURL = sBaseUrl | sObject | sSuffix | sAction;
ExecuteHttpRequest(
'POST',
sURL,
'-h Content-Type:application/json',
'-h Authorization: Bearer ' | vBearerToken
);
#EndRegion
endif;
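As an aside, the SCAN/SUBST extraction above is plain positional substring work. The same logic can be sketched in Python (outside TI) with an invented sample response body; parsing the JSON directly is the sturdier option when a JSON library is available:

```python
import json

def extract_access_token(body: str) -> str:
    """Positional extraction, mirroring the SCAN/SUBST logic in the TI process."""
    marker = '"access_token":"'
    start = body.find(marker) + len(marker)   # first character of the token value
    end = body.find('"', start)               # closing quote of the token value
    return body[start:end]

# Sample shaped like an IBM Cloud IAM response (values are made up).
sample = '{"access_token":"eyJhbGciOi.example.token","token_type":"Bearer","expires_in":3600}'

assert extract_access_token(sample) == "eyJhbGciOi.example.token"
# Parsing the JSON directly avoids the substring arithmetic entirely:
assert json.loads(sample)["access_token"] == "eyJhbGciOi.example.token"
```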
The Clear.PlanningAnalytics.Access.Token chore runs a single process of the same name:
#Region Header
# Clear.PlanningAnalytics.Access.Token Process
# Deactivates Clear.PlanningAnalytics.Access.Token chore and clears PA Access Token
#EndRegion
#Region Declare Variables
sCubesysAPI = 'sys.API';
sCubesysAPIToken = 'sys.API.Token';
#EndRegion
#Region Turn off Clear Access Token Chore
sTenantId = CellGetS( sCubesysAPI, 'PATenantId', 'String' );
sTerritory = CellGetS( sCubesysAPI, 'PATerritory', 'String' );
sModelName = '<MyModelName>';
vPlanningAnalyticsToken = CellGetS( sCubesysAPIToken, 'PlanningAnalyticsToken', 'String' );
sBaseUrl = 'https://' | sTerritory | '.aws.planninganalytics.ibm.com/api/' | sTenantId | '/v0/tm1/' | sModelName | '/api/v1/';
sObject = 'Chores';
sObjectName = 'Clear.PlanningAnalytics.Access.Token';
sSuffix = '(''' | sObjectName | ''')';
sAction = '/tm1.Deactivate';
sURL = sBaseUrl | sObject | sSuffix | sAction;
ExecuteHttpRequest(
'POST',
sURL,
'-h Content-Type:application/json',
'-h Authorization: Bearer ' | vPlanningAnalyticsToken
);
#EndRegion
#Region Clear Access_Token for User
CellPutS( '', sCubesysAPIToken, 'PlanningAnalyticsToken', 'String' );
#EndRegion
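Both processes assemble the URL the same way. For reference, the sBaseUrl | sObject | sSuffix | sAction concatenation can be sketched in Python; the territory, tenant ID and model name below are invented placeholders, not real values:

```python
def cloud_chore_url(territory: str, tenant_id: str, model: str,
                    chore: str, action: str) -> str:
    """Assemble the Planning Analytics (v12 cloud) chore action URL,
    mirroring the string concatenation in the TI processes."""
    base = (f"https://{territory}.aws.planninganalytics.ibm.com"
            f"/api/{tenant_id}/v0/tm1/{model}/api/v1/")
    return f"{base}Chores('{chore}')/tm1.{action}"

# Placeholder territory/tenant/model for illustration only.
url = cloud_chore_url("eu-de", "abc123", "MyModel",
                      "Clear.PlanningAnalytics.Access.Token", "Deactivate")
```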
It makes more sense to store the model names in a cube along with the various base URLs. I am also using ExecuteHttpRequest to send these files directly to GitHub, so as they are updated in future a link to those resources will be available.