Planning Analytics


 How to Retrieve and Convert a Cube's Rules File (.rux) in PAaaS

zied ben hmida posted Thu August 21, 2025 06:07 AM

Hello,

I am working on Planning Analytics Workspace as a Service 2.0.106.

I am trying to fix a process that retrieves a cube's rules file (.rux), converts it to a .txt file, and stores it in my File Manager inside PAaaS.

In the Planning Analytics Local version, the process executed a command line via the ExecuteCommand function to retrieve the .rux file from the TM1 data folder (see the TI process below):

```
TM1Path= CellGetS( 'z_Admin_Param' , 'REP_TM1DATA' ,'STR_VAR1') |'\';
Query= 'cmd /c copy "'| TM1Path| 'Rule1.rux" "'| TM1Path| 'textFile1.txt"' ;
ExecuteCommand(Query,0);
```

How can I retrieve the .rux file in the PAaaS version?

Thank you in advance,

Best regards,
Zied

zied ben hmida

TI code:

```
RuleText = CubeRuleGet('Calendar');
TextOutput('Files/textFile1.txt', RuleText);
```

This retrieves the rule script for the cube named Calendar and writes it to textFile1.txt. I don't think accessing the .rux file directly is possible on PAaaS.

George Tonkin IBM Champion

Have a look at some of the newer functions like CubeRuleGet (per the post above); you could assign the result to a cube/string cell.
However, on PAaaS and in the containerised environment, I am not sure whether that will help, but it may be something to work with...

zied ben hmida

Hi George, I tested the CubeRuleGet function in PAaaS and it works perfectly for my use case. I'm now able to retrieve the cube's rules and store them as a .txt file in the File Manager.

Appreciate your help!

George Tonkin IBM Champion

Excellent! I think you had it anyway - I was just not sure where the file would be written, but if you can access it in the File Manager, that is great.
It is also possibly good to know for testing those situations where we do a bulk load: remove the rules, then load them again at the end.
Thanks for confirming - will post back when I get around to testing the export, delete, reload cycle one day.

zied ben hmida

Just for additional information: I have the impression that the function is limited to about 66,000 characters of the .rux file.

I've already opened a support ticket with IBM to get confirmation :)

Wim Gielis IBM Champion

If it is TM1 v12, then use the ExecuteHttpRequest function. A GET request to the relevant endpoint is sufficient.

Hubert Heijkers

Or, if all you want is the text representation of a cube's rules, you could grab it directly through the REST API. In the case of PAaaS, if you are already logged in, you can simply use a URL, even in the browser, like:

https://us-east-1.planninganalytics.saas.ibm.com/api/<<your-tenantId>>/v0/tm1/<<your-TM1-database-name>>/api/v1/Cubes('<<your-cube-name>>')/Rules/$value

which returns exactly that: the rules for that cube, as text, directly in your browser.
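As Wim suggests above, the same endpoint can also be called from a TI process on TM1 v12 via ExecuteHttpRequest; a minimal sketch, with authentication options omitted and the output file name an assumption:

```
# Hedged sketch: the URL placeholders mirror the example above and
# 'Files/rules.txt' is illustrative.
sUrl = 'https://us-east-1.planninganalytics.saas.ibm.com/api/<<your-tenantId>>/v0/tm1/<<your-TM1-database-name>>/api/v1/Cubes(''<<your-cube-name>>'')/Rules/$value';
ExecuteHttpRequest( 'GET', sUrl );

# HttpResponseBody returns the response as a string, so this simple
# form is subject to TI's 64 KiB string limit.
TextOutput( 'Files/rules.txt', HttpResponseBody() );
```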

zied ben hmida

Hi folks,

@Wim Gielis – I’m already using ExecuteHttpRequest instead of the ExecuteCommand function, but in my case I’m looking to generate the .rux into a flat file that I will later import.

@Hubert Heijkers – Thanks for your reply! Yes, I also saw that method online; it’s a good way to read the .rux, even if it’s not very readable at first glance.

In the meantime, I got a reply from IBM: there is indeed a size limit of roughly 64 KiB (about 65,000 characters).

To be continued…

Edward Stuart IBM Champion

Hi Zied,

You can GET the cube rules and subsequently POST/PATCH them, depending on what you are trying to achieve.

ExecuteHttpRequest response values have a maximum size, and if you exceed it, the call produces no response. So I always (unless I know the payload is very small) output the response to a file in File Manager and then use POST to reload/update objects from File Manager on another instance/server as required.

JSON takes a little while to get to grips with, but I prefer it over the formatting issues that may crop up when converting between JSON and text each time.

zied ben hmida

Hi @Edward Stuart,

Thanks a lot for your detailed response, much appreciated!

Yes, we’re currently facing a problem where some cube rules are too large to be exported as flat files.

I’m definitely interested in the solution you mentioned using File Manager to output the rules and then POST them to another instance. Could you please share a bit more detail on how you implement this? For example:

  • Are there any specific considerations when reloading the rules via POST?
  • Any tips for handling large JSON payloads efficiently?

Thanks again for your help!

Best regards,

Hubert Heijkers

@zied ben hmida, I'll take this one, as I added the JSON functions to TI myself. Unfortunately, that 64 KiB (minus a couple of bytes) limit for strings plays a role in a whole bunch of places: everywhere in TI where you are, directly or indirectly, dealing with strings. The JSON functions are no exception; the JSON passed in/out/around is still a string under the covers and, apart from not being the most efficient, is limited to 64 KiB as well. Typically that will suffice when used in combination with a JSON data source, since in that case the individual values for each of the fields can each be up to 64 KiB in size (read: every record can be bigger than that, as long as the values you map variables to are not), but for building/handling large JSON documents yourself that limit might still be an issue.

A workaround, when working with ExecuteHttpRequest specifically, is to use a file that you write to, and then read from as a data source once again. For every request that returns a response, ExecuteHttpRequest allows you to put that response in a file instead of returning it in a string. That might help in cases where the response is not one big JSON document but, for example, some form of concatenated JSON, which can subsequently be used as a JSON data source again. Equally, you can use a file as the content of a request, as opposed to a string variable. Using that pattern, you can write as much as you want to a file and then use that file as the content of an ExecuteHttpRequest.
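A minimal sketch of the download side of this pattern, assuming illustrative names for the URL variable and target file:

```
# Hedged sketch: sUrl and the file name are assumptions.
# The '-o' option writes the response body straight to a file instead
# of returning it as a string, so a large payload is not truncated at
# the 64 KiB string limit.
ExecuteHttpRequest( 'GET', sUrl, '-o Files/big_response.json' );
```

The file landed in File Manager can then serve as the data source of a follow-up process, where each field value mapped to a variable may itself be up to 64 KiB; per the comment above, a file can likewise stand in for the request content on the upload side (see the ExecuteHttpRequest documentation for the exact option).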

One of these days we'll not only lift that limitation on string size (I'm hoping) but also make JSON more native to the (improved) TI language, at which point the size of the JSON document shouldn't matter any longer either ;-)

Oh, and getting the text of the rules as I showed in my earlier comment is a good example of how to get past that 64 KiB limit to begin with ;-)

Edward Stuart IBM Champion

There are two workflows I am moving through:

Take Object from Development to Production:

  • From the Production instance, run a GET request against the object on Development and save the JSON to File Manager on Production
  • Run a TI process with a dynamic data source pointing to the generated JSON file above
  • Generate the body for a POST/PATCH request (depending on whether the object exists or not) and POST/PATCH the update to the Production server
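The final step above might look roughly like this in TI; a hedged sketch, where sProdUrl, the cube name, and sRulesJson are assumptions, and the body follows the REST API's shape for updating a cube's Rules property:

```
# Hedged sketch: sProdUrl and 'Calendar' are illustrative; sRulesJson
# is assumed to already hold the rules text escaped as a JSON string.
sBody = '{ "Rules": ' | sRulesJson | ' }';
ExecuteHttpRequest(
  'PATCH',
  sProdUrl | '/Cubes(''Calendar'')',
  '-h Content-Type: application/json',
  '-d ' | sBody
);
```

PATCH updates the existing cube in place; a POST to the Cubes collection would instead be used when the object does not yet exist on the target server.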

The second workflow includes comparing objects between Development and Production, and also saves down multiple iterations of objects to track changes over time. Thus far this has involved a third instance, which I've named Source Control, which assists with the pull/push of objects (including data) between instances and date-stamps objects.

However, whilst this can and does work, it is complex, and I am investigating 'database as code' as an alternative to traditional package migration, moving across to continuous integration/delivery.