Hello all! This is my first article as a second-year IBM Champion. I hope you appreciate my latest process as much as I do.
I added it to the Files section of the Community: https://community.ibm.com/community/user/businessanalytics/viewdocument/exporting-clearing-and-importing-d?CommunityKey=8fde0600-e22b-4178-acf5-bf4eda43146b&tab=librarydocuments
After downloading, change the file extension from .txt to .pro.
There is a lot of descriptive text in the Prolog tab. It essentially comes down to this: the process is a wrapper around 3 key Bedrock processes for data export, data clear, and data import. The idea was to have an automated way of exporting data from several cubes that share key dimensions: think of year, period, scenario, company, and so on. You can now easily export all data, by cube and by combination of such dimensions. For instance, exporting Budget 2024 as one file per month for 20 companies gives 12 x 20 = 240 files; Budget and Actual 2024 for the same configuration gives 480 files, without any effort at all. And that is for 1 cube. If you do 3 (income statement, balance sheet, cash flow), you see that this exercise becomes a breeze, without writing code.
The Bedrock processes are leveraged as-is, meaning that you can play around with their parameters and settings. They are exposed in the Prolog tab of this new process.
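To give an idea of how your selections are handed over to Bedrock, here is a minimal sketch (not the actual wrapper code; the parameter names and filter syntax are illustrative and depend on your Bedrock version, so check the signature of }bedrock.cube.data.export in your own model):

# Minimal sketch: hand the wrapper's selections over to the Bedrock export process.
# Parameter names and the filter delimiters depend on your Bedrock version.
ExecuteProcess( '}bedrock.cube.data.export',
   'pCube',   'Income Statement',
   'pFilter', 'Year¦2024 & Scenario¦Budget' );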
Exporting data is not the only thing this process does. For the same selections you can clear data from the cubes (be careful!) as well as import data into the cubes from the flat files. You do not need to export first: you can run all actions together, or only the separate actions you need, to stay in control.
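As a sketch of that idea, and with parameter names that are purely hypothetical (look in the wrapper itself for the real ones), the action switches boil down to something like this:

# Hypothetical action switches (pExport, pClear, pImport are my own names, not the wrapper's).
IF( pExport = 1 );
   sStage = 'export';   # the Bedrock export call would go here
ENDIF;
IF( pClear = 1 );
   sStage = 'clear';    # the Bedrock clear call would go here (destructive: double-check your filter!)
ENDIF;
IF( pImport = 1 );
   sStage = 'import';   # the Bedrock load call would go here to read the flat files back in
ENDIF;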
I allow for a filter selection, like Bedrock does, and additionally up to 5 'slicer' dimensions: dimensions for which your selections generate separate files. Year = 2020:2024 would be 5 files; times Period = 01:12 that is already 60 files; times Scenario = Actual + Budget you get 120 files, and that is for 1 single cube. Archiving data has never been easier than this, and importing data is easier still.
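To make the slicer mechanics concrete, here is a minimal sketch (simplified to two slicer dimensions, with illustrative subset and file names; this is not the actual wrapper code):

# Simplified sketch of the slicer mechanics: one export per combination of slicer elements.
nScen = 1;
WHILE( nScen <= SubsetGetSize( 'Scenario', 'Slicer' ) );
   sScen = SubsetGetElementName( 'Scenario', 'Slicer', nScen );
   nPer = 1;
   WHILE( nPer <= SubsetGetSize( 'Period', 'Slicer' ) );
      sPer  = SubsetGetElementName( 'Period', 'Slicer', nPer );
      sFile = 'IncomeStatement_' | sScen | '_' | sPer | '.csv';
      # here the wrapper would call the Bedrock export for this combination, writing to sFile
      nPer = nPer + 1;
   END;
   nScen = nScen + 1;
END;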
So… 1 and 1 makes 3. The full cycle becomes possible: export data including rules-calculated cells, clear the data, [change rules and feeders], and import the hard values back into the now STET'ed cells. Changing rules and feeders is put between brackets because that remains the responsibility of the TM1 administrator or developer; automation is harder there (but not impossible).
The wrapper process uses a few other processes, which you need to add to your TM1 model first: obviously the Bedrock processes to export, clear and import data, and also the Bedrock process to populate a public subset. Next to the Bedrock processes there is one more (WG_CUBE_lookup_dimensions_in_cube), very useful, to check whether all dimensions that you provide are part of a given cube. This is important because clearing or importing data in cubes that do not contain all 'slicer' dimensions could lead to data risks.
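The idea behind that check is straightforward. Here is my own simplified sketch of it (not the actual WG_CUBE_lookup_dimensions_in_cube code), looping over the cube's dimensions with TABDIM:

# Simplified version of the dimension check: is dimension sDim part of cube sCube?
sCube  = 'Income Statement';
sDim   = 'Company';
nFound = 0;
nIndex = 1;
sCubeDim = TABDIM( sCube, nIndex );
WHILE( sCubeDim @<> '' );
   IF( sCubeDim @= sDim );
      nFound = 1;
   ENDIF;
   nIndex   = nIndex + 1;
   sCubeDim = TABDIM( sCube, nIndex );
END;
IF( nFound = 0 );
   ProcessQuit;   # do not clear or import if a slicer dimension is missing from the cube
ENDIF;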
Exported files can have a timestamp. You can indicate whether or not you want to overwrite existing files, and the timestamp in the file names is taken into account when checking for existing files.
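As a rough illustration of the file name handling (pOverwrite is my own illustrative parameter name, and FileExists needs a reasonably recent Planning Analytics version):

# Illustrative sketch of timestamped file names and the overwrite check.
sStamp = TIMST( NOW, '\Y\m\d-\h\i\s' );
sFile  = 'IncomeStatement_Budget_2024-01_' | sStamp | '.csv';
nSkip  = 0;
IF( pOverwrite = 0 & FileExists( sFile ) = 1 );
   nSkip = 1;   # file already exists and overwriting is not allowed: skip this export
ENDIF;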
Last but not least, emailing at the different steps within the chain is set up. I leave it up to the reader to plug in their favourite process to send out emails; mine is called 'TECH_send email', so you can look at the code to see how to fit it in. If you do not want or need it, just leave the parameter 'pSendEmailAtStages' empty.
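For context, such a call could look like the sketch below; the parameter names of 'TECH_send email' shown here are purely hypothetical, so check the wrapper code for the real signature:

# Hypothetical call: the parameter names of 'TECH_send email' are illustrative only.
IF( pSendEmailAtStages @<> '' );
   ExecuteProcess( 'TECH_send email',
      'pRecipient', 'tm1.admin@example.com',
      'pSubject',   'Data export stage finished',
      'pBody',      'The export step of the wrapper completed.' );
ENDIF;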
Important
One change to TM1 Bedrock, for the export process, is still pending; you need it to be able to use the export feature of this wrapper. See the pull request here: https://github.com/cubewise-code/bedrock/pull/425
The change is very minor, though, in case you already want to apply it yourself: add NumericGlobalVariable( 'nDataCount' ); to the Prolog tab of the Bedrock cube data export process.
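For context on why that line matters: a variable declared with NumericGlobalVariable is shared across a chain of processes, so a calling process that declares the same global can read the value (here, presumably the count of records written by the export) after the Bedrock process returns. A minimal sketch of that idea, with an illustrative log file name:

# In the calling (wrapper) process: declare the same global, run the export, then read the count.
NumericGlobalVariable( 'nDataCount' );
# the call to }bedrock.cube.data.export goes here, with your parameters
AsciiOutput( 'export_log.txt', 'Records exported: ' | NumberToString( nDataCount ) );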
Let's spur the discussion in the comments section below on how you use this wrapper process. Maybe you can build in the Bedrock data copy process for copying slices within cubes?
Have fun! Please give due credit and/or leave my name inside the process; that way, to name just 1 thing, I can be contacted in case of a remark or issue.
#IBMChampion