Planning Analytics with Watson

Managing Backups in the Cloud

By Errol Brandt posted Sun May 16, 2021 12:19 AM

One of the main benefits of moving from on-premise to the cloud is that server maintenance activities are performed by the application experts. In the event of a failure, it is reassuring to know that we can ask IBM to restore the system to a certain point in time.

With such support available, you could be forgiven for thinking that cloud clients don't need to worry about backups, but it's not quite that simple. In my view, the IBM backups are the absolute last line of defence - the kind of thing you use only when something catastrophic has occurred. Administrators should still actively manage their own backups to minimise the risk of business interruption. It's far easier to restore a single cube or dimension from a backup than to restore a whole database.

In addition to keeping the database safe, backups provide a useful historical record. This can be really helpful when trying to troubleshoot problems, especially when you need to pinpoint when something is likely to have occurred.

How to make backups on the cloud

There are a couple of good articles on performing database backups from exploringtm1 and perficient, but I could find nothing specifically on the cloud. For this reason I thought I'd share what we have implemented, and would love to hear what others are doing.

Our technique consists of three components, which I will explain below:

1.  Folder Structure

Logging into the remote desktop, I created a folder called 'Backup' within our TM1 database folder. This sits at s:\prod\xxxx\ (where xxxx represents your database name, e.g. tm1).



Under the 'Backup' folder, I created two subfolders - "Daily" and "Monthly".  


  • Under the Daily folder I have created 31 sub-folders, named "01", "02" ... "31", each representing a day of the month. 
  • Under the Monthly folder I have created 12 sub-folders, named "01", "02" ... "12", each representing a month of the year.   
I decided against creating a yearly folder, but the same logic could be extended to cover that if it were considered important.
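If you want to avoid clicking out 43 folders by hand, the whole tree can be created in one pass. Here is a minimal sketch in Python (the base path and folder names follow the structure above; run it wherever you have access to the database drive, and substitute your own database name):

```python
import os

def create_backup_tree(base: str) -> None:
    """Create the Daily\\01..31 and Monthly\\01..12 sub-folders described above."""
    for day in range(1, 32):
        os.makedirs(os.path.join(base, "Daily", f"{day:02d}"), exist_ok=True)
    for month in range(1, 13):
        os.makedirs(os.path.join(base, "Monthly", f"{month:02d}"), exist_ok=True)

# On the cloud remote desktop this would be something like:
# create_backup_tree(r"s:\prod\tm1\Backup")
```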

This folder structure allows me to keep a daily backup for up to 31 days, and a monthly backup for up to 12 months.  By constantly overwriting old files, we can determine the maximum size of the folder as ( 31 + 12 ) x backup size.  This is important information to help you ensure that the server does not run out of storage space.
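As a quick sanity check on sizing - taking a hypothetical 2 GB zipped backup (a made-up figure; substitute your own), the worst-case footprint works out as:

```python
# Worst-case storage for the rotation scheme above: 31 daily + 12 monthly
# archives, each at most one backup in size. backup_gb is a placeholder.
backup_gb = 2
max_storage_gb = (31 + 12) * backup_gb
print(max_storage_gb)  # 86
```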

2. Powershell Script

The next component is a TI process that generates and executes a Powershell script. 

Our on-premise server backup process relied on a third-party archive tool to generate the files. It worked nicely for the most part, but the program would occasionally lock random database files and cause the entire overnight process to hang. Now that we are on the cloud, I tend to avoid third-party tools and stick with Powershell scripts. This is consistent with my previous post about sending notification emails to users.

The TI takes pModel as an input parameter and expects to find a \data folder under that name. It also expects a \scripts folder at the same location, where it builds and executes the Powershell script.

#==============================================================================
# sys.model-archive - PowerShell Model Archive Utility
#==============================================================================
#
# PARAMETER pModel - specifies the database name
#
#==============================================================================
# Constants
#==============================================================================

sYear = TIMST(NOW(), '\Y',1); sMonth = TIMST(NOW(), '\m',1); sDay = TIMST(NOW(), '\d',1);
sHour = TIMST(NOW(), '\h',1); sMin = TIMST(NOW(), '\i',1); sSec = TIMST(NOW(), '\s',1);

#==============================================================================
# Validate Model parameter
#==============================================================================

IF (pModel @='');
ProcessBreak;
ELSE;
sModel = pModel;
ENDIF;

sScript = 's:\prod\' | sModel | '\Scripts\model-archive-' | sYear | sMonth | sDay | sHour | sMin | sSec | '.ps1';

#==============================================================================
# Output character
#==============================================================================

DatasourceASCIIQuoteCharacter = '';

#==============================================================================
# Build Powershell Scripts
#==============================================================================
Asciioutput(sScript, '');
Asciioutput(sScript, '# Powershell model archive script');
Asciioutput(sScript, '');
Asciioutput(sScript, '');

# Bring in the file compression assemblies

Asciioutput(sScript, '');
Asciioutput(sScript, 'Add-Type -assembly ' | CHAR(39) | 'System.IO.Compression' | CHAR(39) );
Asciioutput(sScript, 'Add-Type -assembly ' | CHAR(39) | 'System.IO.Compression.FileSystem' | CHAR(39) );
Asciioutput(sScript, '');

# Set compression level

Asciioutput(sScript, '');
Asciioutput(sScript, '[System.IO.Compression.CompressionLevel]$compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal' );
Asciioutput(sScript, '');

# Source directory

Asciioutput(sScript, '');
Asciioutput(sScript, '[string]$folder = ' | CHAR(39) | 's:\prod\' | sModel | '\data' | CHAR(39) );
Asciioutput(sScript, '');

# retrieve the date strings which are used to determine
# where the backups are stored
#
# Daily backups are stored in their own folder, which
# is overwritten each day
#
# Monthly backups are stored in their own folder, and
# are only generated on the 15th day of the month
# These are overwritten each year

Asciioutput(sScript, '');
Asciioutput(sScript, '[string]$day = (Get-Date).ToString('| CHAR(39) | 'dd' | CHAR(39) | ')');
Asciioutput(sScript, '[string]$month = (Get-Date).ToString('| CHAR(39) | 'MM' | CHAR(39) | ')');
Asciioutput(sScript, '');

# Build Daily Zip File
# Build Monthly Zip File

Asciioutput(sScript, '');
Asciioutput(sScript, '[string]$ZipFileDay = ' | CHAR(39) | 's:\prod\' | sModel | '\backup\Daily\' | CHAR(39) | ' + $day + ' | CHAR(39) | '\Archive.zip' | CHAR(39) );
Asciioutput(sScript, '[string]$ZipFileMonth = ' | CHAR(39) | 's:\prod\' | sModel | '\backup\Monthly\' | CHAR(39) | ' + $month + ' | CHAR(39) | '\Archive.zip' | CHAR(39) );
Asciioutput(sScript, '');

#==============================================================================
# Daily Process
#==============================================================================

Asciioutput(sScript, '');
Asciioutput(sScript, 'if (Test-Path $ZipFileDay) { Remove-Item $ZipFileDay }');
Asciioutput(sScript, '');
Asciioutput(sScript, '[System.IO.Compression.ZipFile]::CreateFromDirectory($folder, $ZipFileDay, $compressionLevel, $false)');
Asciioutput(sScript, '');

#==============================================================================
# Monthly Process
# Hard coded to run on 15th of the month
#==============================================================================

Asciioutput(sScript, '');
Asciioutput(sScript, 'if ($day -eq ' | CHAR(39) | '15' | CHAR(39) | ') {' );
Asciioutput(sScript, '');
Asciioutput(sScript, '    if (Test-Path $ZipFileMonth) { Remove-Item $ZipFileMonth }');
Asciioutput(sScript, '');
Asciioutput(sScript, '    Copy-Item $ZipFileDay -Destination $ZipFileMonth' );
Asciioutput(sScript, '');
Asciioutput(sScript, '}' );
Asciioutput(sScript, '');

#==============================================================================
# Execute Powershell script through a command statement
#==============================================================================

sCommandScript = 'powershell -ExecutionPolicy Bypass -File ' | sScript;
ExecuteCommand( sCommandScript , 1 );
LogOutput('info', sCommandScript );

#==============================================================================
# Cleanup
#==============================================================================

sCommandScript = 'CMD /C DEL ' | sScript;
LogOutput('info', sCommandScript );
ExecuteCommand(sCommandScript,1);
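To make the generated script's behaviour concrete, here is the same daily/monthly rotation expressed as a standalone Python sketch (the folder names follow the structure above; the real sys.model-archive does this with Powershell and System.IO.Compression, and the function name here is my own invention):

```python
import os
import shutil
import zipfile
from datetime import date

def archive_model(data_dir: str, backup_dir: str, today: date) -> None:
    """Zip data_dir into Daily\\<dd>; on the 15th, copy that zip to Monthly\\<mm>."""
    daily_zip = os.path.join(backup_dir, "Daily", f"{today.day:02d}", "Archive.zip")
    monthly_zip = os.path.join(backup_dir, "Monthly", f"{today.month:02d}", "Archive.zip")

    # Overwrite last month's archive for this day-of-month.
    os.makedirs(os.path.dirname(daily_zip), exist_ok=True)
    if os.path.exists(daily_zip):
        os.remove(daily_zip)
    with zipfile.ZipFile(daily_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(data_dir):
            for name in files:
                path = os.path.join(root, name)
                zf.write(path, os.path.relpath(path, data_dir))

    # Monthly snapshot: taken on the 15th, overwritten once a year.
    if today.day == 15:
        os.makedirs(os.path.dirname(monthly_zip), exist_ok=True)
        shutil.copyfile(daily_zip, monthly_zip)
```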



3. Overnight Chore

The backup process described above (sys.model-archive) runs as part of the overnight chore, immediately following a complete database save. Once the save is done, we run the archive process before moving on to the rest of the overnight activities.




Once implemented, the technique is set-and-forget. You do not need to worry that the folders will grow out of control, and the folder structure makes it very easy to locate the file you need. I would also say that I have not had a single instance of the backup process causing the overnight process to hang, so I feel the Powershell method is inherently more stable.

Conclusion

Even though IBM does a good job of managing the application infrastructure, I believe it's still important for users to make regular database backups for their own peace of mind. Not only can local backups save time in the event of a minor failure, they can prove invaluable when trying to pinpoint the time and source of database issues.

I have described a technique that works well for us, but I would love to hear in the comments what others are doing.  Please feel free to describe what you have found, and whether this post has been useful for you.


Comments

Sun May 30, 2021 10:23 AM

Hi Errol,

Thanks for sharing.

I have something similar, using a batch process to 7Zip what I need into the backup folder with a date and timestamp instead of multiple folders.

A TI process drives the execution and passes parameters for the model name and environment which are used in the creation of the filename e.g.
ACME_Prod_20210530.7z

A secondary process runs after this and prunes back-ups older than a certain period whilst always retaining the month-end backup. WildcardFileSearch and AsciiDelete are useful here.  The same process also clears up any log or error logs made by the backup process.

I can then use CCC to pick up any backups should I need to keep them locally (size dependent, of course).

Wed May 19, 2021 06:09 PM

Perfect! Congrats!