
How to Move a Db2 Linux Database to another Db2 Instance or db2u Container Deployment in One Command

By Phil Downey posted Wed July 06, 2022 07:35 AM

  

DBAs have long wrestled with how to move a database from one server to another. Do I export and import, back up and restore, or unmount the storage and remount it on a different server? Add the task of spinning up environments for development and testing, and this becomes quite a time-consuming and laborious job. If only databases could be cloned and shipped easily to another database instance in the cloud, into an OpenShift or Kubernetes cluster, into IBM Cloud Pak for Data, or even just to another Db2 instance in your data centre.


 Now it can be achieved with a single command line:

db2shift --source-dbname=db2oltp --dest-type=POD --kubectl --dest-server=c-demo2-db2u-0 --mode=all --threads=4

The above moves a database from a standard Linux instance to a Db2 deployment on Cloud Pak for Data or directly on OpenShift.

 

db2shift --source-dbname=db2oltp --dest-type=other --ssh --dest-server=192.168.0.4 --dest-owner=db2inst2 --mode=move --threads=4

The above moves a database from a standard Linux instance to another Db2 instance (db2inst2) over SSH.

The db2shift utility is available through IBM Db2 Click to Containerize version 2.1, currently in Technical Preview; downloads are available at ibm.biz/c2cdownload.

Key Capabilities

  • Db2 instance to Db2u / Cloud Pak for Data Db2 services on Kubernetes/OpenShift
  • Db2 instance to Db2 instance via passwordless SSH (see the setup sketch after this list)
  • Automated setup of HADR to keep source and target in sync until cutover
  • Stored clones for air-gapped or rapid-deploy dev/test environments and DevOps scenarios
  • Automated upgrades between Db2 versions
  • Sources and targets can be on x86 Linux and/or Power Linux LE (ppcle)
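
For the instance-to-instance path, db2shift relies on passwordless SSH between the source and destination instance owners. A minimal setup sketch, assuming the destination server 192.168.0.4 and destination owner db2inst2 from the earlier example:

# On the source server, as the source instance owner, generate a key pair if one does not already exist
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa

# Copy the public key to the destination instance owner
ssh-copy-id db2inst2@192.168.0.4

# Confirm that the login works without a password prompt
ssh db2inst2@192.168.0.4 hostname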

 

The tool can move Db2 databases to and from instances running in the db2u Kubernetes service (pod) or in a standard Db2 instance install, whether they are deployed on a traditional server, a cloud VM, or a Kubernetes or OpenShift cluster, including the clusters supported by Db2u (for example IBM OpenShift, AWS EKS and ROSA). It has also been built to operate against db2u services on IBM Cloud Pak for Data, integrating the database into the IBM Cloud Pak for Data services.
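
When the target is a db2u pod, the --dest-server value from the first example (c-demo2-db2u-0) is simply the name of the db2u engine pod. A quick way to look it up, assuming the deployment runs in a namespace called db2u (adjust the namespace to match your environment):

# List the pods in the namespace hosting the db2u deployment
kubectl get pods -n db2u

# Engine pod names typically end in -db2u-0
kubectl get pods -n db2u | grep db2u-0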

The tool can copy anything from tens of gigabytes to terabytes of data, depending on the network and compute/storage speeds available.

 


Alternatively, clones can be written to disk, moved to another data centre if needed, and then applied to an existing database instance. Clones can also be reused more than once for different scenarios, such as spinning up test and development databases.
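
A rough sketch of that stored-clone workflow is shown below; the option names --mode=clone, --mode=apply_clone and --clone-dir are illustrative assumptions here, so check the db2shift documentation for the exact options in your release:

# Write a stored clone of the source database to disk (illustrative options)
db2shift --source-dbname=db2oltp --mode=clone --clone-dir=/backups/db2oltp_clone --threads=4

# Later, apply the stored clone to the target instance (illustrative options)
db2shift --mode=apply_clone --clone-dir=/backups/db2oltp_clone --dest-type=other --ssh --dest-server=192.168.0.4 --dest-owner=db2inst2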

 

 

Whether a database is copied online over the network or via a storage-based clone, the tool allows HADR setup or database version upgrades to be performed. Databases can also be renamed in flight as part of the process.

Database moves can be done while the database is online and accepting connections. However, while connections are tolerated, it is suggested that workloads be suspended, or that only light queries run during the clone, because write activity may force a database log or container to be re-copied as the cloned copy is finalised, prolonging the copy period.
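
One simple way to keep the source quiet while the clone is finalised, assuming you can afford a short window without application writes, is to quiesce the database with standard Db2 commands:

# On the source instance: block new application work while the copy completes
db2 connect to db2oltp
db2 quiesce database immediate force connections

# ... let the db2shift copy finish ...

# Re-open the database to applications afterwards
db2 unquiesce database
db2 connect reset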

 

As shown above, the tool is available as a command-line utility or via a terminal UI, all through a single executable. The UI walks you through the process of moving or cloning a copy of your database to another server and, if you need it, setting up HADR to keep the copies in sync.

 

During the process the tool analyses the source and destination for compatibility, reporting differences you should consider before the move and then applying any updates you choose to make to the destination DBM and database configurations.

For example, if you are moving a database to a system that is not configured for intra-partition parallelism while the source system is, you will most likely want to ensure the DBM configuration is updated.
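
Checking and, if needed, enabling intra-partition parallelism on the destination instance uses standard Db2 commands, for example:

# On the destination instance: check the current setting
db2 get dbm cfg | grep -i intra_parallel

# Enable intra-partition parallelism to match the source (takes effect after an instance restart)
db2 update dbm cfg using INTRA_PARALLEL YES
db2stop
db2start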

As it runs, the tool checks that the source and target environments are available for the move, and then provides a constant status feed of where the process is.

 

 

At the end, after all processes are completed, the run log and all database settings used in the move are copied to an archive log sub-directory in the run directory. This ensures DBAs have a full audit of the process.

So if you want to simplify your database moves, or build a library of clones for your development and test environments, try Db2 Click to Containerize 2.1, currently available in technical preview at ibm.biz/c2cdownload.


#Db2