Migrate Oracle database to GCP cloud SQL for PostgreSQL Using ora2pg Migration Runbook 

Mon September 08, 2025 12:30 PM

Guide to Migrate Oracle Database to GCP Cloud SQL for PostgreSQL Using the ora2pg Migration Runbook

Prepared by Khim Singh


 

This document is intended as a quick installation guide for Ora2pg, a tool used to migrate schema and data from Oracle to GCP Cloud SQL for PostgreSQL. More details about Ora2pg can be found here. The document has two parts: links to installation scripts that install Ora2pg, and manual installation steps for scenarios where Bash/PowerShell cannot be executed. Download the respective Oracle client and client SDK.

Table of Contents

Objective
Reference Architecture
High-level overview of the migration procedure
Prerequisites
Google Cloud requirements
Oracle installation (Optional)
Cloud SQL for PostgreSQL installation
Migration Prerequisites
Introduction
Decision matrix on Bulk Migration to Cloud SQL using Ora2pg
Generating Sample Data in Oracle for Migration using HR sample schema (Optional)
Ora2pg Installation and Configuration
Installation Prerequisites
Installation
Configuration
Postgres Client Installation
Establish Connectivity between VM and Target Cloud SQL for PostgreSQL
Migration
Assessment of source database
Schema Conversion
Bulk Data Migration
Validation
Associated Tools for Migration
Assessment Tool
Data Validation Tool (Validation)
Troubleshooting
Oracle installation or connection related Error while running Ora2pg
Oracle Sample Schema HR failed to load
Sizing guidelines

Highlights

Purpose

Migration from Oracle to Cloud SQL for PostgreSQL using Ora2pg, including sample mock data and all necessary steps from initial assessment to validation

Version

1.0

Category

Cloud, Database Migration

Intended audience

Enterprise Architects, IT Security, Cloud Engineers

Key assumptions

The audience has basic technical skills in Google Cloud Platform, Oracle Database, Cloud SQL, and the gcloud CLI.

Supported Tech Stack

Oracle, Cloud SQL, Ora2pg and Google Cloud

Tools/Framework

Oracle Server, Ora2pg, Cloud SQL, Google Cloud CLI, Heterogeneous migration.

Systems/Tools used

Source - Oracle:

  • Version: 18c Express Edition
  • OS: RedHat Linux 7

Target - Cloud SQL for PostgreSQL

  • Version: 14

Considerations

This runbook covers a one-time full load migration from Oracle to Cloud SQL for PostgreSQL and doesn't support live migration. It uses the sample HR schema and data for the migration. It uses Ora2pg, which can be leveraged to migrate different versions of Oracle databases from self-managed or AWS databases to Cloud SQL. See the Oracle migration decision matrix here for guidance by source database.

Objective

As a customer or user, you would want to migrate data from existing Oracle to Google Cloud SQL for PostgreSQL. There are many reasons why organizations might choose to migrate their Oracle databases to Cloud SQL for PostgreSQL. Google Cloud SQL for PostgreSQL is a fully managed service that makes it easy to set up, maintain, manage, and administer your PostgreSQL databases on Google Cloud Platform. It provides high availability, scalability, and data encryption to help organizations focus on their applications, instead of their infrastructure.

The Oracle to Cloud SQL for PostgreSQL migration process can be divided into the following steps:

  1. Plan migration: The first step in the migration process is to plan the migration. This includes identifying the data that needs to be migrated, the applications that need to be migrated, and the timeline for the migration.
  2. Migrate the data: Once the plan is in place, the next step is to migrate the data from Oracle to Cloud SQL. This can be done using various tools and methods; the tool covered here is Ora2pg.
  3. Test and migrate the applications: Once the database is migrated and the applications are tested, the next step is to move them to production on Cloud SQL with monitoring in place.
  4. Monitor the migration: Once the applications are migrated, the next step is to monitor the migration and ensure that everything is working as expected.

This runbook walks through those steps, including assessment, schema conversion, schema transformation, and data migration.

Reference Architecture

This migration workflow involves migrating data from a self-managed Oracle Express Edition installed on GCE to Cloud SQL for PostgreSQL. To achieve this, we'll be using Ora2pg, a free tool that automates the migration of an Oracle database to PostgreSQL. Ora2pg connects to your Oracle database, automatically scans its structure and data, and generates SQL scripts that can be loaded into your PostgreSQL database. It supports bulk load migration and doesn't require any specialized knowledge of Oracle databases beyond the necessary connection parameters. Ora2pg can be installed on the same machine as the Oracle database or on a separate machine, provided there is connectivity between the two. If you choose to install Ora2pg on a separate machine, you'll need a compatible Oracle database client. For this runbook, we've installed Ora2pg on the same machine as the Oracle database.

High-level overview of the migration procedure

High-level overview of the migration procedure using Ora2Pg:

  • Install Ora2Pg and initialize the migration project.
  • Set up source and target database connectivity.
  • Configure Ora2Pg migration parameters.
  • Generate a database migration report.
  • Export the database schema from the Oracle database.
  • Import the database schema into Cloud SQL for PostgreSQL.
  • Perform data migration.
  • Import indexes, constraints, foreign keys, and triggers into Cloud SQL for PostgreSQL.
  • Verify data integrity after the migration.

Prerequisites

To ensure a successful Oracle database migration, make sure you have met the following prerequisites.

System Prerequisites

Google Cloud requirements

We will be using the Cloud SQL for PostgreSQL and Compute Engine GCP services. Make sure you have access to a project with these services enabled. You need the following permissions and roles granted to your principal to perform the tasks.

Required permissions to launch and manage compute engine

compute.disks.create

compute.instances.create

compute.instances.get

compute.instances.getSerialPortOutput

compute.instances.setMetadata

compute.instances.setServiceAccount

compute.subnetworks.use

compute.subnetworks.useExternalIp

roles/iam.serviceAccountUser

Required permissions to launch and manage Cloud SQL for PostgreSQL

cloudsql.instances.create

cloudsql.instances.get

cloudsql.instances.list

cloudsql.instances.update

roles/iam.serviceAccountUser

You can either use a local machine with the gcloud CLI installed or use Cloud Shell on the GCP console page.

Before performing any GCP API request using gcloud, make sure you have authenticated and authorized gcloud. To do so, run the following commands one by one:

gcloud auth login

gcloud config set project <project-id> #replace <project-id> with actual project id

export ZONE=asia-south1-a

#to verify correct project id is set

gcloud config get project

# to verify correct zone is set

echo $ZONE
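As a small optional guard (not part of the original steps), you can make the later commands fail less silently when ZONE was never exported; the fallback below is the default zone used in this runbook:

```shell
# Optional guard: if ZONE was not exported, fall back to the runbook's
# default zone instead of passing an empty value to later gcloud commands.
ZONE="${ZONE:-asia-south1-a}"
echo "Using zone: $ZONE"
```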

Oracle installation (Optional)

This is an optional step and can be skipped if you already have an Oracle installation.

In case you do not have an Oracle server, you can install Oracle Express Edition on a Google Cloud Platform Compute Engine instance using the step-by-step installation instructions in the link below:

https://docs.oracle.com/en/database/oracle/oracle-database/18/xeinl/procedure-installing-oracle-database-xe.html

Provision Red Hat Enterprise Linux 7 to install Oracle Express Edition (please note it will incur instance cost).

To launch Red Hat Enterprise Linux 7 run below command:

gcloud compute instances create oracle-db \

    --zone=$ZONE \

    --image-project rhel-cloud \

    --image-family rhel-7 \

    --machine-type e2-standard-4 \

    --boot-disk-size 100 \

    --boot-disk-type pd-ssd \

    --provisioning-model=STANDARD \

    --scopes=https://www.googleapis.com/auth/cloud-platform

You should get output confirming that the instance was created.

Before proceeding to Oracle database installation you need to ssh into oracle-db instance. You can ssh from Google Cloud Console in the browser or use the below command to ssh from the terminal.

gcloud compute ssh --zone $ZONE "oracle-db"

Below are sample commands taken from the documentation to download and install Oracle 18c Express on GCE (Red Hat Enterprise Linux 7, with minimum 40 GB storage). Run these commands after the SSH session is established.

sudo -s

curl -o oracle-database-preinstall-18c-1.0-1.el7.x86_64.rpm https://yum.oracle.com/repo/OracleLinux/OL7/latest/x86_64/getPackage/oracle-database-preinstall-18c-1.0-1.el7.x86_64.rpm

yum -y localinstall oracle-database-preinstall-18c-1.0-1.el7.x86_64.rpm

Download Oracle 18c Express edition from below mentioned link. https://www.oracle.com/database/technologies/xe18c-downloads.html

#install wget

yum update

yum install wget

#download oracle 18c express database

wget https://download.oracle.com/otn-pub/otn_software/db-express/oracle-database-xe-18c-1.0-1.x86_64.rpm

#Install Oracle database. (It will take several minutes)

yum -y localinstall oracle-database-xe-18c-1.0-1.x86_64.rpm

#Configure Oracle Database post Installation. You will be asked to enter a password. Note that the same password will be used for SYS, SYSTEM and PDBADMIN accounts:

/etc/init.d/oracle-xe-18c configure

#set ORACLE_HOME

export ORACLE_SID=XE

export ORAENV_ASK=NO

. /opt/oracle/product/18c/dbhomeXE/bin/oraenv

ORACLE_HOME = [] ? /opt/oracle/product/18c/dbhomeXE

#output: The Oracle base has been set to /opt/oracle

To check installation run

sqlplus system

You will be asked to enter a password. Enter password that you configured during installation.

Cloud SQL for PostgreSQL installation

We are going to launch Cloud SQL for PostgreSQL; this will be our target database. To launch it, run the command below.

gcloud sql instances create target-postgres \
    --database-version=POSTGRES_14 \
    --cpu=4 \
    --memory=16GiB \
    --storage-size=100 \
    --root-password=Test1234 \
    --zone=$ZONE

You should get output confirming that the instance was created.

You can also use the Google Cloud console UI to set up the source Oracle DB on a VM and the target Postgres server on Cloud SQL.

Migration Prerequisites

Introduction

This Runbook guides you through a migration proof of concept. You are going to load sample data in the source database, configure Ora2pg and establish connectivity between source and target database. You are going to conduct assessment, schema conversion, schema transformation and finally bulk data movement.

Decision matrix on Bulk Migration to Cloud SQL using Ora2pg

When migrating your data, it's essential to have a complete understanding of the source database version and its compatibility with Ora2pg. If you're aiming for minimal downtime and require initial load and CDC, consider utilizing Google Cloud DMS instead. Additionally, the PL/SQL code generated for Oracle-specific functions, procedures, packages, and triggers must be reviewed and adapted to match the PostgreSQL syntax.

Generating Sample Data in Oracle for Migration using HR sample schema (Optional)

You will use Oracle HR as a sample schema to migrate to Cloud SQL for PostgreSQL. Sample data for Oracle can be found in the GitHub repo:

https://github.com/oracle-samples/db-sample-schemas

# ssh into oracle-db vm

gcloud compute ssh --zone $ZONE "oracle-db"

#switch to root user

sudo -s

#install git

yum install git

#clone repo

git clone https://github.com/oracle-samples/db-sample-schemas

#change directory to schema folder

cd db-sample-schemas

#list files; you will see .sql and .bak files prefixed with hr_

ls human_resources/

 

We will log in to Oracle as the system user and load only the sample HR schema into the pluggable database. We need to provide details such as the tablespace, the password for the HR schema, and the database in which to deploy the sample schema. Refer to the link below for further details on sample creation.

https://github.com/oracle-samples/db-sample-schemas/blob/main/README.txt

Please note, before loading the HR schema, we need to change all embedded paths to match your working directory.

https://github.com/oracle-samples/db-sample-schemas#23-change-all-embedded-paths-to-match-your-working-directory

perl -p -i.bak -e 's#__SUB__CWD__#'$(pwd)'#g' *.sql */*.sql */*.dat

# you should run this command after changing directory to db-sample-schemas

Now you are going to load the schema.

#connect to oracle database

#sqlplus system/<<System_Password>>@<<PluggableDB>>

sqlplus system/********@localhost:1521/XEPDB1

#execute hr_main.sql 

@ <location of hr_main.sql>

#replace <location of hr_main.sql> with the actual path to hr_main.sql

exit #Exit from sql console.

Log in with the HR schema and verify the objects loaded into it. As part of this migration we will only migrate tables, constraints, and indexes along with data.

#sqlplus hr/<<hr_schema_password>>@<<PluggableDB>>

sqlplus hr/hr@localhost:1521/XEPDB1

select object_type , count(1) from user_objects group by object_type;

select user from dual;

exit  #Exit from sql console

cd .. #change working directory

You should see the object counts and the current user in the output.

Ora2pg Installation and Configuration

Ora2Pg is a free and widely used tool to convert and migrate an Oracle database to a PostgreSQL-compatible one. It connects to your Oracle database, scans it automatically, extracts its structure or data, and then generates SQL scripts that you can load into your PostgreSQL-compatible database. For details on its many features and configurations, please refer to https://github.com/darold/ora2pg .

We will set up ora2pg on the same machine as Oracle and configure all prerequisites.

Installation Prerequisites

To install the prerequisites for Ora2pg, ssh into the oracle-db VM and run the following commands.

sudo -s

yum -y install perl-CPAN

yum groupinstall "Development Tools"

yum -y install perl-DBI

export TNS_ADMIN=$ORACLE_HOME/network/admin

Install the DBD::Oracle Perl module:

perl -MCPAN -e 'install DBD::Oracle'

If you get an error, download the package from CPAN and build it manually using the commands below.

wget https://www.cpan.org/modules/by-module/DBD/MJEVANS/DBD-Oracle-1.75_2.tar.gz

tar xvzf DBD-Oracle-1.75_2.tar.gz

cd DBD-Oracle-1.75_2/

perl Makefile.PL

make

make install

cd ..

Install the Perl DBD::Pg module for PostgreSQL connectivity:

yum install perl-DBD-Pg

Installation

Follow the steps below to install and configure Ora2pg. You will be using Ora2pg version 23.0.

  1. Download Ora2pg

curl https://codeload.github.com/darold/ora2pg/zip/refs/tags/v23.0 -o ora2pg.zip

unzip ora2pg.zip

cd ora2pg-23.0/

  2. Make and install

perl Makefile.PL

make && make install

cd ..

  3. Test installation

export PATH=$PATH:/usr/local/bin/

ora2pg -version

  4. Check for the Ora2pg configuration file and rename it if needed.

The Ora2pg configuration file governs overall settings for connections and conversion and is required for all processing and operations with the ora2pg tool.

mv /etc/ora2pg/ora2pg.conf.dist /etc/ora2pg/ora2pg.conf

Configuration

  1. Create a configuration file

touch ora2pg_conf.conf

  2. Copy the configuration below into ora2pg_conf.conf. Be sure to replace the password; it is the same one that you created during installation.

ORACLE_DSN      dbi:Oracle:host=localhost;service_name=XEPDB1;port=1521

ORACLE_USER     system

ORACLE_PWD      Test1234

SCHEMA  HR

TYPE TABLE

FILE_PER_INDEX 1

FILE_PER_CONSTRAINT 1

PG_VERSION 14

  3. Test the Oracle DB connection

ora2pg -t SHOW_VERSION -c ora2pg_conf.conf

# you should see output Oracle Database 18c Express Edition Release 18.0.0.0.0

  4. Create a template project for assessment and migration

ora2pg --project_base . --init_project ora2pg_mig -c ora2pg_conf.conf

Upon executing the command, you should see output showing the project directory structure that Ora2pg created.

Postgres Client Installation

You will install psql, the PostgreSQL command-line client, to deploy the converted code and data produced by the Ora2pg tool.

Follow the steps below to set up the PostgreSQL client libraries, which include the psql command line tool.

sudo yum install -y https://download.postgresql.org/pub/repos/yum/reporpms/EL-7-x86_64/pgdg-redhat-repo-latest.noarch.rpm

sudo yum install -y postgresql14

#to check installation

psql --version

Establish Connectivity between VM and Target Cloud SQL for PostgreSQL

Before proceeding to migration, we need to establish connectivity between the VM hosting the Oracle DB and the target PostgreSQL server. Run the commands below to establish connectivity. You can run them on a local machine with the gcloud CLI installed, or in Cloud Shell.

vm_source_ip=$(gcloud compute instances describe oracle-db \

    --zone=$ZONE | grep natIP | awk '{print $2}')

echo $vm_source_ip

gcloud sql instances patch target-postgres --authorized-networks=$vm_source_ip

In your actual production environment, you might have to create a VPN connection or use an Interconnect to establish connectivity between the on-premises Oracle DB and GCP Cloud SQL. To test connectivity, run the command below:

psql -h <cloud-sql-ip> -U postgres

# replace <cloud-sql-ip> and enter the password when prompted.

Migration

This section describes the step-by-step process of performing a schema and data migration from Oracle to Cloud SQL for PostgreSQL using Ora2pg as a bulk migration. The steps are:

  • Configure the source connection to Oracle
  • Assessment of the migration
  • Schema conversion with Ora2pg
  • Data migration to the converted schema
  • Verification and validation of the migration for all database objects, including data
  • Verify and Validation migration for all Database related objects including data.

Assessment of source database

A detailed assessment provides a clear understanding of the current database environment, helps identify potential risks and challenges, and evaluates the required resources, effort, and timeline for the migration. The assessment process also helps identify any dependencies, compatibility issues, and data migration challenges, which can be addressed before starting the migration. Furthermore, it helps organizations choose the right migration strategy and tools, ensuring a smooth and successful migration while minimizing downtime and business disruption. Ora2pg offers capabilities to run an initial assessment of the overall Oracle schema to understand its complexity. It is well suited to Oracle schema assessment but not to procedural components. You can use SHOW_REPORT to create an initial assessment report and export it as an HTML file. To create the assessment report, follow the steps below.

  1. Change the working directory to the ora2pg_mig directory. Note that this directory was created during project initialization, as described in step 4 of the configuration section.

cd ora2pg_mig

  2. Run the command below to generate an assessment report as HTML. This command saves the report to the reports/assessment_report.html file. Download and open the file in a browser to view the report.

ora2pg -t SHOW_REPORT -c config/ora2pg.conf --dump_as_html --cost_unit_value 10 --estimate_cost > reports/assessment_report.html

  3. Run the command below to generate an assessment report in the terminal.

ora2pg -t SHOW_REPORT -c config/ora2pg.conf  --cost_unit_value 10 --estimate_cost

You should see the assessment report output (cropped here).

The migration report helps us understand the underlying Oracle schema and the overall objects involved in the migration. For the sample HR schema, the overall migration complexity is A-3, i.e., the migration can be performed automatically without manual code conversion (simple category). You can also use SHOW_TABLE and SHOW_COLUMN to extract additional details on tables and columns, with data type mapping between Oracle and PostgreSQL.

ora2pg -t SHOW_TABLE -c config/ora2pg.conf

ora2pg -t SHOW_COLUMN -c config/ora2pg.conf

Once the assessment phase is completed, we will start the conversion phase using Ora2pg.

Schema Conversion

The export_schema.sh script is included in the project template and facilitates the conversion of Oracle objects such as procedures, functions, and tables. By utilizing the -t option, it is feasible to convert particular types of objects from Oracle to PostgreSQL. The conversion process is governed by several options that are available as part of the ora2pg configurations. For additional information, please refer to the documentation.

Run below command to export schema.

./export_schema.sh

Upon executing the export_schema.sh script, all converted code will be located in the ./schema directory.

You might need to install the tree command using sudo yum install tree.

Throughout the conversion phase, it is essential to resolve all functionality and develop migration patterns to handle any complex Oracle features that are not compatible with PostgreSQL. This can be a lengthy process, contingent on the procedural or schema complexity of the Oracle system. Based on the migration strategy, you may intend to implement indexes and constraints at a later stage after the data migration. In the subsequent phase, the converted code will be deployed to the CloudSQL target.
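Because some of this review work is repetitive, a quick grep over the generated schema files can surface Oracle constructs worth a manual look. The sample file and pattern list below are illustrative only; in a real run, point the grep at your ./schema directory:

```shell
# Illustrative converted file containing Oracle-flavoured constructs.
mkdir -p /tmp/schema_review
cat > /tmp/schema_review/sample.sql <<'EOF'
SELECT NVL(salary, 0) AS salary, SYSDATE AS now
FROM employees
CONNECT BY PRIOR employee_id = manager_id;
EOF

# Flag constructs that frequently need manual attention after conversion
# (non-exhaustive; extend the pattern list for your own codebase).
grep -rnE 'NVL\(|SYSDATE|CONNECT BY|ROWNUM|DECODE\(' /tmp/schema_review
```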

Bulk Data Migration

When importing the schema and data into CloudSQL, the following deployment strategy should be followed:

  • Deploy the converted table DDL and sequences.
  • Deploy procedural code, including views and procedures.
  • Migrate data using a one-time migration approach utilizing the PostgreSQL native copy command.
  • Deploy indexes, constraints, and triggers during the final stage.


The following steps need to be skipped:

  1. Creating the owner of the database postgres: Since the postgres role already exists in Cloud SQL, this step can be skipped. However, if the postgres role does not exist or has been deleted, then this step should not be skipped.
  2. Importing TABLESPACE: As importing TABLESPACE requires superuser permission, and superuser permission is restricted in Cloud SQL, this step must be skipped. More information about superuser restrictions can be found on the Cloud SQL documentation page.

The ora2pg library includes import_all.sh, which simplifies the overall deployment process by providing interactive command lines. We will use it to deploy code and migrate data according to the flow outlined above.


Run below command to start migration.

PGPASSWORD=<cloud sql password> ./import_all.sh -d hr -U postgres -h <cloud sql ip>  -o postgres

#Replace <cloud sql password> with actual password

#Replace <cloud sql ip> with cloud sql instance ip

Interactive command lines inputs:

Would you like to create the owner of the database postgres? [y/N/q] N

Would you like to create the database hr? [y/N/q] y

Would you like to import SEQUENCE from ./schema/sequences/sequence.sql? [y/N/q] y

Would you like to import TABLE from ./schema/tables/table.sql? [y/N/q] y

Would you like to import VIEW from ./schema/views/view.sql? [y/N/q] y

Would you like to import TRIGGER from ./schema/triggers/trigger.sql? [y/N/q] N

Would you like to import PROCEDURE from ./schema/procedures/procedure.sql? [y/N/q] y

Would you like to process indexes and constraints before loading data? [y/N/q] N

Would you like to import TABLESPACE from ./schema/tablespaces/tablespace.sql? [y/N/q] N

Would you like to import data from Oracle database directly into PostgreSQL? [y/N/q] y

After data import is complete:

Would you like to import indexes from ./schema/tables/INDEXES_table.sql? [y/N/q] y

Would you like to import constraints from ./schema/tables/CONSTRAINTS_table.sql? [y/N/q] y

Would you like to import TRIGGER from ./schema/triggers/trigger.sql? [y/N/q] y

As outlined earlier, the steps to create the owner of the database postgres and to import the tablespace were skipped. Additionally, importing indexes, constraints, and triggers was performed after migrating the data.

Validation

At this point you've installed and configured the source database, installed and configured the Ora2pg tool, and performed schema conversion and bulk load migration using Ora2pg. Now, as the final step, we will validate the migrated data.

Validation using Ora2pg

The ora2pg tool includes modules for testing and validating various migration components such as schema, data, and rows between Oracle and PostgreSQL. Some of these modules are:

  • TEST_COUNT: performs a row count diff between the Oracle and PostgreSQL tables.
  • TEST_VIEW: performs a count on both sides of the number of rows returned by views.
  • TEST_DATA: performs data validation checks on rows on both sides.
  • TEST: performs a diff between the Oracle and PostgreSQL databases.

You can use these modules to perform tests. Before running a validation test, you need to add the PG_DSN settings to the ora2pg.conf file; append the configuration below:

PG_DSN dbi:Pg:dbname=hr;host=<cloud sql ip>;port=5432

PG_USER postgres

PG_PWD <password>

#replace <cloud sql ip> and <password> with actual value

Config file should look like this.

To run validation run below commands:

  1. TEST_COUNT

ora2pg -t TEST_COUNT -c config/ora2pg.conf

Expected output: a row count comparison between the Oracle and PostgreSQL tables.

  2. TEST_VIEW

ora2pg -t TEST_VIEW -c config/ora2pg.conf

Expected Output

[UNITARY TEST OF VIEWS]

ORACLEDB:emp_details_view:106

POSTGRES:emp_details_view:106

  3. TEST_DATA

ora2pg -t TEST_DATA -c config/ora2pg.conf

Results are written to the data_validation.log file. Expected output:

Data for the employees table does not match. On closer analysis, there is no difference in the data, only in formatting: for example, a value stored as 0.1 in Oracle appears as 0.10 in Postgres.

  4. TEST

ora2pg -t TEST -c config/ora2pg.conf

Here is an excerpt of the command output showing one of the many test results that passed the verification:

[TEST TABLE COUNT]

ORACLEDB:TABLE:7

POSTGRES:TABLE:7

[ERRORS TABLE COUNT]

OK, Oracle and PostgreSQL have the same number of TABLE.
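One way to confirm that a TEST_DATA mismatch like the 0.1 vs 0.10 case above is a formatting difference rather than a data difference is to normalize both values to a fixed precision before comparing. A minimal sketch; the values and the two-decimal precision are illustrative:

```shell
export LC_NUMERIC=C   # make printf use '.' as the decimal separator
oracle_val="0.1"      # value as exported from Oracle
postgres_val="0.10"   # same value as stored in PostgreSQL
norm_oracle=$(printf '%.2f' "$oracle_val")
norm_postgres=$(printf '%.2f' "$postgres_val")
if [ "$norm_oracle" = "$norm_postgres" ]; then
  echo "formatting-only difference: $norm_oracle == $norm_postgres"
fi
```

Here both sides normalize to 0.10, so the row can be treated as matching.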

Associated Tools for Migration

There are many tools and utilities that can help in the migration journey. Any migration involves different phases from assessment to validation, and it helps to have tools at hand in each phase. Below are some such tools that provide core functionality and accelerate the migration journey.

Assessment Tool

Database migration can be a complex and time-consuming process, and it is important to have the right tools in place to make the process as smooth as possible. One of the most important tools for database migration is an assessment tool.

Apart from its migration purpose, Ora2pg has the added capability of functioning as an assessment tool. With its extensive array of features, all of which are listed here, Ora2pg provides multiple options for assessment. Of these, some of the key features that can be leveraged for this purpose include:

  • Show a report of an Oracle database content.
  • Migration cost assessment of an Oracle database.
  • Migration difficulty level assessment of an Oracle database.
  • Migration cost assessment of PL/SQL code from a file.
  • Migration cost assessment of Oracle SQL queries stored in a file.

You can view a sample report here.

Another such tool is migVisor. migVisor helps identify the best migration path for each database, using sophisticated scoring logic to rank databases according to the complexity of migrating to a cloud-centric technology stack. Users get a customized migration roadmap to help with planning.

https://cloud.google.com/blog/products/databases/faster-google-cloud-database-migration-assessments-with-epams-migvisor

Data Validation Tool (Validation)

https://github.com/GoogleCloudPlatform/professional-services-data-validator

The Data Validation Tool is an open-source Python CLI tool that compares heterogeneous data source tables with multi-leveled validation functions.

DVT supports the following validations:

  • Column validation (count, sum, avg, min, max, group by)
  • Row validation (BQ, Hive, Teradata, Oracle, SQL Server, Postgres only)
  • Schema validation
  • Custom Query validation
  • Ad hoc SQL exploration

Troubleshooting

sqlplus command not found

Make sure the Oracle home is exported properly. You can re-run the commands below to export the Oracle home again.

#set ORACLE_HOME

export ORACLE_SID=XE

export ORAENV_ASK=NO

. /opt/oracle/product/18c/dbhomeXE/bin/oraenv

ORACLE_HOME = [] ? /opt/oracle/product/18c/dbhomeXE

#output: The Oracle base has been set to /opt/oracle

Oracle installation or connection related Error while running Ora2pg.

If you encounter exceptions related to the installation or connection of the Oracle database while running Ora2pg, ensure that the Oracle home is correctly exported. You can re-run the following command to export the Oracle home again.

#set ORACLE_HOME

export ORACLE_SID=XE

export ORAENV_ASK=NO

. /opt/oracle/product/18c/dbhomeXE/bin/oraenv

ORACLE_HOME = [] ? /opt/oracle/product/18c/dbhomeXE

#output: The Oracle base has been set to /opt/oracle

Oracle Sample Schema HR failed to load.

Change all embedded paths to match the working directory before running hr_main.sql to load the sample schema.

https://github.com/oracle-samples/db-sample-schemas#23-change-all-embedded-paths-to-match-your-working-directory

SP2-0310: unable to open file "hr_main.sql"

Make sure you provide the correct path when executing hr_main.sql. Change directory to db-sample-schemas, run echo "$PWD/human_resources/hr_main.sql", and copy the printed path.

Further Reads

Ora2pg Documentation

https://ora2pg.darold.net/documentation.html

Sizing guidelines

Cloud SQL sizing requires thorough testing and measurement against application performance SLAs before production. Key drivers for any database instance sizing are the IOPS and throughput requirements. The Cloud SQL provisioning console page provides an overview of the IOPS and throughput available under different instance configurations, which include:

  • vCPU
  • Memory
  • Storage Type
  • Storage Capacity.

Cloud SQL also supports recommenders that flag instances that are over- or under-provisioned.

Limitations

  • As mentioned earlier, Ora2pg only supports bulk load, so if you are looking for a CDC-based approach with minimal downtime you should choose another tool such as DMS. You can still use Ora2pg for assessment and schema conversion.
  • While Ora2Pg does its best to automate the conversion of your Oracle database to PostgreSQL, some manual work is still required. Specifically, the PL/SQL code generated for Oracle-specific functions, procedures, packages, and triggers must be reviewed and modified to align with PostgreSQL syntax. For helpful suggestions on porting Oracle PL/SQL code to PostgreSQL PL/pgSQL, you can refer to this resource.
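As an illustration of the kind of manual rewrite involved, the snippet below prints the same trivial routine in Oracle PL/SQL and in PostgreSQL PL/pgSQL. The get_bonus function is hypothetical, not part of the HR schema:

```shell
# Write a side-by-side Oracle vs PostgreSQL example to a scratch file.
cat > /tmp/plsql_port_demo.sql <<'EOF'
-- Oracle PL/SQL
CREATE OR REPLACE FUNCTION get_bonus(p_sal NUMBER) RETURN NUMBER IS
BEGIN
  RETURN NVL(p_sal, 0) * 0.1;
END;
/

-- PostgreSQL PL/pgSQL: RETURN NUMBER becomes RETURNS numeric, NVL becomes
-- COALESCE, and the body is wrapped in a $$ ... $$ dollar-quoted block.
CREATE OR REPLACE FUNCTION get_bonus(p_sal numeric) RETURNS numeric AS $$
BEGIN
  RETURN COALESCE(p_sal, 0) * 0.1;
END;
$$ LANGUAGE plpgsql;
EOF
cat /tmp/plsql_port_demo.sql
```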
