InfoSphere Optim

 View Only
  • 1.  Issue with columns having ORA_NUMBER Data types

    Posted Mon April 27, 2020 10:38 AM
    Hi All,

    When we deal with columns having the NUMBER data type with no precision and zero scale (Oracle DB), the Optim Extract job successfully extracts the data only up to 30 bytes. The data from the 31st byte through the 38th was changed to ZERO instead of being kept as in the source.

    Column Def:
    NUMBER(*, 0) NOT NULL

    Data samples before and after Extract
    Sample Source Data > 30 Bytes  : 12345670551607123433666116255796
    Data in the Extract file > 30 Bytes: 12345670551607123433666110000000

    Sample Source Data > 30 Bytes  : 123456705516071234336661162
    Data in the Extract file > 30 Bytes: 123456705516071234336661100

    Sample Source Data <= 30 Bytes  : 123456705516071234
    Data in the Extract file <= 30 Bytes: 123456705516071234

    Our expectation is to get the source data as-is in the Extract file, mask a couple of other columns in the same table, and finally load into the target without any changes to the above NUMBER column.
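To pin down exactly where an extracted value diverges from the source, a digit-by-digit comparison of the two character strings can help. A minimal sketch in Python, using the sample values quoted above:

```python
def first_divergence(source: str, extracted: str) -> int:
    """Return the 0-based index of the first differing character,
    or -1 if the two strings are identical."""
    if len(source) != len(extracted):
        return min(len(source), len(extracted))
    for i, (s, e) in enumerate(zip(source, extracted)):
        if s != e:
            return i
    return -1

# Sample pair from this post: the values diverge at index 25 (the 26th digit)
print(first_divergence("12345670551607123433666116255796",
                       "12345670551607123433666110000000"))  # -> 25

# The short value survives the extract unchanged
print(first_divergence("123456705516071234", "123456705516071234"))  # -> -1
```

Running this against the posted samples shows the divergence starting at the same digit position in both corrupted values, which is a useful data point to include in a support case.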

    Has anyone faced a similar issue? If yes, please advise us on the solution.

    Thank you

    ------------------------------
    Harshavardhan Peddireddy
    ------------------------------

    #InfoSphereOptim
    #Optim


  • 2.  RE: Issue with columns having ORA_NUMBER Data types

    Posted Mon April 27, 2020 10:41 AM
    Interesting. Following. I assume that you have a ticket open with support as well, right?

    ------------------------------
    Danny Lankford
    3M - IT Manager
    ------------------------------



  • 3.  RE: Issue with columns having ORA_NUMBER Data types

    Posted Mon April 27, 2020 12:00 PM
    Hi Danny,
    Yes, we have opened a case with IBM.

    ------------------------------
    Harshavardhan Peddireddy
    ------------------------------



  • 4.  RE: Issue with columns having ORA_NUMBER Data types

    Posted Tue April 28, 2020 12:24 PM
    Hi - I created a temporary table mimicking your data and then created a process to extract/insert the data.  Once inserted into a new table, I did a table-to-table comparison.  The comparison showed all three rows as equal, so is there something else you are doing between the extract and the insert?

    -- recreate a clean test schema
    drop user ovug cascade;
    create user ovug identified by passw0rd;
    grant dba to ovug;
    -- table with a NUMBER(*,0) column matching the reported definition
    create table ovug.testtbl (keyval varchar2(5), numval number(*,0) not null);
    insert into ovug.testtbl (keyval, numval) values ('00001',12345670551607123433666116255796);
    insert into ovug.testtbl (keyval, numval) values ('00002',123456705516071234336661162);
    insert into ovug.testtbl (keyval, numval) values ('00003',123456705516071234);

    I am running Optim 11.3.0.7 and Oracle 12.2.
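For anyone without the Optim compare feature at hand, the same table-to-table check can be approximated outside the database: fetch both tables as key/value rows (e.g. via a driver such as python-oracledb, not shown here) and diff them. A minimal sketch in Python, with the rows hard-coded in place of a live fetch (the values mirror the test table above):

```python
def compare_tables(source_rows: dict, target_rows: dict) -> list:
    """Return (key, source_value, target_value) tuples for rows that differ."""
    diffs = []
    for key, src_val in source_rows.items():
        tgt_val = target_rows.get(key)
        if tgt_val != src_val:
            diffs.append((key, src_val, tgt_val))
    return diffs

# Rows mirroring the test table; identical on both sides, so no diffs expected
source = {"00001": 12345670551607123433666116255796,
          "00002": 123456705516071234336661162,
          "00003": 123456705516071234}
target = dict(source)
print(compare_tables(source, target))  # -> []
```

If the extract/insert path corrupts the NUMBER column, the corrupted keys show up in the returned list along with both values, which makes the before/after evidence easy to collect.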

    ------------------------------
    JIMMY WILSON
    ------------------------------



  • 5.  RE: Issue with columns having ORA_NUMBER Data types

    Posted Tue April 28, 2020 03:02 PM
    Hi Jimmy,
    Thanks for your response.
    We are using Optim version 11.3.0.6 and Oracle 12.1.0.2.0.
    We are just trying Extract and Insert.

    ------------------------------
    Harshavardhan Peddireddy
    ------------------------------



  • 6.  RE: Issue with columns having ORA_NUMBER Data types

    Posted Tue May 05, 2020 09:24 AM
    Edited by System Fri January 20, 2023 04:29 PM
    Hi All,

    FYI
    The Optim Product Development team created an iFix for this issue (for both 11.3.0.6 and 11.3.0.8). If any of you face a similar issue, get in touch with IBM support for the iFix.

    RCA from Optim Development Team :

    This logic problem is actually a length-control error in the Optim code path that converts an Oracle NUMBER value into character display format. So the Optim features to browse/edit the live table (XF/AF) and the generation of the sqlldr input files are all affected by this error, since they all pass through the same code path to convert the number to character.
    iFix 021 corrects this on 11.3.0.8.
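Purely as an illustration of how a length-control error in a number-to-character conversion can produce the symptom reported above: the sketch below is NOT Optim's actual code, and the 25-digit cutoff is an assumption inferred from the samples in the original post, where the digits diverge from the 26th position onward.

```python
def buggy_number_to_char(value: int, keep_digits: int = 25) -> str:
    """Hypothetical conversion that copies only the first `keep_digits`
    significant digits into the output buffer and zero-fills the rest,
    mimicking the truncation pattern reported in this thread."""
    digits = str(value)
    if len(digits) <= keep_digits:
        return digits  # short values survive intact, as in the samples
    return digits[:keep_digits] + "0" * (len(digits) - keep_digits)

# Reproduces the before/after pair from the original post
print(buggy_number_to_char(12345670551607123433666116255796))
# -> 12345670551607123433666110000000
```

A correct conversion would size its character buffer for the full 38 significant digits an Oracle NUMBER can hold; a buffer sized for fewer digits silently drops the tail exactly as shown here.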

    ------------------------------
    Harshavardhan Peddireddy
    ------------------------------