InfoSphere Optim

Optim Data Growth "Archive File Data Byte Count" calculation.

  • 1.  Optim Data Growth "Archive File Data Byte Count" calculation.

    Posted Thu February 28, 2019 01:56 PM
I am in the process of estimating our next two years of Optim Data Growth volume license usage for native Optim connectors ("relational databases").

    During our validation testing of Optim 11.3 FP7, I noticed some unexpected results in the "Archive File Data Byte Count" that is written to the Archive Process Report. The same value is also written to the:
    • PSTARCVFILE*,
    • PSTARCVTBL*, and
    • PSTOBJ* tables,
    • or, better yet, to the XML content in the OPTIM_TRANSACTION_CONTENT table.
Has anyone come across documentation, or does anyone have experience with, how this calculation works? For example, is it based on compressed or uncompressed data sources? And why are there negative values in the results? I am unable to reliably use the information in these tables to determine or predict how much licensing I should purchase.

It appears as though FP7 addressed the OPTIM_DATA_CONSUMPTION_VIEW calculation issues, but I'm still seeing very large differences between the actual source size of the database objects and what Optim calculates and writes to the Process Report.
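    For reference, here is the kind of quick check I've been running to surface the bad rows. The view and column names (OPTIM_DATA_CONSUMPTION_VIEW, DATA_BYTE_COUNT) are just what exists in my directory, so treat this as a sketch and adjust for your install:

        -- Quick check: list archive rows whose byte count has gone negative
        SELECT *
        FROM   optim_data_consumption_view t
        WHERE  t.data_byte_count < 0;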

    P.S.
    Yes, I have a ticket open for this.


    ------------------------------
    Danny Lankford
    Advanced IT Specialist
    3M
    St Paul MN
    ------------------------------

    #InfoSphereOptim
    #Optim


  • 2.  RE: Optim Data Growth "Archive File Data Byte Count" calculation.

    Posted Fri March 01, 2019 04:13 AM

It is a two's-complement overflow, and I use the SQL snippet below to get valid values. It returns the corrected value, the same figure in MB, and the raw negative number in case there is any confusion.

     

        -- Corrected count: undo the 32-bit two's-complement overflow
        CASE
            WHEN to_number(t.data_byte_count) < 0 THEN
                power(2, 32) + to_number(t.data_byte_count) + 1
            ELSE
                to_number(t.data_byte_count)
        END data_byte_count,
        -- The same correction expressed in MB, rounded to four decimals
        CASE
            WHEN to_number(t.data_byte_count) < 0 THEN
                round((power(2, 32) + to_number(t.data_byte_count) + 1) / power(1024, 2), 4)
            ELSE
                round(to_number(t.data_byte_count) / power(1024, 2), 4)
        END data_byte_count_mb,
        -- The raw stored value, kept so the overflow is visible side by side
        t.data_byte_count          AS data_byte_count_neg,
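    Since the point of all this is license sizing, here is how I roll the corrected values up to one total. The FROM clause is the part you may need to change; I am reading from the OPTIM_DATA_CONSUMPTION_VIEW you mentioned, so treat this as a sketch against my directory rather than a general recipe:

        -- Sketch: total corrected bytes across all archives, expressed in GB
        SELECT round(sum(CASE
                             WHEN to_number(t.data_byte_count) < 0 THEN
                                 power(2, 32) + to_number(t.data_byte_count) + 1
                             ELSE
                                 to_number(t.data_byte_count)
                         END) / power(1024, 3), 2) AS total_gb
        FROM   optim_data_consumption_view t;  -- adjust to your directory view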

    Thanks,

    Ed Lipson

    Director

    BNY Mellon | 2 Hanson Pl AIM 111-0800 | Brooklyn NY 11217

    Information Lifecycle Management

    T 718.315.4763 |  F 724.540.6622 | C 917.859.5180 | ed.lipson@bnymellon.com

     







  • 3.  RE: Optim Data Growth "Archive File Data Byte Count" calculation.

    Posted Fri March 01, 2019 03:52 PM
    Hey Ed, 

Thanks for the reply. I use a similar query as well, but I think my bigger questions for the forum are:
    1. Why are we getting the negative values in the DATA_BYTE_COUNT results?
    2. Is Optim calculating the source database objects based on the "source" compressed size or "source" uncompressed size? 


For example, I've run into a DB2 database that has compressed table spaces. The compression on the source tables (or table spaces) can save as much as 80%, depending on the data types therein. So do I pay for the compressed size or for the "Optim uncompressed calculation"?
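    For what it's worth, here is a rough way to see the size of that gap on the DB2 side. PCTPAGESSAVED in the catalog is the percentage of pages saved by compression (it is -1 until RUNSTATS has been run), and whether it is the right figure to compare against Optim's count is my assumption, not something Support confirmed:

        -- DB2 LUW catalog: per-table compression savings (rough check only)
        SELECT tabschema,
               tabname,
               pctpagessaved                    -- % of pages saved by compression
        FROM   syscat.tables
        WHERE  compression IN ('R', 'B')        -- row-compressed / row+value
        ORDER  BY pctpagessaved DESC;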

Support sent me this explanation; it would appear that the calculation uses the uncompressed source size.



    ------------------------------
    Danny Lankford
    Advanced IT Specialist
    3M
    St Paul MN
    ------------------------------



  • 4.  RE: Optim Data Growth "Archive File Data Byte Count" calculation.

    Posted Mon March 04, 2019 04:23 AM

To know what you will pay, you should talk to your IBM contracts people; every contract is different.

     

     

     

    Thanks,

    Ed Lipson

    Director

    BNY Mellon | 2 Hanson Pl AIM 111-0800 | Brooklyn NY 11217

    Information Lifecycle Management

    T 718.315.4763 |  F 724.540.6622 | C 917.859.5180 | ed.lipson@bnymellon.com

     







  • 5.  RE: Optim Data Growth "Archive File Data Byte Count" calculation.

    Posted Wed March 06, 2019 01:04 PM
    Hi,
First, thanks for pointing out the negative value on the "archive byte count". I've asked development to look into it. Hopefully it is only a conversion error while creating the Optim report.

Second, on the question of how the bytes are counted: yes, the data byte count is the uncompressed volume of user data. The linked web page gives more detail on how much space each data type takes.
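    If you want a back-of-the-envelope cross-check, row count times average row length from the catalog gets you in the neighborhood. To be clear, this is only a rough sketch and not the exact formula Optim uses; also, on compressed tables AVGROWSIZE reflects the compressed row, so it will understate the uncompressed figure there:

        -- Rough estimate of user-data volume on DB2 LUW: rows * average row length
        SELECT tabschema,
               tabname,
               card * avgrowsize AS est_bytes   -- CARD and AVGROWSIZE come from RUNSTATS
        FROM   syscat.tables
        WHERE  type = 'T'
          AND  card >= 0;                       -- -1 means statistics not collected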

    Peter

    ------------------------------
    Peter Costigan
    Offering Manager, Optim Solutions
    IBM
    San Jose CA
    ------------------------------



  • 6.  RE: Optim Data Growth "Archive File Data Byte Count" calculation.

    Posted Mon March 11, 2019 01:14 AM
    Hi Dann,

    I am trying to understand the problem more clearly. May I know whether you are looking at individual records in optim_data_consumption_view that have data_byte_count < 0 values? None of my test cases gives me any such rows, and I have tried Oracle, DB2, and SQL Server.

    Basically, I am looking for some steps to reproduce the problem.

    Thanks,
    Tulasi

    ------------------------------
    Tulasi Uppu
    ------------------------------



  • 7.  RE: Optim Data Growth "Archive File Data Byte Count" calculation.

    Posted Mon March 11, 2019 01:17 AM
    Sorry for the typo in your name, I mean Danny instead of Dann.

    ------------------------------
    Tulasi Uppu
    ------------------------------



  • 8.  RE: Optim Data Growth "Archive File Data Byte Count" calculation.

    Posted Mon March 11, 2019 09:01 AM
    Tulasi,

    Please read the entire thread. This was a known issue that was corrected in FP7, according to Support. See the Fix Pack 7 fix list PDF.

    There were two questions in the post. Peter answered one; the FP7 fix list answered the other.

    Thanks,

    ------------------------------
    Danny Lankford
    Advanced IT Specialist
    3M
    St Paul MN
    ------------------------------



  • 9.  RE: Optim Data Growth "Archive File Data Byte Count" calculation.

    Posted Wed March 13, 2019 02:42 PM
    Hi Danny,

    I ran into the same problem with Optim 11.3 FP3, and after upgrading to 11.3 FP6a the new archives come up fine in the data consumption view, with positive values.

    I still have lots of archives that were executed on Fix Pack 3 and are still showing negative values even after the update to Fix Pack 6, and rerunning the archive is not an option since the original data source has already been decommissioned. At the end of the day, I have no way to tell IBM how much data was archived for license purposes. Are you also running into this problem?
    I have had a PMR open for a while now without resolution.
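    In the meantime, the best I can do for those old Fix Pack 3 archives is report corrected numbers using Ed's overflow workaround from earlier in the thread; a sketch along these lines, with the same caveat that the view and column names need to match your directory:

        -- One-off report: corrected byte counts for archives still showing negatives
        SELECT t.*,
               power(2, 32) + to_number(t.data_byte_count) + 1 AS corrected_byte_count
        FROM   optim_data_consumption_view t
        WHERE  to_number(t.data_byte_count) < 0;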

    ------------------------------
    Elder Lira
    ------------------------------



  • 10.  RE: Optim Data Growth "Archive File Data Byte Count" calculation.

    Posted Wed March 13, 2019 03:01 PM
    Elder, 

    Yes, I'm still seeing negative values for "data_byte_size" or "checksum" in certain tables even after FP7. I also have a ticket open for this. 

    The OPTIM_AUDIT_RECORDS table, in conjunction with the OPTIM_TRANSACTION_CONTENT table, makes up the growth view. The OPTIM_AUDIT_RECORDS table does not have negative values.

    Some of the other tables still have negative values in them even after FP7.

    For example, I don't think those CHECKSUM values in the PST* tables are supposed to be negative.
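    To show what I mean, a quick scan like the one below turns them up. This is a sketch only: the PST table names carry a suffix that varies by Optim version and schema, so substitute your actual directory table for the placeholder name:

        -- Placeholder table name: replace PSTARCVFILE with your versioned PST table
        SELECT *
        FROM   pstarcvfile
        WHERE  checksum < 0;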



    ------------------------------
    Danny Lankford
    Advanced IT Specialist
    3M
    St Paul MN
    ------------------------------