
backup tables

  • 1.  backup tables

    Posted Fri February 19, 2021 02:10 AM
    Hello All,

        Kindly advise: I have to drop 220 tables from a production environment, each table with around 1 million rows (300,000).
        Dropping these tables is part of a planned activity, so the backup will not run that day.

        The database in question has around 5K tables.

        So what type of backup should I take?

        Will UNLOAD work here?

    Amit Patel


  • 2.  RE: backup tables

    Posted Fri February 19, 2021 03:15 AM

    Hi Amit,

    I would prefer inserting the existing tables' contents into external tables that mirror the existing tables.

    UNLOAD is slow.

    HPL is an alternative, but it is much more complex to set up.
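    The external-table approach can be sketched roughly like this (table, column, and file names are hypothetical; check the Informix CREATE EXTERNAL TABLE documentation for the full syntax):

    ```sql
    -- Hypothetical example: unload table "orders" to a delimited flat file.
    -- Inserting into an external table writes rows to its data file,
    -- typically much faster than a dbaccess UNLOAD.
    CREATE EXTERNAL TABLE orders_ext
      ( order_id INTEGER  EXTERNAL CHAR(10),
        customer CHAR(30) EXTERNAL CHAR(30) )
    USING (
      FORMAT 'DELIMITED',
      DATAFILES ("DISK:/backup/orders.unl")  -- hypothetical path
    );

    INSERT INTO orders_ext SELECT * FROM orders;
    ```

    Reloading later is the reverse: INSERT INTO orders SELECT * FROM orders_ext.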

    Eric Vercelletto
    Pont l'Abbé, France
    +33 626 52 50 68

    Disclaimer: My own opinions are my own opinions and do not reflect on the IIUG, nor any other organization with which I am associated either explicitly, implicitly, or by inference. Neither do those opinions reflect those of other individuals affiliated with any entity with which I am affiliated nor those of the entities themselves.

  • 3.  RE: backup tables

    Posted Fri February 19, 2021 03:35 AM
    Dear Eric,

         Thanks for the reply.

      In the example below, do I need to declare the same columns as the existing table? And will this example insert the data into the new external table and also create a .unl file for it? Is that correct?

    Kindly advise.

     CREATE EXTERNAL TABLE emp_ext
       ( name CHAR(18) EXTERNAL CHAR(18),
         address VARCHAR(40) EXTERNAL CHAR(40),
         empno INTEGER EXTERNAL CHAR(6) )
     USING (
       FORMAT 'FIXED',
       DATAFILES ("DISK:/tmp/emp.unl")  -- hypothetical data file path
     );

     INSERT INTO employee SELECT * FROM emp_ext;


  • 4.  RE: backup tables

    Posted Fri February 19, 2021 04:12 AM

    Hi again Amit,


    Best if you look at the doc and these examples, which are very good for what you want to do.







    Eric Vercelletto
    Data Management Architect and Owner / Begooden IT Consulting
    KandooERP Founder and Community Manager
    IBM Champion 2013,2014,2015,2016,2017,2018,2019,2020

    Tel:     +33(0) 298 51 3210
    Mob : +33(0)626 52 50 68
    skype: begooden-it



  • 5.  RE: backup tables

    Posted Fri February 19, 2021 04:36 AM
    I'd recommend looking at Using the SAMEAS Clause.
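    With SAMEAS, the external table inherits the column layout of the source table, so you do not have to retype 220 column lists. A minimal sketch (names and paths are hypothetical):

    ```sql
    -- SAMEAS copies the column definitions of mytab into the
    -- external table, avoiding a hand-written column list.
    CREATE EXTERNAL TABLE mytab_ext
      SAMEAS mytab
    USING (
      FORMAT 'DELIMITED',
      DATAFILES ("DISK:/backup/mytab.unl")  -- hypothetical path
    );

    -- Unload the rows to the flat file.
    INSERT INTO mytab_ext SELECT * FROM mytab;
    ```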

    Andreas Legner

  • 6.  RE: backup tables

    Posted Fri February 19, 2021 07:51 AM

    If you just want an historical archive of the 220 dropped tables, you should, as you suggest, unload their contents into flat files and copy them to external media for safety. I like Eric's suggestion to use external tables. They are MUCH faster than a dbaccess UNLOAD. If you want, myschema can create the unload scripts for you:

    myschema -d mydatabase --myexport-scripts -t 'matches_pattern_for_tables_to_process*' /dev/null

    That will produce two files for each table, databasename.tablename.myexport.sql and databasename.tablename.myimport.sql

    The *.myexport.sql script will create an external table for the table and unload the data to a file named with the first five letters of the table name followed by the tabid as a five-digit number. The file path will be the directory you were in when you ran myschema, so if you want the files written to a specific location, you will either have to cd to that location before running myschema, edit the scripts after they are created, or move the files after you export the data. Obviously the first option is the easiest.
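    An illustrative shape of a generated export script (this is a sketch only; the actual output of myschema may differ, and the names follow the "first five letters plus five-digit tabid" convention described above):

    ```sql
    -- Illustrative mydatabase.mytab.myexport.sql; actual myschema
    -- output may differ. "mytab00123.unl" is a hypothetical file name.
    CREATE EXTERNAL TABLE mytab_ext
      SAMEAS mytab
    USING (
      FORMAT 'DELIMITED',
      DATAFILES ("DISK:./mytab00123.unl")
    );

    INSERT INTO mytab_ext SELECT * FROM mytab;

    DROP TABLE mytab_ext;
    ```

    The matching *.myimport.sql script reverses the direction, inserting from the external table back into the base table.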

    Art S. Kagel, President and Principal Consultant
    ASK Database Management Corp.