IBM Verify


Performance improvements on AAC DB2 Cleanup Scripts on large AAC deployments

  • 1.  Performance improvements on AAC DB2 Cleanup Scripts on large AAC deployments

    Posted Thu April 10, 2025 09:48 PM

    We have a large and very busy AAC deployment using DB2 as the HVDB to store entries in the OAUTH20_TOKEN_CACHE and OAUTH20_TOKEN_EXTRA_ATTRIBUTE tables.

    What I have observed is that during peak times OAUTH20_TOKEN_EXTRA_ATTRIBUTE can grow to over 15M rows and OAUTH20_TOKEN_CACHE to close to 1M rows. When performing the cleanup, even when we limit the number of rows to 2000 or 3000, DB2 can hit lock contention and impact production response times.

    We are using the manual cleanup steps documented here: OAuth token clean-up

    From that page: "When performing clean-up on the OAUTH20_EXTRA_ATTRIBUTE table, a STATE_ID is deleted when it is not found in the OAUTH20_TOKEN_CACHE table. Entries are deleted from the OAUTH20_TOKEN_CACHE table when they're expired."

    I see a major flaw in this approach: the "WHERE NOT EXISTS" effectively does a join from OAUTH20_TOKEN_EXTRA_ATTRIBUTE back to OAUTH20_TOKEN_CACHE across many millions of rows.
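
    For reference, the documented pattern amounts to something like the following (a sketch only; the exact statement in the documentation may differ):

        -- Remove extra attributes whose STATE_ID no longer exists in the cache.
        -- With ~15M rows in OAUTH20_TOKEN_EXTRA_ATTRIBUTE, this anti-join has to
        -- probe OAUTH20_TOKEN_CACHE for every candidate row.
        DELETE FROM OAUTH20_TOKEN_EXTRA_ATTRIBUTE EA
        WHERE NOT EXISTS (
            SELECT 1
            FROM OAUTH20_TOKEN_CACHE TC
            WHERE TC.STATE_ID = EA.STATE_ID
        );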

    As a result, DB2 is very slow; the response time on this delete can sometimes be upwards of 2 minutes.

    Additionally, OAUTH20_TOKEN_CACHE can have multiple rows that share the same STATE_ID: both access and refresh tokens are stored under their own TOKEN_ID but share a STATE_ID, which refers to the claims stored in OAUTH20_TOKEN_EXTRA_ATTRIBUTE.
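
    You can see this duplication with a quick query (a sketch; run it against your own HVDB):

        -- STATE_IDs that have more than one token row (e.g. access + refresh).
        SELECT STATE_ID, COUNT(*) AS TOKEN_ROWS
        FROM OAUTH20_TOKEN_CACHE
        GROUP BY STATE_ID
        HAVING COUNT(*) > 1;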

    In my head, what needs to happen (and what the SQL I have written does) is:

    • Clean up OAUTH20_TOKEN_CACHE until only one expired entry is left for a given STATE_ID.
    • Once there is a single entry, create a new temporary table of expired STATE_ID values and delete from both OAUTH20_TOKEN_CACHE and OAUTH20_TOKEN_EXTRA_ATTRIBUTE based on the STATE_ID key.

    The first step is where most of the complexity lies, as you don't want to remove all rows: you need to make sure at least one row is left. So I wrote some SQL to do the task (attached) and it seems to work well enough. I just wanted to know whether this is a bad approach, and whether there is existing manual / custom cleanup SQL anywhere else.
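
    Roughly, the second step looks like this (a sketch only; the actual SQL is in the attached DB2AACCleanup.sql.txt, and the temp table name, column size, and expiry test of DATE_CREATED + LIFETIME are assumptions about the schema):

        -- Session-scoped scratch table for the expired STATE_IDs.
        DECLARE GLOBAL TEMPORARY TABLE SESSION.EXPIRED_STATE
            (STATE_ID VARCHAR(64) NOT NULL)
            ON COMMIT PRESERVE ROWS NOT LOGGED;

        -- Collect STATE_IDs whose token rows have all expired, in small batches.
        INSERT INTO SESSION.EXPIRED_STATE (STATE_ID)
        SELECT STATE_ID
        FROM OAUTH20_TOKEN_CACHE
        GROUP BY STATE_ID
        HAVING MAX(DATE_CREATED + LIFETIME SECONDS) < CURRENT TIMESTAMP
        FETCH FIRST 2000 ROWS ONLY;

        -- Delete by key from both tables; no join back across millions of rows.
        DELETE FROM OAUTH20_TOKEN_CACHE
        WHERE STATE_ID IN (SELECT STATE_ID FROM SESSION.EXPIRED_STATE);

        DELETE FROM OAUTH20_TOKEN_EXTRA_ATTRIBUTE
        WHERE STATE_ID IN (SELECT STATE_ID FROM SESSION.EXPIRED_STATE);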

    But I think this is the correct way to go, as then only expired records are removed, and duplicate entries that have an expired value in OAUTH20_TOKEN_CACHE are removed until only one expired entry is left.



    ------------------------------
    Peter Lambrechtsen
    ------------------------------

    Attachment(s)

    DB2AACCleanup.sql.txt (3 KB)


  • 2.  RE: Performance improvements on AAC DB2 Cleanup Scripts on large AAC deployments

    Posted Fri April 11, 2025 04:16 AM

    Hi

    There are multiple paths to this:

    1. Do the cleanup during off-production hours, for example from 2 to 4 AM.
    2. If the cleanup runs at frequent intervals, apply an offset of 15 minutes so only entries that expired at least that long ago are removed.
    3. You can avoid a join by just removing the entries from both tables based on STATE_ID.
    4. Later, during off-peak hours, check whether any extra attribute has a STATE_ID that is no longer in the token cache and remove those orphan entries (see the sketch after this list).
    5. db2pd, iostat, and vmstat output need to be collected continuously; support can use them to understand any issue related to CPU, disks, or DB locks.
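
    For points 2 and 4, roughly (the expiry test of DATE_CREATED + LIFETIME and the 15-minute offset are assumptions to adapt to your schema):

        -- Point 2: only remove cache entries that expired at least 15 minutes ago.
        DELETE FROM OAUTH20_TOKEN_CACHE
        WHERE DATE_CREATED + LIFETIME SECONDS < CURRENT TIMESTAMP - 15 MINUTES;

        -- Point 4, off peak: remove orphaned extra attributes whose STATE_ID is
        -- no longer present in the token cache.
        DELETE FROM OAUTH20_TOKEN_EXTRA_ATTRIBUTE EA
        WHERE NOT EXISTS (
            SELECT 1
            FROM OAUTH20_TOKEN_CACHE TC
            WHERE TC.STATE_ID = EA.STATE_ID
        );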


    ------------------------------
    Tushar
    ------------------------------