
Apptio Data Ingestion

Replies:

Apptio Community Member (Fri February 09, 2018 12:05 PM)
Jenny Franklin (Sat February 03, 2018 10:18 AM)
Apptio Community Member (Tue February 06, 2018 08:19 AM)
  • 1.  Apptio Data Ingestion

    Posted Fri February 02, 2018 04:23 PM

    Hello TBM Connect community!  I am a new TBM member and have begun trying to automate processes within our TBM office.  Luckily for me, my TBM team, @Travis Cole and @Keith Edwards, has already begun automating data ingestion by taking raw data from our many data sources and making modifications via the Apptio tool.  However, I want to take this a step further and automate the transfer of data from our internal systems and databases to the DataLink tool.  That search has taken me through quite a few TBM Connect discussions about DataLink, but I have not found any on how the community is automating the data transfer to DataLink.  I understand the DataLink tool is great for anyone with externally facing infrastructure, and for those users this may not be the right discussion, as DataLink has many integrations with different systems and databases.  However, for those like me who work in an environment where getting an external connection to a third-party application can be like launching a team to Mars, please join this conversation!  Below is my strategy for conducting this data transfer to Apptio from our internal environment in an automated and secured fashion, and I would be very eager to hear how you're accomplishing this task.

     

    Automated Data Ingestion Plan:

    I have broken this connectivity into three 'connections': internal data source, internal SFTP server, and Apptio/DataLink.

     

    Step 1: Identify your data sources and pull data

    Types: SQL, Oracle, Applications with API calls

    Strategy: I have decided to use a Windows R12 server with scheduled tasks that run PowerShell: ODBC calls with SQL queries against the identified databases, or API calls for some internal applications like Pivotal Cloud Foundry.
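For the API-based sources, the pull can follow the same pattern as the ODBC path. The sketch below is illustrative only; the endpoint, token file, and response shape are hypothetical placeholders, not Pivotal Cloud Foundry's actual API:

```powershell
# Sketch of the API variant of Step 1 (hypothetical endpoint and response shape).
function ConvertTo-FlatRows {
    # Flatten an API response of the form @{ resources = @(@{ entity = ... }) }
    # into the row objects that Export-Csv expects.
    param($Response)
    $Response.resources | ForEach-Object { $_.entity }
}

# $headers  = @{ Authorization = "Bearer $(Get-Content "$PSScriptRoot\apitoken.pwd")" }
# $response = Invoke-RestMethod -Uri "https://pcf.internal.example/v2/apps" -Headers $headers
# ConvertTo-FlatRows $response | Export-Csv "$PSScriptRoot\Transfer\pcf_apps.csv" -NoTypeInformation
```

The commented lines show where Invoke-RestMethod would replace the ODBC connection; the flattening logic is the only part that varies per source system.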

     

    Step 2:  Send data to SFTP Server

    Type: A Linux server used to house the data as it is sent from the data source, to be securely picked up by the DataLink connector.

    Strategy: I have decided to use PowerShell again, calling PuTTY's pscp.exe to securely transfer the data from our application server to the SFTP server drop location using key authentication.

     

    Step 3: Set up the DataLink connector for Apptio ingestion

    Type: SFTP DataLink connector

    Strategy: I have set this up with dual-factor authentication to our SFTP server to meet our security requirements, and with a separate connector for each data set.
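Concretely, each connector amounts to a handful of settings like the following (the field names are illustrative, not the exact DataLink UI labels; host and path are placeholders):

```
# One SFTP connector per data set (illustrative settings)
Host:         SERVER.com           # the internal SFTP server from Step 2
Port:         6522                 # matches the -P port used by pscp in Step 2
Auth:         SSH key + second factor, per our security requirements
Remote path:  /FILEPATH/
File filter:  APPDATFL.<dataset>*  # the prefix added by the Step 2 script
```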

     

    Sample PowerShell Code:

    Step 1:

    ## Must run PowerShell from the following path, as the x86 version is required for ODBC connectivity: C:\Windows\SysWOW64\WindowsPowerShell. The ODBC connection must be set up and its DSN name entered below prior to executing this automated data pull ##

    ###############################################################################
    # NAME: ODBC_Connection_Test.ps1
    # AUTHOR: Schneider, Robert
    # DATE: 01/29/2018
    #
    # This script uses ODBC to call datatables from databases.
    #
    # VERSION HISTORY:
    # 1.0 01/29/2018 Initial Version
    ###############################################################################


    #############################
    #Setting required variables #
    #############################
    $PSScriptRoot = Split-Path $MyInvocation.MyCommand.Path -Parent # set the current path of the script (provided automatically in PowerShell 3.0+)
    $pwdFile = "$PSScriptRoot\pwd.pwd"          # path of the encrypted password file (renamed so it does not shadow the automatic $pwd variable)
    $userFile = "$PSScriptRoot\username.pwd"    # path of the encrypted username file

    if ((Test-Path $pwdFile) -eq $false -or (Test-Path $userFile) -eq $false){ # check for encrypted username/password
        $credential = Get-Credential -Message "Please enter your username/password for your DB connection" # get username/password for the DB
        $credential.UserName | ConvertTo-SecureString -AsPlainText -Force | ConvertFrom-SecureString > "$userFile" # encrypt the username
        $credential.Password | ConvertFrom-SecureString > "$pwdFile"                                              # encrypt the password
    }

    $password = Get-Content "$pwdFile" # read the encrypted password
    $PlainTextPassword = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR((ConvertTo-SecureString $password))) # decrypt the password

    $username = Get-Content "$userFile" # read the encrypted username
    $user = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR((ConvertTo-SecureString $username))) # decrypt the username

    $date = Get-Date -Format MMM-yy # get the date in the format used by the SQL query
    $upperdate = $date.ToUpper()    # modify the date to uppercase for the SQL query
    $ConnectionString = "DSN=ODBCCONNECTIONNAME Test;UID=$user;Pwd=$PlainTextPassword" # connection string for ODBC connectivity
    $conn = New-Object System.Data.Odbc.OdbcConnection($ConnectionString)              # connection object for ODBC

    #############################
    #SQL query set to a variable#
    #############################
    $sql = "
    SELECT TEST_DETAIL.POSTED_DATE
    FROM TEST_DETAIL
    WHERE (((TEST_DETAIL.PERIOD)='$upperdate'));
    " # sample SQL statement

    #############################
    #Executing SQL CMD Via ODBC #
    #############################
    $conn.Open() # open the ODBC connection
    $cmd = New-Object System.Data.Odbc.OdbcCommand($sql,$conn) # initializes a new OdbcCommand from the query text and the OdbcConnection object
    $da = New-Object System.Data.Odbc.OdbcDataAdapter($cmd)    # data adapter for the ODBC command
    $dt = New-Object System.Data.DataTable                     # .NET DataTable to hold the results
    $null = $da.Fill($dt)                                      # adds or refreshes rows in the DataTable (inherited from DbDataAdapter)
    $conn.Close() # close the ODBC connection

    $dt | Export-Csv C:\users\$env:USERNAME\Desktop\test.csv -NoTypeInformation # export the data to a CSV on the desktop

     

    Step 2:

    # REQUIRES pscp.exe, a PuTTY file-transfer executable; enter its path in the script below #

    ###############################################################################
    # NAME: Apptio_PSCP_FileGateway.ps1
    # AUTHOR: Schneider, Robert
    # DATE: 01/23/2018
    #
    # This script transfers all files in the path "$PSScriptRoot\" to the SFTP server.
    #
    # VERSION HISTORY:
    # 1.0 01/30/2018 Initial Version (TEST)
    ###############################################################################
    $PSScriptRoot = Split-Path $MyInvocation.MyCommand.Path -Parent # set the current path of the script
    $PSEmailServer = "smtpserver.com" # SMTP server setting
    $filenames = Get-Item "$PSScriptRoot\Transfer\*" | Select-Object -ExpandProperty FullName # list of all file names in the drop folder named Transfer
    $logfiledate = Get-Date -Format 'MMddyyyyss'

    foreach ($filename in $filenames){ # hand each file name and path to PSCP (a PuTTY program for transferring files via SSH/SFTP)

        $check = $null # used by the logging portion (excluded below)

        $plainname = [System.IO.Path]::GetFileName("$filename") # .NET call to strip the path from the filename

        $plainname_mod = "APPDATFL." + [System.IO.Path]::GetFileName("$filename") # strip the path from the filename and add the prefix needed for FileGateway

        if ((Get-Item $filename).Length -gt 0kb){ # check for file content to avoid transferring an empty file
            & "filepath_to\pscp.exe" -P 6522 -i "\\SOURCEKEY.ppk" $filename "USERNAME@SERVER.com:/FILEPATH/$plainname_mod" >> "$PSScriptRoot\DataLink_Transfer_Logs\log_$logfiledate.txt"
        }
    }

     

    I have excluded my logging and the removal of files once they are transferred.  If you would like that portion of the PowerShell, let me know and I would be happy to share.  I look forward to your discussion!
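Until then, here is a bare-bones sketch of what that post-transfer step could look like; it is not my production code, the log format is a placeholder, and success is assumed to be keyed off pscp's exit code ($LASTEXITCODE):

```powershell
# Bare-bones sketch of post-transfer logging and cleanup (placeholder log format).
function Complete-Transfer {
    param(
        [string]$Path,     # file that was just handed to pscp.exe
        [int]$ExitCode,    # pscp exit code ($LASTEXITCODE); 0 means success
        [string]$LogFile   # destination log file
    )
    $stamp = Get-Date -Format 'yyyy-MM-dd HH:mm:ss'
    if ($ExitCode -eq 0) {
        Add-Content -Path $LogFile -Value "$stamp SUCCESS $Path"
        Remove-Item -Path $Path   # remove only after a confirmed transfer
        return $true
    }
    Add-Content -Path $LogFile -Value "$stamp FAILED $Path (exit $ExitCode)"
    return $false
}
```

Calling Complete-Transfer right after the pscp line inside the foreach loop keeps the log and the drop folder in sync per file.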





    #Datalink


  • 2.  Re: Apptio Data Ingestion

    Posted Fri February 02, 2018 08:49 PM

    Wow Robert, thanks so much for taking the time to put this together! I have marked it as a featured article so it gets extra exposure on our DL home page.


    #Datalink


  • 3.  Re: Apptio Data Ingestion

    Posted Fri February 09, 2018 12:05 PM

    Thanks Ken, glad to share!


    #Datalink


  • 4.  Re: Apptio Data Ingestion

    Posted Sat February 03, 2018 10:18 AM

    This is amazing!!


    #Datalink


  • 5.  Re: Apptio Data Ingestion

    Posted Tue February 06, 2018 08:19 AM

    Thanks for sharing your thoughts, Rob. This was helpful.


    #Datalink