Integrating Turbo with Envizi via AppConnect for Green IT data

By Jeya Gandhi Rajan M posted Thu March 23, 2023 12:41 AM

  

This blog provides step-by-step instructions to pull green IT data from Turbonomic into Envizi via App Connect.

Authors

Jeya Gandhi Rajan M, Mamatha Venkatesh

Contents

1. Prerequisites

  • Turbonomic v8.6.6 or higher
  • IBM App Connect SaaS or App Connect Enterprise on-prem
  • Envizi's S3 bucket

2. Architecture

Here is the architecture of the Turbonomic and Envizi integration.

The App Connect flow pulls the list of Cloud Regions and On-prem Data Centers from Turbonomic and sends it to Envizi's S3 bucket as a CSV file. Envizi then processes this CSV file internally.
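Conceptually, the flow fetches location records and serializes them to CSV before writing the object to S3. The sketch below illustrates that step in Python; the record fields (Location, Country, Latitude, Longitude) are illustrative assumptions, not the exact columns produced by the App Connect flow:

```python
import csv
import io

# Hypothetical location records as the flow might receive them from
# Turbonomic (field names are illustrative, not the real schema).
locations = [
    {"Location": "DC-London", "Country": "United Kingdom",
     "Latitude": "51.5072", "Longitude": "-0.1276"},
    {"Location": "aws-eu-west-1", "Country": "Ireland",
     "Latitude": "53.3498", "Longitude": "-6.2603"},
]

# Serialize the records to CSV in memory, as the flow does before
# writing the file to Envizi's S3 bucket.
buffer = io.StringIO()
writer = csv.DictWriter(
    buffer, fieldnames=["Location", "Country", "Latitude", "Longitude"])
writer.writeheader()
writer.writerows(locations)
csv_payload = buffer.getvalue()
print(csv_payload)
```

The actual upload to S3 and the column layout are handled inside the imported flows, so nothing here needs to be coded by hand.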

3. Turbonomic Configuration

Mandatory Configuration

  • Create a user with the Observer role in Turbonomic. App Connect uses this user to fetch data from Turbonomic.

Optional Configuration

  • Add the following Tag Categories in vCenter and assign their values as tags to the Data Centers so that Envizi can calculate emissions accurately:

    • Country: Name of the country
    • Latitude: Latitude in Decimal Degrees format
    • Longitude: Longitude in Decimal Degrees format
  • By default, Envizi uses the Data Center name configured in Turbonomic/vCenter. To override this, add the Tag Category EnviziAlternateName with the desired display name as its value.

  • Envizi Locations (Data Centers in this case) need unique display names. If any Data Centers share the same name, either rename them in vCenter or add the Tag Category EnviziAlternateName to the affected Data Center(s) with distinct values.

Note: Tags may take up to 20 minutes to sync from vCenter to Turbonomic.
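The naming rules above can be summarized in a short sketch: the EnviziAlternateName tag, when present, overrides the vCenter Data Center name, and the resulting display names must be unique. This is an illustration of the rules only, not code from the actual flow, and the record structure is a hypothetical one:

```python
# Resolve the Envizi display name for a Data Center: the
# EnviziAlternateName tag (if present) overrides the vCenter name.
def display_name(dc):
    return dc["tags"].get("EnviziAlternateName", dc["name"])

# Two Data Centers that would clash on name without the alternate tag.
datacenters = [
    {"name": "DC-1", "tags": {"Country": "Germany"}},
    {"name": "DC-1", "tags": {"EnviziAlternateName": "DC-1-Frankfurt"}},
]

names = [display_name(dc) for dc in datacenters]

# Envizi Locations require unique display names; any duplicates here
# mean a rename (in vCenter or via EnviziAlternateName) is needed.
duplicates = {n for n in names if names.count(n) > 1}
print(names, duplicates)
```

In this example the second Data Center's EnviziAlternateName tag resolves the clash, so the duplicate set comes out empty.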

4. Envizi's S3 bucket

The Envizi product team creates and shares an S3 bucket. The details of this S3 bucket are fed into the App Connect flow.

The App Connect flow pulls the data from Turbonomic and sends it to the S3 bucket in CSV format, which Envizi then processes.

5. App Connect Configuration

5.1. Create Connectors

Create one Amazon S3 connector and two HTTP connectors.

5.1.1. Create Amazon S3 Connector

  1. Click on Catalog > Amazon S3 > Add a New Account.

  2. Fill in the S3 details as below. (The credentials are provided by Envizi.)
Secret access key : Moxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Access key ID : AKXXXXXXXXXXXXXXXXXX

  3. The new connector account (ex: Account 2) should now be created.

5.1.2. Create Http Connector 1

  1. Type http in the filter.

  2. Click on Http Connector > Add a New Account.

  3. Fill in the details as below.
  • User Name : username (leave as-is)
  • API Key : Enter the Turbonomic credentials in the format USERNAME&password=PASSWORD. For example, if the username is test-user and the password is test-password, enter test-user&password=test-password
  • API Key Location : body URL encoded

  4. Click on the Connect button.

  5. The new connector account (ex: Account 3) should now be created.
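The API Key value above is simply the Turbonomic username and password joined in form-encoded style. A quick way to check the encoding (and to see why special characters in a password would need URL encoding) is with Python's standard library; this only illustrates the string format, not an actual App Connect step:

```python
from urllib.parse import urlencode

# Build the body-URL-encoded credential string. The connector supplies
# the "username=" prefix itself, so the API Key field holds everything
# after it. (test-user / test-password are placeholder credentials.)
username = "test-user"
password = "test-password"

body = urlencode({"username": username, "password": password})
print(body)        # username=test-user&password=test-password

# The API Key field value is the part after "username=":
api_key = body[len("username="):]
print(api_key)     # test-user&password=test-password
```

A password containing characters such as `&` or `=` would be percent-encoded by `urlencode`, which is what the "body URL encoded" API Key Location setting expects.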

5.1.3. Create Http Connector 2

  1. Click on Http Connector > Add a New Account.

  2. Leave all the fields as they are.

  3. Click on the Connect button.

  4. The new connector account (ex: Account 4) should now be created.

5.2. Import the Flows - Locations

Import the flow, then configure the connectors, variables and schedule.

5.2.1. Import the flow

  1. Click on Dashboard > Import Flow.

  2. Download the flow file TurbonomicLocations.yaml. The original version of the file is available here.

  3. Drag and drop the file into the given place.

  4. Click on Import.

  5. The flow should now be created.

5.2.2. Set Http Connector 1

  1. Click on the HTTP node with the label Invoke Method. The details are displayed at the bottom.

  2. Select the first http account (Account 3) created above.

5.2.3. Set Http Connector 2

  1. Click on the HTTP node with the label Invoke Method 2. The details are displayed at the bottom.

  2. Select the second http account (Account 4).

5.2.4. Set Http Connector 3

  1. Click on the HTTP node with the label Invoke Method 3 and select the second http account (Account 4) for this node as well.

5.2.5. Set S3 Connector

  1. In this example, the S3 bucket is named envizi-data-load. The bucket name is provided by Envizi.

  2. Click on the Amazon S3 node with the label Create Object. The details are displayed at the bottom.

  3. Enter the S3 bucket name (ex: envizi-data-load).

5.2.6. Set Variables

  1. Click on the Set Variable node. The details are displayed at the bottom.
  • URL : The Turbonomic URL
  • Customer : Here 'My-Turbo-Locations' is given as an example.

5.2.7. Set Scheduler

  1. Click on the Scheduler node. The details are displayed at the bottom.

  2. The flow can be configured to run as needed. Here it is configured to run every day at 00:05 hours.

  3. Turn on the checkbox Also run the flow, when it's first switched on if you want the flow to run immediately after it is started.

5.2.8. Dashboard

The flow is now created and available on the Dashboard.

5.3. Import the Flows - Accounts

5.3.1. Import the flow

  1. Click on Dashboard > Import Flow.

  2. Download the flow file TurbonomicAccounts.yaml. The original version of the file is available here.

  3. Drag and drop the file into the given place.

  4. Click on Import.

  5. The flow should now be created.

5.3.2. Set Http Connector 1

  1. Click on the HTTP node with the label Invoke Method. The details are displayed at the bottom.

  2. Select the first http account (Account 3) created above.

5.3.3. Set Http Connector 2, 3 and 4

  1. Click on the HTTP nodes with the labels Invoke Method 2, Invoke Method 3 and Invoke Method 5. The details are displayed at the bottom.

  2. Select the second http account (Account 4) created above for each of them.

5.3.4. Set S3 connector

  1. Click on the Amazon S3 node with the label Create Object. The details are displayed at the bottom.

  2. Enter the S3 bucket name (ex: envizi-data-load).

5.3.5. Set Variables

  1. Click on the Set Variable node. The details are displayed at the bottom.
  • URL : The Turbonomic URL
  • Customer : Here 'My-Turbo-Accounts' is given as an example.
  • OverrideStartDate : Start date of the period for which data is required from Turbonomic
  • OverrideEndDate : End date of the period for which data is required from Turbonomic. This can be the current date.
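As an illustration, the override dates for a trailing 30-day reporting period ending today could be computed like this; the YYYY-MM-DD format shown is an assumption, so use whatever date format the flow expects:

```python
from datetime import date, timedelta

# Compute a trailing 30-day period ending today for the
# OverrideStartDate / OverrideEndDate flow variables.
override_end = date.today()
override_start = override_end - timedelta(days=30)

# ISO 8601 (YYYY-MM-DD) is assumed here, not confirmed by the flow.
print(override_start.isoformat(), override_end.isoformat())
```

Whatever values you choose, OverrideStartDate must fall on or before OverrideEndDate, and OverrideEndDate can simply be the current date as noted above.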

5.3.6. Set Scheduler

  1. Click on the Scheduler node. The details are displayed at the bottom.

  2. The flow can be configured to run as needed. Here it is configured to run every day at 00:15 hours.

  3. Turn on the checkbox Also run the flow, when it's first switched on if you want the flow to run immediately after it is started.

5.3.7. Dashboard

The flow is now created and available on the Dashboard.

6. App Connect Flow Execution

6.1. Start the flow

  1. Right-click on the top of the flow tile and start the flow.

6.2. Data in S3

The flows pull the data from Turbonomic and push it to S3. The output in S3 looks like this.

6.3. Sample Data from S3

Sample data is available here: Accounts, Locations.

6.4. Processing S3 files in Envizi

Envizi automatically pulls the data from S3, processes it, and creates/updates the Turbonomic Performance Dashboard as shown below.

Reference

Turbonomic - Envizi Integration: https://ibm.github.io/IBM-Sustainability-Software-Portfolio-Connectors/turbonomic-envizi/

Turbonomic - Envizi App Connect flows: https://github.com/IBM/turbonomic-envizi-appconnect-flows

Appendix

Refer to the following.

Envizi - Turbonomic Performance Dashboard

This document describes the Turbonomic Performance Dashboard available in Envizi.

https://community.ibm.com/community/user/envirintel/blogs/jeya-gandhi-rajan-m1/2023/04/01/turbonomic-performance-dashboard-in-envizi



#envizi #Sustainability #turbonomic

#ESG Data and Environmental Intelligence #sustainability-highlights-home


Comments

Thu March 30, 2023 09:44 AM

@Roberto Battistoni you can see the info below.
 
From On-Prem data center (VMware vCenter)
  • Energy Consumption at Host and Data Center level
  • Emissions (t CO2e) at Host and Data Center level
From Public Cloud
  • Energy Consumption at VM level
At this point in time, service/application-level info is not available.

Thu March 30, 2023 05:22 AM

Awesome Jeya, thank you. This integration would allow me to measure emissions at a data centre level. What is the granularity of the insight it could offer, e.g. Cloud vs On-prem, what else?

What would I need to measure at service/application level?