Database Access
Firewall Rules: Incoming HTTPS service on the ODBC Data Connector's configured port
To restrict access to the specific endpoints, see: Egress IP addresses
There are many times when the data needed for a Planning Application comes from a database. This could be organisation hierarchies for a sales planning application, or last year's actuals to form the basis of this year's plan.
This data can, of course, be uploaded to Planning Analytics using flat files; however, using a database directly provides several benefits:
- The data can be pulled on demand from the latest source without the latency and automation involved in extracting to files, moving them, and loading them.
- Altering a select statement during development can be quicker than altering a file extract.
- A live database connection also enables "drill to detail". Often there is no need to plan at the lowest levels of detail, such as SKU. Planning at higher levels, such as product category, is often sufficient and creates a smaller, faster database. "Drill to detail" allows a user to select a cell in the planning application and see what it is made up from in the source system, i.e. "show me all the SKUs that make up this figure".
It is rare that an organization would be happy to completely open up its database to the internet for queries, so a secure, encrypted method of accessing the database is required.
IBM Planning Analytics as a Service is provided with an ODBC Data Connector (ODBCIS) for this very purpose.
The ODBC Data Connector is configured on a machine within your environment and acts as a "translator" between ODBC calls to your database and ODATA used by Planning Analytics as a Service. When an SQL statement such as a "select" is passed to the ODBC Data Connector the connector uses local ODBC drivers to execute the statement on your database server. The results from this execution are converted to ODATA and passed back to Planning Analytics as a Service.
Documentation for the ODBC Data Connector can be found here.
The installation of the ODBC Data Connector is relatively simple; however, it does require infrastructure and possibly firewall changes.
The sections below describe how to prepare your infrastructure for the implementation of the ODBC Data Connector with Planning Analytics as a Service.
The ODBC Data Connector runs natively on Windows and can be installed as a service.
Windows ODBC Data Connector hardware requirements:
You will need one (or more) machines (usually virtual) for the Windows ODBC Data Connector to be installed on.
Once installed, the ODBC Data Connector can serve multiple Planning Analytics as a Service databases.
The Windows machine that has the connector installed on it also requires the ODBC drivers for all desired sources to be configured and working, as the connector uses these connections to reach your target datasources.
Although not directly supported, I have had two ODBC Data Connectors installed and used firewall load balancing across them for high availability (HA). In my environment this used a WAF firewall rule to balance requests across the two installs. If HA is desired, I strongly recommend fully testing this scenario and using sticky connections. Feedback on this blog is always welcome as a means of sharing your experience.
Installation requirements for the machine:
- The ODBC Data Connector for Windows supports Windows 10 and later, or Windows Server 2016 and later.
- The Windows server where you run your connector must have at least 4 cores and 4 GB of memory above what is required for the OS and ODBC drivers.
- The Windows machine must have a network route to the required databases.
- An incoming connection for HTTPS traffic is required.
Networking (Firewall rules):
ODBC Data Connector needs a network route that is open to your internal/cloud databases and their respective ports (target data sources)
ODBC Data Connector needs an incoming internet connection on a port of your choice.
IBM Planning Analytics as a Service will only connect to the ODBC Data Connector using HTTPS with a valid certificate. Self-signed certificates are not trusted.
Note: The configuration of your firewall and certificates is different per client and depends on your own security rules and guidance.
A certificate can be added to the ODBC Data Connector itself or SSL termination can take place at the perimeter.
As an example, my configuration was:
A WAF firewall rule for incoming connections, locked to accept connections only from the Planning Analytics as a Service egress IP addresses (Egress IP addresses).
The WAF firewall rule had the certificate for my domain installed on it and terminated the SSL connection there. Traffic from the WAF to the ODBC Data Connector used HTTP and was inspected.
I also tested an alternative setup with an Nginx reverse proxy and the SSL certificate.
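As a sketch of that reverse-proxy approach, the following Nginx configuration terminates TLS and forwards traffic to a connector on the same machine. The server name, certificate paths, and ports are placeholders, and this is an illustrative minimal example rather than a tested production configuration:

```nginx
# Illustrative sketch only - server name, certificate paths and ports are placeholders.
server {
    listen 443 ssl;
    server_name odbcis.example.com;

    ssl_certificate     /etc/nginx/certs/odbcis.example.com.crt;
    ssl_certificate_key /etc/nginx/certs/odbcis.example.com.key;

    # Forward API calls to the ODBC Data Connector (default port 5555)
    location /api/ {
        proxy_pass http://127.0.0.1:5555;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```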
Windows ODBC Data Connector Installation and Configuration:
Once the machine that will host the ODBC Data Connector is provisioned in your network and the firewall is open you can start the Connector installation and configuration process.
Step 1, Download the ODBC Data Connector to your machine:
Step 2, Extract the ODBC Data Connector
Downloaded files are often blocked by Windows, so before extracting the ODBC Data Connector zip file it is advised to right-click the zip file, select Properties, check the "Unblock" option, and click Apply.
You can now "Extract All" the contents of the installer zip file to a directory where you would like the ODBC Data Connector to run. The directory choice is up to you; in this case I extracted to C:\ODBCIS.
Once extracted, find the "StartStandaloneODBCIS.bat" file in the extracted directory and run it in a command window, then use Ctrl+C to quit the standalone ODBC Data Connector.
This step creates the relevant configuration files for the first time; we will edit these files in the next step.
Step 3, Configure the ODBC Data Connector
In the deployment folder under your extracted path you will see a config file called "config.odbcis.json". Open this in an editor such as Notepad.
The "config.odbcis.json" file contains three sections.
1, A Datasources section - This is for the ODBC data sources on the machine and contains connection parameters for each source. Create an entry here for each ODBC DSN you would like to use in Planning Analytics as a Service:
{
  "Name": "The Name to use for PAaaS",
  "OdbcName": "The name of the ODBC DSN from your ODBC Manager",
  "Username": "The Database username",
  "Password": "The Database Password for the user",
  "PasswordEncrypted": Set this to false each time you enter a new password.
}
2, A Clients section - This contains a list of users, their credentials, and the datasources they have access to; these are used by PAaaS processes at connection time.
{
  "ID": "The username you would like to use in Planning Analytics for datasource access",
  "Secret": "The Password you would like to use in Planning Analytics processes to access these datasources",
  "SecretEncrypted": Set this to false each time you enter a new password,
  "Datasources": [
    "The list of datasources this ID has access to",
    "Additional datasources etc."
  ]
}
3, Additional parameters for tuning such as the SQLRowArraySize.
Note: When you next run the ODBC Data Connector, the passwords for clients and datasources are encrypted and the "PasswordEncrypted"/"SecretEncrypted" flags are set to true. If you need to change a password you will need to stop the ODBC Data Connector, edit it, and set the relevant "encrypted" flag back to false.
The use of clients means that you can have separate datasource access defined for different groups, such as HR and Finance. Each client can use the same or a different set of datasources. A datasource can also be specified twice under different names, with a different database user in each, to provide different access permissions.
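To make the structure concrete, here is a minimal, hypothetical "config.odbcis.json" fragment. All names, DSNs, and credentials are placeholders, and you should verify the exact property names and layout against the file generated by your own installation:

```json
{
  "Datasources": [
    {
      "Name": "SalesDW",
      "OdbcName": "SalesDW_DSN",
      "Username": "db_reader",
      "Password": "changeMe",
      "PasswordEncrypted": false
    }
  ],
  "Clients": [
    {
      "ID": "finance",
      "Secret": "changeMeToo",
      "SecretEncrypted": false,
      "Datasources": [ "SalesDW" ]
    }
  ]
}
```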
Under the installation directory, in the deployment folder, there is also a file called "config.internal.json". This file has various parameters that do not normally need changing; however, the "serverPort" parameter can be changed if you would like the ODBC Data Connector to run on a different port.
At this point the ODBC Data Connector is configured and ready to be run.
It can be run standalone using the "StartStandaloneODBCIS.bat" in the installation directory or, alternatively, it can be added as a Windows service using the "InstallODBCISWindowsService.bat" command file.
Start the service using your preferred method.
Optional: Using Postman to test the ODBC Data Connector
Before testing the ODBC Data Connector externally (and to rule out any firewall misconfiguration) you can use a tool like Postman to execute calls against it.
Open Postman and use "http://<MachineName>:5555/api/v1/DataSources" as the URL, changing "<MachineName>" to the name or IP address of the machine the connector is installed on.
Note: If you changed the serverPort in the config file you will also need to update the port used in the URL.
Under the Authorization tab, enter the Client ID and Secret you used in the "config.odbcis.json" file.
Now hit the blue Send button next to the URL to execute the command.
The URL ending "/api/v1/DataSources" requests a list of the datasources available.
In my case, the two datasources (and their driver information) that are allowed for the client ID I used are returned.
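The same request can be made from the command line. This is only a sketch: the host name, port, and credentials are placeholders, and it assumes the connector accepts the HTTP Basic authentication that Postman's Authorization tab sends:

```shell
# Placeholders: odbcis-host, port 5555, client finance/changeMeToo
curl -u finance:changeMeToo "http://odbcis-host:5555/api/v1/DataSources"
```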
To test an SQL query, I use "http://<MachineName>:5555" followed by "/api/v1/DataSources('mySQLDB')/odbcis.ExecuteQuery?$expand=Rows".
The 'mySQLDB' part of the URL selects the database I wish to query.
In the Postman "Body" tab I placed the command I wish to run:
{ "Query": "<SQL Query>" }
In my case I used:
{ "Query": "select * from Film;" }
To select all data in the "Film" table.
After hitting the blue Send button, the data is returned from the Film table.
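The equivalent command-line sketch, again with placeholder host and credentials, and assuming the query is sent as a POST with a JSON body as in the Postman test:

```shell
# Placeholders: odbcis-host, port 5555, client finance/changeMeToo
curl -u finance:changeMeToo \
  -H "Content-Type: application/json" \
  -d '{ "Query": "select * from Film;" }' \
  "http://odbcis-host:5555/api/v1/DataSources('mySQLDB')/odbcis.ExecuteQuery?\$expand=Rows"
```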
Note: using the various body views such as "Pretty" and "Visualize" changes how the returned data is displayed and may make it more readable.
The "README.md" file in the installation directory contains other urls that can be used.
I now also know that the ODBC Data Connector is working in my environment.
Note: Once this is configured, you can use Postman in the same way to test your public IP access to the data, but using "HTTPS://<your public domain>:<PortNumber>/api/v1" as the connection.
Step 4, Add the ODBC Data Connector in Planning Analytics Workspace and test connectivity.
In order to use the ODBC Data Connector in Planning Analytics as a Service you need to add the connection to the Database.
The connection is defined for each Planning Analytics database and different ODBC Data Connector installations can be used by different databases if desired.
To configure the database to use the ODBC Data Connector you need to define the ODBCServiceRoot.
To do this:
In Planning Analytics Workspace select the Administration tile and then the database tile. Select the database you wish to add the ODBC Data Connector connection to.
On the right side of the page select the Configuration tab and, under the Administration section, select External databases from the drop-down.
Here you can now add the external link to your ODBC data connector.
The format needs to start with HTTPS (self-signed certs are not allowed).
You can choose one of two formats:
HTTPS://<ID>:<Secret>@<your public domain>:<PortNumber>/api/v1
or
HTTPS://<your public domain>:<PortNumber>/api/v1
Note: the first format embeds the client ID and secret from the "config.odbcis.json" file and uses them for all connections. With the second format, the developer must enter the client ID and secret in the TI process before the database selection list is populated, which means that different TI processes can use different credentials.
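To illustrate the difference between the two formats, here is a small Python sketch (the domain, port, and credentials are placeholders) that checks which style a given ODBCServiceRoot uses:

```python
from urllib.parse import urlsplit

# Placeholder ODBCServiceRoot values in the two supported formats
embedded = "https://finance:MySecret@odbcis.example.com:8443/api/v1"
prompted = "https://odbcis.example.com:8443/api/v1"

for root in (embedded, prompted):
    parts = urlsplit(root)
    assert parts.scheme == "https"  # HTTP and self-signed roots are rejected
    if parts.username:
        print(f"credentials embedded for client '{parts.username}'")
    else:
        print("TI process must supply the client ID and secret")
```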
Documentation
After you have set up the ODBCServiceRoot you can now test the connection in a Planning Analytics process. Create a new process that uses a database connection.
If the client ID and secret are not embedded in the ODBCServiceRoot then enter them and select the database connection drop down. If the client ID and Secret are embedded in the ODBCServiceRoot you can simply select the database connection drop down.
You can now enter some SQL and test using the preview link.
Using Text Files with the ODBC Data Connector - Caution: the text file driver is provided by Microsoft and I have not fully tested its compliance with complex SQL. This section is added because it may help with simple file uploads, as a file can be dropped into a local share on the ODBC Data Connector machine and read by the cloud.
Install the Microsoft Text Driver on the machine and create a DSN
Configure the DSN. In my case, I selected the desktop but this could be any accessible folder.
I did not manually specify metadata for individual files as I wanted to simply drop CSV files and have them work. I used all extensions and left the format as default so that the text driver could "guess" at how to handle the file.
Sample files of data were placed in my chosen directory.
I was now able to create a process using the Text driver DSN.
The SQL was "SELECT * FROM <filename>" (with the text driver, the table name is the file name, including its extension).
The preview returned the data from the file.
Other useful links:
Thank you for reading and please do comment on this blog.