IBM RPA allows you to automate database operations, ranging from simple commands to complex integrations.
Often, complex operations are a combination of simple actions. So, before jumping into complex examples (which may be a part of an upcoming post), we will focus on the essential commands for database operations using IBM RPA.
IBM RPA can interact with Microsoft SQL Server, Oracle, PostgreSQL, MySQL, and SQLite. You can also reach other databases through ODBC: as long as the respective ODBC driver is installed, you can connect to them as well.
Connecting
The first step is to connect to the database you need. In the Toolbox, open the Database category and then Connection; there you will find a connect command for each of the supported drivers.

For all databases, the connect command requires the appropriate connection string. (Hint: if you have trouble creating your connection string, there are websites that can generate one for you.) As output, the command returns a Connection object. This variable is used by every command that directly interacts with the database, so it will be needed as input for the commands that follow.
The Database Connection variable type is not tied to a specific database engine, so the same variable can later be used in WAL commands against any of the supported databases.
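For example, a minimal sketch of a SQL Server connection; the server, database, and credential values below are placeholders to replace with your own (ideally retrieved from the vault):
defVar --name dbConn --type DbConnection
// Placeholder connection string: point it at your own server, database, and credentials.
sqlServerConnect --connectionstring "Server=myServer;Database=myDatabase;User Id=myUser;Password=myPassword;" dbConn=connection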

Running SQL Commands
Now that we have the connection defined, there are several commands we can run:
- Run SQL Command
- Run SQL Reader
- Run SQL Scalar
- SQL Procedure
- Run SQL Procedure Reader
We will focus on Run SQL Reader (sqlExecuteReader) for now. To use this command, we need to specify the Connection as well as the Statement (the SQL query). The WAL language is flexible, so you can write the statement in different ways (see the sketch after this list):
- You can specify the query straight into the command
- You can use variables for assigning values dynamically
- You can even use a variable and dynamically generate the statement
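For example, a minimal sketch of the second option, filtering a hypothetical Customers table by a value held in a variable:
defVar --name dbConn --type DbConnection
defVar --name country --type String --value Brazil
defVar --name result --type DataTable
sqlServerConnect --connectionstring "<your_connection_string>" dbConn=connection
// Customers and Country are illustrative names; ${country} is resolved at runtime, so the same statement works for any value assigned to the variable.
sqlExecuteReader --connection ${dbConn} --statement "select * from Customers where Country = '${country}'" result=value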
As output, this command returns a Data Table. This table represents the result of your SQL query, and the Rows and Columns outputs return the respective attributes of the query result.

You can use the Data Table in many ways:
- Print its values using Log Message, or inspect them in the Globals tab
- Use it as input for other commands (for example, to import the data elsewhere)
- Write the results to a CSV file (Write Table to a File)
- Perform calculations over it (check out the previous blog post on the expression parameter for Data Table)
- Check Base -> Data Table in the Toolbox; there are many possibilities there
Run SQL Command and Run SQL Scalar are similar to the command we just ran.
Run SQL Command runs the SQL statement without generating a Data Table as output. This is useful for DDL (Data Definition Language) statements such as CREATE TABLE, as well as DML (Data Manipulation Language) statements that don't return result sets (such as UPDATE or DELETE). WAL still provides the number of affected rows as output.
Run SQL Scalar returns a single resulting value, which means the query result must contain exactly one row with one column. This is useful for running system functions or getting environment values. In the example below, we use @@VERSION, a Microsoft SQL Server system variable that returns the installed RDBMS version. WAL also provides a Has Values output, so you can safely check whether the command returned a value at all.
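A minimal sketch of that scalar call, assuming the command id follows the same pattern as the reader (sqlExecuteScalar) and exposes its result through a value output; confirm the exact id and output names in the Script view before relying on this:
defVar --name dbConn --type DbConnection
defVar --name versionInfo --type String
sqlServerConnect --connectionstring "<your_connection_string>" dbConn=connection
// Assumption: the command id (sqlExecuteScalar) and output name (value) mirror Run SQL Reader; verify in the Script view.
sqlExecuteScalar --connection ${dbConn} --statement "SELECT @@VERSION" versionInfo=value
logMessage --message "${versionInfo}" --type "Info"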

Running Stored Procedures
Now, let's move on to stored procedures. There are two commands specific to this: Run SQL Procedure Reader and SQL Procedure. As in the previous set, the first command generates a Data Table with its results, while the latter returns only the number of resulting rows.
In the example below, we execute a stored procedure named sp_CreateInvoice. We assign the required values to the Parameters, using the same parameter names defined by the procedure in the database. In this example we pass the values directly, but you can use everything WAL provides:
- You can use variables
- You can get values from the script parameters
- You can get values from your vault and then assign them to your variables
- There are other options; feel free to experiment!
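The dedicated procedure commands let you fill in the Parameters directly in the designer. As a text-level sketch that sticks to the commands already shown in this post, you can also call the procedure through Run SQL Reader with an EXEC statement; the parameter names below (@CustomerId, @Amount) are purely illustrative and must match whatever your procedure actually defines:
defVar --name dbConn --type DbConnection
defVar --name spResult --type DataTable
sqlServerConnect --connectionstring "<your_connection_string>" dbConn=connection
// @CustomerId and @Amount are hypothetical parameter names, used only for illustration.
sqlExecuteReader --connection ${dbConn} --statement "EXEC sp_CreateInvoice @CustomerId = 1001, @Amount = 250.00" spResult=value
Any result set the procedure returns lands in spResult as a Data Table, just as with a plain SELECT.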

Combining Commands
Exporting Data
Now that we have covered the basic commands, we can combine them with other commands to perform more elaborate actions. In this example, we run a SQL query, read the contents, and export them to a .csv (or .xls) file. Replace the connection string with a valid one, select a valid output path, and you have a script for exporting data.
defVar --name dbConn --type DbConnection
defVar --name result --type DataTable
defVar --name tableName --type String --value MyTableName
sqlServerConnect --connectionstring "<add_your_connection_string>" dbConn=connection
sqlExecuteReader --connection ${dbConn} --statement "select *\r\nfrom ${tableName}" result=value
writeTableToFile --addHeaders --delimiter "ByCharacters" --characters ";" --path "C:\\temp\\output.csv" --dataTable ${result}
Importing Data
We can import data by combining the Read CSV File (readCSV) command with Bulk Insert (sqlBulkInsert). The readCSV command reads a file and generates a Data Table as output. Using this output, you can run the Bulk Insert command to import the data table into your database.
Replace the connection string with a valid one and point the command to a valid input file, and you have a script for importing data from a CSV file.
defVar --name dbConn --type DbConnection
defVar --name result --type DataTable
defVar --name tableName --type String --value Customers
readCSV --filepath "C:\\temp\\output.csv" --delimiter ";" --hasheaders --missingfieldaction "ParseError" result=value
logMessage --message "${result}" --type "Info"
sqlServerConnect --connectionstring "<your_connection_string>" dbConn=connection
sqlBulkInsert --dataTable ${result} --connection ${dbConn} --tablename "${tableName}" --mappings "[]"
Next Steps
The existing WAL commands are very flexible: you can use the same commands against different databases, as long as you provide a SQL query that is valid for the target database. You can even combine commands to move data between different servers, simply by assigning a different connection.
The Bulk Operations are a very powerful set of commands, letting you import or modify an entire dataset in your database at once.
Read CSV File is another useful command, as it reads existing files and converts them to a Data Table. Try combining it with Web Browser automation (or even API calls) to download CSV files from your suppliers.
Data Table is a powerful and flexible data type in WAL, and there are many related commands for handling this data.
Try combining these commands (and others) and you will be able to automate complete processes.