Hi Alexey
First of all, be aware that Resilient has limits regarding the number of interpreted python lines in a single execution unit and limits regarding the number of objects you can create in a single workflow.
Let's suppose your REST call returns a structure, say, a JSON list of 100 items, which Resilient can probably handle. (500 probably won't work.)
There are two ways to address this.
[1] One way is to create a 2nd workflow with datatable scope, triggered by an event such as a datatable row creation.
In this case, you have to create a datatable and a rule that triggers this 2nd workflow whenever a new row is created.
For example, you have a datatable called dt with a single column called c1.
In your REST post-processing, you can do something like
for x in range(10):
    new_row = incident.addRow('dt')
    new_row['c1'] = 'some value here'
And in your 2nd workflow, since it has datatable scope, a row object is available and you can get the data like this
col1 = row.data['c1']
You can think of this like a "poor man's fork".
The pro of this approach is that the triggered workflow instances run independently of the parent, so it's fast.
The con is that there's no "join" for this "fork". So it only works if your workflow logic does not need to wait for each 2nd-workflow instance to finish in order to proceed.
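To make the fan-out concrete, here is a plain-Python sketch of the row-creation pattern that runs outside Resilient. The `Incident` class below is a stand-in I made up for the real `incident` global (in a real post-process script you would call `incident.addRow('dt')` directly), and the `rest_result` list is an assumed example of the JSON list your REST call might return:

```python
# Stand-in for the Resilient `incident` global, so this sketch runs standalone.
class Incident:
    def __init__(self):
        self.rows = []  # simulated datatable 'dt'

    def addRow(self, table_name):
        # The real API returns a new row object for the named datatable;
        # here we just append an empty dict and hand it back.
        row = {}
        self.rows.append(row)
        return row

incident = Incident()

# Suppose the REST call returned a JSON list of items (assumed shape).
rest_result = [{"id": 1}, {"id": 2}, {"id": 3}]

# One row per item; in Resilient, each new row would fire the rule
# and launch one instance of the 2nd (datatable-scope) workflow.
for item in rest_result:
    new_row = incident.addRow('dt')
    new_row['c1'] = str(item['id'])

print([r['c1'] for r in incident.rows])  # → ['1', '2', '3']
```

Each triggered instance then reads its own payload back out via `row.data['c1']`, as shown below.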
[2] The other way is to use a workflow variable and a decision component to implement a loop.
Here, there's a caveat: Resilient has a very simple cycle detector to avoid getting caught in infinite loops. So, if your loop is too simple (for example, if the loop body contains only scripts), Resilient may simply abort your loop with a message like
"The requested operation resulted in a cycle in the workflow 'Loop'. To prevent an infinite loop, the workflow was terminated. Please have your administrator check the workflow for possible cycles."
So a script-only loop does not work.
You can avoid this message by adding a REST function call inside your workflow loop, for example.
Resilient allows you to create workflow variables; they are dicts, so this script is valid:
iterator = {}
iterator['value'] = 10
workflow.addProperty('iterator', iterator)
Then you can create your loop by adding another script that decrements the counter:
iterator_value = workflow.properties['iterator']['value']
iterator_value = iterator_value - 1
iterator = {}
iterator['value'] = iterator_value
workflow.addProperty('iterator', iterator)
incident.addNote(str(iterator_value))
And your condition may be
workflow.properties['iterator']['value'] > 0
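Put together, the loop state behaves like this plain-Python sketch, runnable outside Resilient. The `Workflow` class is a stand-in I made up for the real `workflow` global; the property name and decrement logic match the snippets above, and the `while` condition plays the role of the decision component:

```python
# Stand-in for the Resilient `workflow` global, so this sketch runs standalone.
class Workflow:
    def __init__(self):
        self.properties = {}

    def addProperty(self, name, value):
        # The real API persists a workflow property; here we just store it.
        self.properties[name] = value

workflow = Workflow()

# Init script: seed the counter (3 instead of 10, to keep the trace short).
workflow.addProperty('iterator', {'value': 3})

processed = []
# Each pass simulates one trip around the workflow loop:
# the decrement script runs, then the decision condition is re-evaluated.
while workflow.properties['iterator']['value'] > 0:
    iterator_value = workflow.properties['iterator']['value']
    processed.append(iterator_value)  # do the per-iteration work here
    iterator_value = iterator_value - 1
    workflow.addProperty('iterator', {'value': iterator_value})

print(processed)  # → [3, 2, 1]
```

In your case, the per-iteration work would be the Call REST API function, using the counter to pick the next item out of the saved result list.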
I hope it helps.
------------------------------
Leonardo Kenji Shikida
------------------------------
Original Message:
Sent: Fri May 13, 2022 02:58 AM
From: Alexey Fedorov
Subject: fn_utilities and a list of input parameters
Hello,
I have a playbook and I need to perform a few Call REST API invocations. I use fn_utilities for this purpose.
I use the result of the Call REST API from the previous step as an input parameter for my next Call REST API.
I ran into an issue when the result contains a list, and I don't understand how to use it in a playbook to call the function for every item in the list.
I tried this construction:
for x in result:
    input.rest_method = "POST"
    ...
    input.rest_body = x
The function runs only once, with the first list entry.
I can't use input.rest_body = result["x"][0], because I don't know how many entries are in the list.
I think I need to call the REST API several times in a loop, but how do I do that without getting the same result on every call?
------------------------------
Alexey Fedorov
------------------------------