Hi everyone, I have been banging my head against this for the last few days and thought I would ask the group in case I am missing something obvious. I am using IBM RPA to integrate with watsonx.ai. I can successfully retrieve a token via an HTTP Request, but when I POST the JSON to run a model, I get no response back. My configuration is below. I am passing two headers, one for Content-Type and one for Authorization; I know they are being sent because I get a "missing token" error if I omit them. I have tried both the JSON and Text formatters, with the same result.
Here is the output in the debugger:
Here is the full URL since it is cut off: https://us-south.ml.cloud.ibm.com/ml/v1/text/generation?version=2023-05-29
I have tried this with a ChatGPT endpoint as well with the same result, so I must be doing something wrong. Below is the JSON string I am using, which is the ${json} value in the debugger (the project_id is redacted here, but it is valid; the same payload works when I run it in SOAP UI):
{"input":"Input: Provide 5 examples of a business process for a large insurance company Output:","parameters":{"decoding_method":"greedy","max_new_tokens":2000,"min_new_tokens":0,"stop_sequences":[],"repetition_penalty":1},"model_id":"meta-llama/llama-3-70b-instruct","project_id":"xxx"}
Again, this works fine using SOAP UI and even a BAW service call. Thanks in advance for any advice!
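For reference, here is the same call expressed as a plain Python sketch (stdlib only), which is what I am comparing the RPA request against. The token and project_id are placeholders, not real values:

```python
# Sketch of the same watsonx.ai text-generation POST using only the
# Python standard library. TOKEN and project_id are placeholders
# (assumptions) that must be replaced with real values before running.
import json
import urllib.request

URL = "https://us-south.ml.cloud.ibm.com/ml/v1/text/generation?version=2023-05-29"
TOKEN = "<your IAM bearer token>"  # placeholder

payload = {
    "input": "Input: Provide 5 examples of a business process "
             "for a large insurance company Output:",
    "parameters": {
        "decoding_method": "greedy",
        "max_new_tokens": 2000,
        "min_new_tokens": 0,
        "stop_sequences": [],
        "repetition_penalty": 1,
    },
    "model_id": "meta-llama/llama-3-70b-instruct",
    "project_id": "<your project id>",  # placeholder
}

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {TOKEN}",
}

def build_request():
    """Return the url, headers, and serialized JSON body for inspection."""
    return URL, headers, json.dumps(payload)

if __name__ == "__main__":
    url, hdrs, body = build_request()
    req = urllib.request.Request(
        url, data=body.encode("utf-8"), headers=hdrs, method="POST"
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        print(resp.status, resp.read().decode("utf-8"))
```

Running this (with a valid token and project_id) returns the generated text, which is why I believe the payload itself is fine and the problem is somewhere in the RPA HTTP Request configuration.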
Regards,
Jared
------------------------------
Jared Michalec
------------------------------