Joost,
Are you CLASSIFYING the input utterances? If so, then you probably want to use the classification and training in Watson Assistant. If you are just looking to identify concepts, sentiment, and some other things like that, then use NLU.
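For reference, a minimal sketch of what an NLU call for concepts and sentiment might look like with the ibm-watson Python SDK (the API key, service URL, version date, and sample text below are placeholders, not values from this thread):

```python
from ibm_watson import NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import (
    Features,
    ConceptsOptions,
    SentimentOptions,
)
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials -- substitute your own API key and service URL.
authenticator = IAMAuthenticator("YOUR_APIKEY")
nlu = NaturalLanguageUnderstandingV1(version="2021-08-01", authenticator=authenticator)
nlu.set_service_url("https://api.us-south.natural-language-understanding.watson.cloud.ibm.com")

# Analyze a single text for concepts and sentiment.
response = nlu.analyze(
    text="I am very happy with the migration so far.",
    features=Features(
        concepts=ConceptsOptions(limit=3),
        sentiment=SentimentOptions(),
    ),
).get_result()
print(response)
```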
------------------------------
Daniel Toczala
Community Leader and Customer Success Manager - Watson
dtoczala@us.ibm.com
------------------------------
Original Message:
Sent: Tue November 23, 2021 07:25 AM
From: Joost Vos
Subject: Migrating from Natural Language Classifier to Natural Language Understanding
Hi All,
I am working on the migration of a classifier built with IBM's Natural Language Classifier (NLC) service. As the service has been marked "deprecated", we need to migrate our models to the Natural Language Understanding (NLU) service. The migration itself seems quite straightforward. What I couldn't find is a description of the API request payload details.

With the NLC API, it is possible to analyze 30 sentences in a single API call. Our Python backend splits the sentences to be classified into chunks of 30, and each chunk of 30 sentences is added as the payload of an NLC API call (a sketch of this pattern is below). This is very convenient with regard to analysis speed. I was wondering whether it is also possible to feed chunks of multiple sentences into the payload of an NLU API call. Perhaps someone already has experience with this; I am curious to learn whether it is possible!
Best regards,
Joost
------------------------------
Joost Vos
------------------------------
#EmbeddableAI
#WatsonAPIs