

Beta Release of Conversational Search for IBM watsonx Assistant, Powered by the IBM Granite LLM

By MARIA MOREL posted Wed October 11, 2023 07:50 PM


Today, customers expect instant and more gratifying ways to engage with your business. They don’t want to interact with unhelpful chatbots or be transferred to automated systems that waste their valuable time. They want fast, accurate answers to their questions and effortless support experiences. Organizations face immense pressure to modernize their infrastructure and improve digital customer experiences by adopting emerging generative AI technology. But the stakes are high: integrating Large Language Models and generative AI into customer-facing support environments can be incredibly complex and risky without the right expertise, tools, and strategy in place.

At IBM we understand what it takes to deliver outcome-centric experiences that scale across your business. For years, IBM has included Large Language Models in its market-leading conversational AI platform, IBM watsonx Assistant, to help organizations build more intuitive AI Assistants that deliver seamless, hyper-personalized experiences to their customers. Now with advancements in generative AI technology, we see new opportunities to introduce innovative features across new and exciting customer care use cases.

Generative AI is changing the game

Businesses have explored the benefits of Enterprise Search for conversational AI-powered customer service experiences for some time now. The ability to search and extract data from business content to power AI Assistants has proved quite successful as a traditional fallback for customers’ unexpected questions: from answering queries beyond the scope of an AI Assistant to empowering customer service agents to address complex customer issues.

New Large Language Models tailored for enterprise use cases are making it much easier to scale AI Assistants for business, improving both the range of topics and the quality of answers your AI Assistant can cover with no training.

Introducing Conversational Search for watsonx Assistant  

We are excited to announce the beta release of Conversational Search in watsonx Assistant. Powered by IBM’s Granite LLM and the Watson Discovery enterprise search engine, Conversational Search scales conversational answers grounded in your business content so your AI Assistants can deliver faster, more accurate answers to your customers.
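To make the grounding idea concrete, here is a minimal, hypothetical sketch of the retrieval-augmented pattern described above: a search step retrieves relevant business content, and the retrieved passages are assembled into a prompt that constrains the LLM's answer. The toy keyword scorer below merely stands in for an enterprise search engine like Watson Discovery, and all names and scoring logic are illustrative assumptions, not the watsonx Assistant API.

```python
from dataclasses import dataclass

@dataclass
class Passage:
    """A unit of business content returned by search."""
    title: str
    text: str

# Illustrative business content a company might index.
DOCS = [
    Passage("Returns policy", "Items may be returned within 30 days with a receipt."),
    Passage("Shipping", "Standard shipping takes three to five business days."),
]

def retrieve(query: str, docs: list[Passage], k: int = 1) -> list[Passage]:
    """Toy keyword-overlap retrieval standing in for an enterprise search engine."""
    q = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: -len(q & set(d.text.lower().split())))
    return ranked[:k]

def build_grounded_prompt(query: str, passages: list[Passage]) -> str:
    """Assemble a prompt so the LLM answers only from the retrieved content."""
    context = "\n".join(f"[{p.title}] {p.text}" for p in passages)
    return (
        "Answer the question using only the passages below.\n"
        f"Passages:\n{context}\n"
        f"Question: {query}\nAnswer:"
    )

question = "How many days do I have to return an item?"
hits = retrieve(question, DOCS)
prompt = build_grounded_prompt(question, hits)
# `prompt` would then be sent to the LLM; the grounding passages limit
# the model to answers supported by the retrieved business content.
```

In a production system, the retrieval step would be a real search engine query and the prompt would be passed to the generation model; the key point is that the answer is constrained by retrieved content rather than the model's open-ended knowledge.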

Last month, IBM announced the General Availability of Granite, IBM Research’s latest foundation model series, designed to accelerate the adoption of generative AI into business applications and workflows with trust and transparency. Now, with this beta release, users can apply a Granite LLM pre-trained on enterprise-specialized datasets to Enterprise Search within watsonx Assistant.

Conversational Search expands the range of user queries your AI Assistant can handle while boosting your teams’ productivity by reducing build time and manual authoring of conversational flows. Users of the Plus or Enterprise plans of watsonx Assistant can now request early access. Contact your IBM Representative for exclusive access to the Conversational Search beta, or schedule a demo with one of our experts.

Learn more about this new feature by reading the recent announcement.