watsonx.ai


 Can I switch the LLM from OpenAI/DeepSeek to models hosted by watsonx.ai in my current create_react_agent? Is watsonx.ai as compatible with the OpenAI API as DeepSeek is?

Gaurav Pant posted Wed July 16, 2025 03:35 AM

Hello everyone,
I'm currently using OpenAI and DeepSeek as the LLMs powering my create_react_agent.


Is it possible to keep using the same ChatOpenAI class (as I currently do for OpenAI and DeepSeek), or is ChatWatsonx the only way to use watsonx.ai models (both IBM-hosted models and ones I deploy myself)? ( ChatWatsonx | 🦜️🔗 LangChain )

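For comparison, the ChatWatsonx route from the linked docs would look roughly like this. This is a minimal sketch, assuming langchain-ibm is installed; the region URL is a placeholder and the `apikey`/`project_id` parameter names follow the langchain-ibm documentation, so double-check them against your installed version:

```python
def build_watsonx_model(model_id, api_key, project_id,
                        url="https://us-south.ml.cloud.ibm.com"):
    """Construct a chat model for an IBM-hosted watsonx.ai foundation model.

    Requires `pip install langchain-ibm`. The import is done lazily so this
    helper only needs the package when it is actually called.
    """
    from langchain_ibm import ChatWatsonx

    return ChatWatsonx(
        model_id=model_id,      # e.g. an IBM Granite model id
        url=url,                # region endpoint (placeholder value here)
        apikey=api_key,         # IBM Cloud API key
        project_id=project_id,  # watsonx.ai project the model runs in
    )
```

The resulting model object implements LangChain's chat-model interface, so it should drop into `create_react_agent(model, tools, system_prompt)` the same way a ChatOpenAI instance does.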

 
My current code:

    if provider == "Deepseek":
        model = ChatOpenAI(model=model_id, openai_api_key=api_key, base_url="https://api.deepseek.com")
    else:
        model = ChatOpenAI(model=model_id, openai_api_key=api_key)

    agent = create_react_agent(model, tools, system_prompt)


Ideal setup for me:

    if provider == "Deepseek":
        model = ChatOpenAI(model=model_id, openai_api_key=api_key, base_url="https://api.deepseek.com")
    elif provider == "OpenAI":
        model = ChatOpenAI(model=model_id, openai_api_key=api_key)
    else:
        model = ChatOpenAI(model=model_id_from_watsonx, openai_api_key=api_key_from_watsonx, base_url=base_url_watsonx)

    agent = create_react_agent(model, tools, system_prompt)
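The three-way dispatch could be wrapped in a helper like the one below. This is only a sketch: whether the watsonx branch works through ChatOpenAI depends on watsonx.ai exposing an OpenAI-compatible endpoint at `base_url`, which is exactly the open question here; if it does not, that branch would need to construct a ChatWatsonx from langchain-ibm instead.

```python
def build_model(provider, model_id, api_key, base_url=None):
    """Return a LangChain chat model for the given provider.

    Hypothetical helper mirroring the dispatch above. The provider check
    happens before the lazy import, so unknown providers fail fast even
    when langchain-openai is not installed.
    """
    if provider not in ("Deepseek", "OpenAI", "Watsonx"):
        raise ValueError(f"Unknown provider: {provider}")

    from langchain_openai import ChatOpenAI

    if provider == "Deepseek":
        return ChatOpenAI(model=model_id, openai_api_key=api_key,
                          base_url="https://api.deepseek.com")
    if provider == "OpenAI":
        return ChatOpenAI(model=model_id, openai_api_key=api_key)
    # "Watsonx": assumes an OpenAI-compatible endpoint at base_url --
    # unverified; swap in ChatWatsonx from langchain-ibm if it is not.
    return ChatOpenAI(model=model_id, openai_api_key=api_key, base_url=base_url)
```

Usage would stay identical to the current code: `agent = create_react_agent(build_model(provider, model_id, api_key), tools, system_prompt)`.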