Hello everyone,
I'm currently using OpenAI and DeepSeek as the LLMs powering my create_react_agent.
Is it possible to use the same ChatOpenAI class (like I currently do for OpenAI and DeepSeek), or is ChatWatsonx the only way to use watsonx.ai models (both IBM-hosted and ones I deploy myself)? (ChatWatsonx | 🦜️🔗 LangChain)
My current code :
```python
if provider == "Deepseek":
    model = ChatOpenAI(model=model_id, openai_api_key=api_key, base_url="https://api.deepseek.com")
else:
    model = ChatOpenAI(model=model_id, openai_api_key=api_key)

agent = create_react_agent(model, tools, system_prompt)
```
Ideal setup for me :
```python
if provider == "Deepseek":
    model = ChatOpenAI(model=model_id, openai_api_key=api_key, base_url="https://api.deepseek.com")
elif provider == "OpenAI":
    model = ChatOpenAI(model=model_id, openai_api_key=api_key)
else:
    model = ChatOpenAI(model=model_id_from_watsonx, openai_api_key=api_key_from_watsonx, base_url=base_url_watsonx)

agent = create_react_agent(model, tools, system_prompt)
```
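For comparison, the ChatWatsonx route I'm asking about would look roughly like this — a sketch based on the LangChain docs page linked above, assuming the `langchain-ibm` package is installed; the `project_id_from_watsonx` variable and the exact parameter names are my assumptions, not something I've verified against my deployment:

```python
# Sketch of the ChatWatsonx alternative (requires `pip install langchain-ibm`).
# All credential/URL values here are placeholders from my existing config,
# and project_id_from_watsonx is a hypothetical variable for illustration.
from langchain_ibm import ChatWatsonx

model = ChatWatsonx(
    model_id=model_id_from_watsonx,        # e.g. an ibm/granite model id
    url=base_url_watsonx,                  # watsonx.ai endpoint URL
    project_id=project_id_from_watsonx,    # watsonx project (or space) id
    apikey=api_key_from_watsonx,
)

agent = create_react_agent(model, tools, system_prompt)
```

If ChatOpenAI can be pointed at watsonx the way I point it at DeepSeek via `base_url`, I'd prefer that so I keep a single code path for all providers.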