I am trying to use CrewAI with a Databricks-hosted LLM (such as Llama 2 instruct or DBRX) via the databricks_langchain integration. I want CrewAI to use only my Databricks LLM, without requiring OpenAI authentication.
However, even when I pass `ChatDatabricks` to both the Agent and the Crew, CrewAI still tries to call OpenAI and fails with an authentication error. If I instead try to define a custom LLM, I get `litellm.AuthenticationError: DatabricksException`.
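Here is a minimal sketch of what I am doing (the endpoint name, role, and task text are placeholders for my actual setup):

```python
from crewai import Agent, Crew, Task
from databricks_langchain import ChatDatabricks

# Databricks model serving endpoint (placeholder name)
llm = ChatDatabricks(endpoint="databricks-dbrx-instruct")

agent = Agent(
    role="Researcher",
    goal="Answer questions using only the Databricks-hosted model",
    backstory="A helpful assistant running inside a Databricks notebook",
    llm=llm,  # passing the LangChain ChatDatabricks object directly
)

task = Task(
    description="Summarize the input text",
    expected_output="A short summary",
    agent=agent,
)

crew = Crew(agents=[agent], tasks=[task])

# This is where it fails: CrewAI still attempts OpenAI authentication,
# even though no OpenAI model was configured anywhere.
result = crew.kickoff()
```

I have also tried replacing `ChatDatabricks` with CrewAI's own `LLM` class (e.g. `LLM(model="databricks/databricks-dbrx-instruct")`), which is when I get the `litellm.AuthenticationError` instead.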
- What is the correct way to configure CrewAI to use only a Databricks-hosted LLM?
- Should I pass the LLM only to the Agent, or also to the Crew?
- Are there any best practices or working examples for this setup in Databricks notebooks?
Thank you!