I have followed the steps below:
1) Created a serving endpoint for the external model gpt-4-turbo, providing the Azure OpenAI endpoint and key.
2) Now, using LangChain, I am trying to connect to the endpoint and invoke the model from a notebook:
from langchain_community.chat_models import ChatDatabricks

model = ChatDatabricks(target_uri="databricks", endpoint="test", temperature=0.99)
response = model.invoke(messages)
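For context, a minimal sketch of what `messages` should look like and how it is forwarded: `ChatDatabricks.invoke()` takes LangChain chat messages, and the external-model endpoint relays them to Azure OpenAI in the OpenAI chat format. A 500 often means the request reached Databricks but the upstream provider rejected it, so it is worth checking the message shape first. Everything below (the sample messages, the `to_openai_messages` helper) is illustrative, not Databricks code:

```python
# Sketch of the OpenAI-style chat payload an external-model serving
# endpoint forwards to Azure OpenAI. The helper and sample content are
# hypothetical; only the {"role": ..., "content": ...} schema matters.
def to_openai_messages(pairs):
    """Map (role, content) pairs to OpenAI chat-format messages.
    LangChain message types map as: system->system, human->user, ai->assistant."""
    role_map = {"system": "system", "human": "user", "ai": "assistant"}
    return [{"role": role_map[r], "content": c} for r, c in pairs]

payload = {
    "messages": to_openai_messages(
        [("system", "You are a helpful assistant."), ("human", "What is MLflow?")]
    ),
    "temperature": 0.99,
}
print(payload["messages"][0])
# → {'role': 'system', 'content': 'You are a helpful assistant.'}
```

If `messages` in the notebook is a plain string or a malformed list rather than messages in this shape, the upstream call can fail even though the endpoint itself is configured correctly.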
But I am getting the following error:
HTTPError: 500 Server Error: Received error from openai for url: https://{test}-c2.azuredatabricks.net/serving-endpoints/testpocC/invocations.
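The generic 500 usually wraps a more specific error that Databricks received from OpenAI (wrong Azure deployment name, bad API version, expired key, content filter, etc.), so the first debugging step is to read the response body instead of just the status line. A small sketch of pulling a readable message out of an error body; the JSON shapes handled here are assumptions based on common serving-endpoint responses, so adjust the keys to what you actually see:

```python
import json

def extract_provider_error(body: str) -> str:
    """Surface a human-readable message from a serving-endpoint error body.
    Handles two assumed shapes: {"error_code": ..., "message": ...}
    and {"error": {"message": ...}}; falls back to the raw body."""
    try:
        data = json.loads(body)
    except json.JSONDecodeError:
        return body
    if isinstance(data.get("error"), dict):
        return data["error"].get("message", body)
    return data.get("message", body)

sample = '{"error_code": "INTERNAL_ERROR", "message": "Received error from openai"}'
print(extract_provider_error(sample))
# → Received error from openai
```

With a requests-based client you can get the body by catching the exception and reading `e.response.text`; querying the endpoint directly (outside LangChain) with the same messages also helps isolate whether the problem is the endpoint configuration or the notebook code.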
Is any other setting required, or is there an issue in the code?