Generative AI
Explore discussions on generative artificial intelligence techniques and applications within the Databricks Community. Share ideas, challenges, and breakthroughs in this cutting-edge field.

Not able to invoke external model

Himali_K
New Contributor II

I have followed the steps below:

1) Created a serving endpoint for the external model gpt-4-turbo, providing the Azure OpenAI endpoint and key

2) Now, using Langchain, I am trying to connect to and invoke the model from a notebook:

```python
model = ChatDatabricks(target_uri="databricks", endpoint="test", temperature=0.99)
response = model.invoke(messages)
```

But I am getting this error:
HTTPError: 500 Server Error: Received error from openai for url: https://{test}-c2.azuredatabricks.net/serving-endpoints/testpocC/invocations.

Is any other setting required, or is there an issue in the code?

1 REPLY 1

mark_ott
Databricks Employee

Based on your description, you are encountering a 500 Server Error when trying to use the Langchain ChatDatabricks integration with a Databricks Serving Endpoint connected to an external OpenAI GPT-4 Turbo model on Azure. This error usually indicates an issue on the server side or with your endpoint configuration, not just the client code. Here's how to troubleshoot:

Possible Causes and Solutions

1. Endpoint Name Mismatch

  • Double-check if your Databricks serving endpoint name matches exactly in both the Databricks UI and your code. In your error, the URL ends with /serving-endpoints/testpocC/invocations, but you used endpoint="test" in your code. This mismatch can result in a failed request.

    • Solution: Make sure endpoint="testpocC" if your actual endpoint is testpocC.

2. Model Registration and Permissions

  • Make sure the model is properly registered on Databricks and the serving endpoint is running and healthy.

  • Check that your Azure OpenAI key and endpoint are correctly configured in the Databricks UI when setting up the external model. Double-check secret scopes and keys if you are using Databricks secrets.

3. API Key and Endpoint Configuration

  • Ensure the Azure OpenAI API key, resource name, and deployment name are correct.

  • If there's a typo in the key or the endpoint, authentication will fail internally when Databricks tries to forward the request.
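As a sanity check, the Azure-side fields for an external model endpoint can be laid out as a config dict like the one below. This is a sketch of the shape the Databricks external-models API expects; the scope, key name, resource URL, deployment name, and API version are all placeholders, not values from this thread, and must be replaced with what Azure OpenAI actually issued:

```python
# Sketch of an external-model endpoint config (hypothetical names throughout).
# A typo in any of these Azure-side fields tends to surface as a 500 from the
# serving endpoint at invocation time, not as a client-side error.
external_model_config = {
    "served_entities": [
        {
            "external_model": {
                "name": "gpt-4-turbo",
                "provider": "openai",
                "task": "llm/v1/chat",
                "openai_config": {
                    "openai_api_type": "azure",
                    # Reference a Databricks secret rather than pasting a literal key
                    "openai_api_key": "{{secrets/my-scope/azure-openai-key}}",
                    "openai_api_base": "https://my-resource.openai.azure.com/",
                    "openai_deployment_name": "gpt-4-turbo",
                    "openai_api_version": "2024-02-01",
                },
            }
        }
    ]
}

# A config of this shape would typically be passed when creating the endpoint,
# e.g. via the MLflow deployments client (requires a live workspace):
# from mlflow.deployments import get_deploy_client
# get_deploy_client("databricks").create_endpoint(name="testpocC", config=external_model_config)
```

Verifying each of these fields against the Azure portal is often faster than re-creating the endpoint from scratch.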

4. Databricks Workspace/Token Permissions

  • Confirm that your Databricks workspace user has permission to invoke the serving endpoint.

5. Langchain Integration

  • Your code snippet seems structurally correct. The main culprit is likely upstream (endpoint naming, API configuration, or permissions).

  • If the endpoint requires a different way of sending messages (e.g., invoke() expects a dict vs a string), check the Langchain documentation for the required message structure.
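For reference, a minimal message structure that chat endpoints accept is the OpenAI-style list of role/content dicts sketched below (the content strings are illustrative). ChatDatabricks.invoke() also accepts Langchain message objects such as SystemMessage/HumanMessage, which map to the same structure on the wire:

```python
# Minimal chat payload in the OpenAI-style role/content format.
# Passing a bare string or a malformed dict here is a common source of
# confusing server-side errors when invoking chat serving endpoints.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize what Databricks Model Serving does."},
]

# response = model.invoke(messages)  # requires a live, healthy endpoint
```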

6. Serving Endpoint Logs

  • Check logs on the Databricks endpoint for more detailed error messages. Typically, a 500 error will be accompanied by a more descriptive log message within the Databricks "Serving Endpoints" UI.
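Alongside the logs, one way to bypass Langchain and see the raw error body is to call the endpoint's REST URL directly. The helper below only builds the request (the workspace host and token shown are placeholders, and the actual POST is left commented out):

```python
def build_invocation_request(workspace_url: str, endpoint_name: str, token: str):
    """Build the URL, headers, and payload for a direct serving-endpoint call.

    Calling the endpoint this way (e.g. with requests.post) bypasses Langchain
    entirely, so the response body shows the underlying error that the 500 wraps.
    """
    url = f"{workspace_url}/serving-endpoints/{endpoint_name}/invocations"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    payload = {"messages": [{"role": "user", "content": "ping"}]}
    return url, headers, payload

# Hypothetical workspace host and token:
url, headers, payload = build_invocation_request(
    "https://adb-1234567890.12.azuredatabricks.net", "testpocC", "dapi-REDACTED"
)
# import requests
# resp = requests.post(url, headers=headers, json=payload)
# print(resp.status_code, resp.text)  # the body usually names the real cause
```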


Example Checklist

| Step | What to Check |
| --- | --- |
| Endpoint Name | Correct name in both code and Databricks UI? |
| Endpoint URL | Correctly formed (no typos, fully qualified domain)? |
| Azure Key/Config | Secrets and keys match those provided by Azure OpenAI? |
| Permissions/Access | User/service principal has access to the serving endpoint? |
| Langchain Message Format | Messages structured per Langchain API expectations? |
| Endpoint Health | Endpoint status in Databricks UI is "Healthy"? |
| Logs | Any specific error details in the Databricks serving endpoint logs? |

Example Code Correction

```python
# Example of consistent endpoint naming:
model = ChatDatabricks(target_uri="databricks", endpoint="testpocC", temperature=0.99)
response = model.invoke(messages)
```

When to Contact Support

If after all checks (endpoint, keys, permissions, logs) you still get a 500, consider:

  • There may be a misconfiguration in Databricks external model setup.

  • Contact Databricks support with endpoint logs and Azure OpenAI validation screenshots for deeper assistance.


Summary:
The most common causes are endpoint naming mismatches, authorization/configuration errors on Databricks or Azure, or misformatted messages to Langchain. Verify your endpoint name, configuration, permissions, and examine endpoint logs for actionable errors.
