10-10-2025 12:43 PM
I'm getting the following error message "{"error_code": "INTERNAL_ERROR", "message": "The server received an invalid response from an upstream server."}" when making a call to the bge-large-en embedding model.
10-10-2025 01:18 PM
Seems to me like a rate-limit issue. Can you please confirm that the rate limit is not set to zero?
10-10-2025 03:01 PM
I don't see that option on my serving endpoint.
Also, if there's a rate limit, I'd expect to receive a 429 with a relevant message, not a 500.
10-10-2025 03:25 PM
Can you please click on Edit AI Gateway? It will show you the rate limit. Please share a screenshot.
10-10-2025 03:30 PM
No rate limit is enabled.
10-10-2025 04:03 PM
Yeah, your rate limit seems to be fine. Can you also check the following points?
1. Use the Databricks-specific name (e.g., databricks-bge-large-en), not the Hugging Face model name. Check in Serving → Endpoints.
2. Validate the payload format:
{ "input": "text to embed" }
3. Test via the Databricks UI: use the Query endpoint button on the Serving page. If that works, the issue is in your client config.
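To check point 2 outside the UI, the payload can also be built and inspected before sending it to the endpoint's REST `/invocations` path. A minimal sketch (the workspace host is the one from this thread; replace it with your own, and `build_request` is an illustrative helper, not a Databricks API):

```python
import json
import os

# Workspace host taken from this thread; replace with your own workspace URL.
WORKSPACE = "https://e2-demo-field-eng.cloud.databricks.com"
ENDPOINT = f"{WORKSPACE}/serving-endpoints/databricks-bge-large-en/invocations"

def build_request(text):
    """Return (headers, body) for a raw invocations call."""
    headers = {
        "Authorization": f"Bearer {os.environ.get('DATABRICKS_TOKEN', '')}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"input": text})  # the expected embedding payload shape
    return headers, body

headers, body = build_request("text to embed")
# Send with, e.g., requests.post(ENDPOINT, headers=headers, data=body)
print(body)  # {"input": "text to embed"}
```

If the raw call returns a vector but your client does not, the problem is in the client configuration rather than the endpoint.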
10-10-2025 09:02 PM
I have been using databricks-bge-large-en.
Not sure what is meant by client config. I've been using this model for two years now. Most of the time it works, but today and two days ago it stopped working intermittently.
a month ago
The problem was not resolved. Any thoughts as to what else could have happened?
a month ago
Hi @tefrati - Can you please go to the model serving endpoint, click on the Use dropdown (as shown in the picture below), and try the simple Python or SQL to see if that works for you.
from openai import OpenAI
import os

# How to get your Databricks token: https://docs.databricks.com/en/dev-tools/auth/pat.html
DATABRICKS_TOKEN = os.environ.get('DATABRICKS_TOKEN')
# Alternatively, in a Databricks notebook you can use this:
# DATABRICKS_TOKEN = dbutils.notebook.entry_point.getDbutils().notebook().getContext().apiToken().get()

client = OpenAI(
    api_key=DATABRICKS_TOKEN,
    base_url="https://e2-demo-field-eng.cloud.databricks.com/serving-endpoints"
)

embeddings = client.embeddings.create(
    input='Your string for the embedding model goes here',
    model="databricks-bge-large-en"
)
print(embeddings.data[0].embedding)
SELECT ai_query('databricks-bge-large-en',
  request => '<Please provide your input string here!>')
3 weeks ago
Hi there,
I've run the Python script and got the vector back. Also, I've been using the model and the endpoint for two years now, so there's no reason it shouldn't work.
However, I'm getting the same error again now, this time from a Databricks SELECT query: