2 weeks ago
I'm getting the following error when making a call to the bge-large-en embedding model: {"error_code": "INTERNAL_ERROR", "message": "The server received an invalid response from an upstream server."}
2 weeks ago
This looks to me like a rate limit issue. Can you please confirm that the rate limit is not set to zero?
2 weeks ago
I don't see that option on my serving endpoint.
Also, if there were a rate limit, I'd expect to receive a 429 with a relevant message, not a 500.
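For what it's worth, a client can treat the two cases differently. The sketch below is illustrative only (the function name, attempt count, and backoff are my assumptions, not part of any Databricks SDK): retry both 429s and transient 5xxs with exponential backoff, and fail fast on everything else.

```python
import random
import time

def call_with_retries(send_request, max_attempts=4, base_delay=1.0):
    """Retry transient statuses (429 rate limit, 5xx upstream errors);
    raise immediately on anything else.

    send_request is a zero-argument callable returning (status, body).
    """
    for attempt in range(max_attempts):
        status, body = send_request()
        if status == 200:
            return body
        if status == 429 or 500 <= status < 600:
            # 429 = rate limited, 5xx = upstream trouble (like the
            # INTERNAL_ERROR above); both are worth a short retry
            # with exponential backoff plus jitter.
            time.sleep(base_delay * (2 ** attempt) + random.random() * base_delay)
            continue
        raise RuntimeError(f"Unrecoverable status {status}: {body}")
    raise RuntimeError(f"Giving up after {max_attempts} attempts")
```

Since the failures reported here are intermittent, a wrapper like this often masks them entirely; if it still exhausts all attempts, that points at a persistent server-side problem rather than client load.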
2 weeks ago
Can you please click on Edit AI Gateway? It will show you the rate limit. Please share a screenshot.
2 weeks ago
No rate limiting is enabled.
2 weeks ago
Yeah, your rate limit looks fine. Can you also check the following points?
1. Use the Databricks-specific name (e.g., databricks-bge-large-en), not the Hugging Face model name. Check under Serving > Endpoints.
2. Validate the payload format:
{ "input": "text to embed" }
3. Test via the Databricks UI: use Query endpoint on the Serving page. If that works, the issue is in your client configuration.
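To check point 2 before anything goes over the wire, you can build and sanity-check the request body locally. This is just a sketch (the helper name is mine); the key name `input` matches the payload shown above.

```python
import json

def build_embedding_payload(text):
    """Build and sanity-check the JSON body for an embeddings request."""
    if not isinstance(text, str) or not text:
        raise ValueError("input must be a non-empty string")
    payload = {"input": text}
    # Round-trip through json to confirm the body serializes cleanly.
    return json.dumps(payload)

print(build_embedding_payload("text to embed"))
```

If the serialized body matches the documented shape and the UI query works, the remaining suspects are the client-side URL, auth token, or headers.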
2 weeks ago
I have been using databricks-bge-large-en.
Not sure what is meant by client config. I've been using this model for two years now. Most of the time it works, but today and two days ago it stopped working intermittently.
a week ago
The problem was not resolved. Any thoughts as to what else could have happened?
a week ago
Hi @tefrati - Can you please go to the model serving endpoint, click on the Use dropdown (as shown in the picture below), and try the simple Python or SQL below to see if that works for you?
from openai import OpenAI
import os

# How to get your Databricks token: https://docs.databricks.com/en/dev-tools/auth/pat.html
DATABRICKS_TOKEN = os.environ.get('DATABRICKS_TOKEN')
# Alternatively, in a Databricks notebook you can use this:
# DATABRICKS_TOKEN = dbutils.notebook.entry_point.getDbutils().notebook().getContext().apiToken().get()

client = OpenAI(
    api_key=DATABRICKS_TOKEN,
    base_url="https://e2-demo-field-eng.cloud.databricks.com/serving-endpoints"
)

embeddings = client.embeddings.create(
    input='Your string for the embedding model goes here',
    model="databricks-bge-large-en"
)

print(embeddings.data[0].embedding)
SELECT ai_query('databricks-bge-large-en',
    request => '<Please provide your input string here!>')
Thursday
Hi there,
I've run the Python script and got the vector back. Also, I've been using this model and endpoint for two years now, so there's no reason it shouldn't work.
However, I'm getting the same error again right now, this time from a Databricks SELECT query.