
Update model serving endpoint

pablobd
Contributor II

Hi all,

I've been able to create a model serving endpoint through the API, following the docs, but when I try to update the model version, I get the following error:

'{"error_code":"RESOURCE_ALREADY_EXISTS","message":"Endpoint with name \'ml-project\' already exists."}'

Appreciate your help and insights.
Thanks
2 REPLIES

Aaron12
New Contributor III

I was also searching for a solution to this problem when I came across your post about the same issue. The approach below seems to work well: it updates the config of the existing endpoint, or creates a new one if it doesn't exist.

import requests

# Prerequisites defined elsewhere: db_host_url, headers (with a valid Databricks token),
# endpoint_name, model_name, model_version, secret_scope_name, secret_key_name

# Define the endpoint config
data = {
    "name": endpoint_name,
    "config": {
        "served_models": [
            {
                "model_name": model_name,
                "model_version": model_version,
                "workload_size": "Small",
                "scale_to_zero_enabled": True,
                "environment_vars": {
                    "OPENAI_API_KEY": f"{{{{secrets/{secret_scope_name}/{secret_key_name}}}}}"
                }
            }
        ]
    }
}

# Get list of active endpoints
endpoint_list_response = requests.get(
    url=f"https://{db_host_url}/api/2.0/serving-endpoints",
    headers=headers
)

# Check if endpoint already exists
endpoints = endpoint_list_response.json().get("endpoints", [])
endpoint_exists = any(ep['name'] == endpoint_name for ep in endpoints)
print("Endpoint exists:", endpoint_exists)

# Update endpoint config if it exists
if endpoint_exists:
    update_response = requests.put(
        url=f"https://{db_host_url}/api/2.0/serving-endpoints/{endpoint_name}/config",
        json=data["config"],
        headers=headers
    )

    print("Update Response status:", update_response.status_code)
    print("Update Response text:", update_response.text, "\n")

# Create endpoint if it doesn't exist
else:
    create_response = requests.post(
        url=f"https://{db_host_url}/api/2.0/serving-endpoints",
        json=data,
        headers=headers
    )

    print("Create Response status:", create_response.status_code)
    print("Create Response text:", create_response.text, "\n")

BR_DatabricksAI
Contributor

Folks, as an alternative, you can also deploy models (and different model versions) to the serving layer. I am using MLflow for this.
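Roughly, it looks like this with the MLflow deployments client (a simplified sketch, assuming mlflow >= 2.9; the endpoint name, model name and version below are placeholders):

from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")

config = {
    "served_models": [
        {
            "model_name": "my_model",      # placeholder registered model name
            "model_version": "2",          # the version you want to serve
            "workload_size": "Small",
            "scale_to_zero_enabled": True
        }
    ]
}

# First deployment: create the endpoint
client.create_endpoint(name="my-endpoint", config=config)

# Later version bumps: update the existing endpoint's config
client.update_endpoint(endpoint="my-endpoint", config=config)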

You may also refer to the link below if it is helpful:

How to Quickly Deploy, Test & Manage ML Models as REST Endpoints with Databricks - The Databricks Bl...

Thanks.
