Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Forum Posts

johndoe99012
by New Contributor
  • 85 Views
  • 4 replies
  • 1 kudos

How to serve a Unity Catalog ML model to external usage

Hello everyone, I am following this notebook tutorial: https://docs.databricks.com/en/machine-learning/manage-model-lifecycle/index.html#example-notebook Now I can register a machine learning model in Unity Catalog, but the tutorial only shows how to u...

Latest Reply
filipniziol
Contributor III
  • 1 kudos

Hi @johndoe99012, if the answer resolved your question, please consider marking it as the solution. It helps others in the community find answers more easily.

3 More Replies
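For external usage, a Unity Catalog model is typically exposed behind a Model Serving endpoint and queried over REST. Below is a minimal sketch of building such a request; the workspace host, endpoint name, and token are hypothetical placeholders, and the `dataframe_records` payload shape follows the documented serving request format.

```python
import json

def build_scoring_request(host, endpoint_name, token, records):
    """Return (url, headers, body) for a REST call to a serving endpoint."""
    url = f"{host}/serving-endpoints/{endpoint_name}/invocations"
    headers = {
        "Authorization": f"Bearer {token}",  # PAT or OAuth token
        "Content-Type": "application/json",
    }
    body = json.dumps({"dataframe_records": records})
    return url, headers, body

# Hypothetical workspace, endpoint, and token values:
url, headers, body = build_scoring_request(
    "https://my-workspace.cloud.databricks.com",
    "my-uc-model-endpoint",
    "dapi-example-token",
    [{"feature_a": 1.0, "feature_b": 2.0}],
)
```

The resulting `url`, `headers`, and `body` could then be sent with `urllib.request` or the `requests` library from any external application.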
TinSlim
by New Contributor II
  • 769 Views
  • 3 replies
  • 0 kudos

Maximum wait time Databricks Model Serving

Hi, hope you are fine. I deployed a model two or three months ago using Databricks Serving and MLflow. The model worked well using GPU from model serving. I stopped using it for some months, and when I tried deploying it again, it had some errors. 1. [FIXED] A ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Thanks, I will review it and get back. I'll DM you.

2 More Replies
robertol
by New Contributor II
  • 315 Views
  • 2 replies
  • 0 kudos

Resolved! Error in creating a serving endpoint: registered model not found

I have registered a custom model which loads another model in the load_context method. Everything works fine when I load the model (with mlflow.pyfunc.load_model) and use it in a notebook. When I try to create a serving endpoint for it, I keep getting t...

Latest Reply
robertol
New Contributor II
  • 0 kudos

It is registered in Unity Catalog. I have found a completely different solution now. With the help of TransformedTargetRegressor, I no longer need a separate normalisation step and therefore no longer load a model in load_context.

1 More Replies
damselfly20
by New Contributor III
  • 189 Views
  • 2 replies
  • 1 kudos

Endpoint creation without scale-to-zero

Hi, I've got a question about deploying an endpoint for Llama 3.1 8b. The following code should create the endpoint without scale-to-zero. The endpoint is being created, but with scale-to-zero, although scale_to_zero_enabled is set to False. Instead ...

Latest Reply
damselfly20
New Contributor III
  • 1 kudos

Thanks for the reply @Walter_C. This didn't quite work, since it used a CPU and didn't consider the max_provisioned_throughput, but I finally got it to work like this: from mlflow.deployments import get_deploy_client client = get_deploy_client("data...

1 More Replies
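The fix described in the reply amounts to sending a config where the served entity pins provisioned throughput and disables scale-to-zero. Here is a sketch of that config shape, assuming the payload structure accepted by the Databricks serving API via `mlflow.deployments`; the model name, version, and throughput value are hypothetical.

```python
def build_endpoint_config(model_name, model_version, throughput):
    """Endpoint config with provisioned throughput and scale-to-zero off."""
    return {
        "served_entities": [
            {
                "entity_name": model_name,
                "entity_version": model_version,
                "max_provisioned_throughput": throughput,
                "scale_to_zero_enabled": False,  # keep the endpoint warm
            }
        ]
    }

config = build_endpoint_config("system.ai.llama_v3_1_8b_instruct", "1", 9500)
# With the MLflow deployments client this would be passed as:
#   client = get_deploy_client("databricks")
#   client.create_endpoint(name="llama-endpoint", config=config)
```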
hawa
by New Contributor II
  • 237 Views
  • 1 replies
  • 0 kudos

Problem serving a langchain model on Databricks

Hi, I've encountered a problem serving a langchain model I just created successfully on Databricks. I was using the following code to set up a model in Unity Catalog: from mlflow.models import infer_signature; import mlflow; import langchain; mlflow.set_r...

Latest Reply
hawa
New Contributor II
  • 0 kudos

I suspect the issue is coming from this small error I got: "Got error: Must specify a chain Type in config." I used chain_type="stuff" when building the langchain, but I'm not sure how to fix it.

gustavocavsanto
by New Contributor
  • 370 Views
  • 1 reply
  • 0 kudos

Error 401: "Missing authorization details for accessing model serving endpoints" with OAuth Token on

I am trying to generate an OAuth token for my Azure Databricks workspace to access a model serving API in production. The code I’m using generates a token successfully, but I keep receiving a 401 error with the message "Missing authorization details ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

You might need to follow the steps in https://docs.databricks.com/en/machine-learning/model-serving/route-optimization.html#query-route-optimized-model-serving-endpoints on how to fetch an OAuth token programmatically.

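Fetching the OAuth token programmatically means a client-credentials POST to the workspace's `/oidc/v1/token` endpoint. A minimal sketch of building that request follows; the workspace host, client ID, and secret are placeholders, and `all-apis` is the scope commonly used for workspace-level access.

```python
import base64
import urllib.parse

def build_token_request(host, client_id, client_secret):
    """Return (url, headers, data) for an OAuth client-credentials request."""
    url = f"{host}/oidc/v1/token"
    data = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "scope": "all-apis",
    }).encode()
    # HTTP Basic auth carries the service principal's id and secret
    auth = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {auth}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    return url, headers, data

# Hypothetical workspace and service-principal credentials:
url, headers, data = build_token_request(
    "https://my-workspace.cloud.databricks.com", "my-client-id", "my-secret"
)
```

The returned `access_token` from the response body would then go into the `Authorization: Bearer ...` header of the scoring request. A 401 despite a valid token often means the token was minted for the wrong resource or the principal lacks CAN QUERY on the endpoint.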
M_B
by New Contributor
  • 195 Views
  • 1 reply
  • 0 kudos

Serving model with custom scoring script to a real-time endpoint

Hi, new to Databricks here and wasn't able to find relevant info in the documentation. Is it not possible to serve a model with a custom scoring script to an online endpoint on Databricks to customise inference? The customisation is related to incomi...

Latest Reply
HaggMan
New Contributor III
  • 0 kudos

If I'm understanding correctly, all you really want to do is have a pre/post-process function running with your model, is that correct? If so, you can do this by using the MLflow pyfunc model, something like they do here: https://docs.databricks.com/en/machi...

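The pre/post-processing pattern the reply describes can be sketched as a wrapper class with the `predict(self, context, model_input)` shape MLflow's pyfunc expects. This is a plain class for readability; in a real deployment it would subclass `mlflow.pyfunc.PythonModel` and be logged with `mlflow.pyfunc.log_model`. The clipping and thresholding steps are hypothetical examples of "custom scoring".

```python
class PrePostModel:
    """Wraps an inner model with custom pre- and post-processing."""

    def __init__(self, inner_model):
        self.inner = inner_model

    def _preprocess(self, rows):
        # Example: clip incoming features into the [0, 1] range
        return [[max(0.0, min(1.0, x)) for x in row] for row in rows]

    def _postprocess(self, raw_scores):
        # Example: turn raw scores into labels
        return ["positive" if s >= 0.5 else "negative" for s in raw_scores]

    def predict(self, context, model_input):
        cleaned = self._preprocess(model_input)
        raw = self.inner(cleaned)
        return self._postprocess(raw)

# Stand-in "model": mean of each row
toy_model = lambda rows: [sum(r) / len(r) for r in rows]
wrapped = PrePostModel(toy_model)
print(wrapped.predict(None, [[0.9, 0.9], [-1.0, -1.0]]))
# → ['positive', 'negative']
```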
RobinK
by Contributor
  • 388 Views
  • 1 reply
  • 0 kudos

Populate client_request_id in Model Serving inference table

Hi, the documentation for the model serving inference table states that the client_request_id column is typically null. How can I populate this column with a request ID from the calling .NET application when invoking the model via the Databricks REST ...

Latest Reply
RobinK
Contributor
  • 0 kudos

I finally found it in the docs: https://docs.databricks.com/en/machine-learning/model-serving/inference-tables.html#specify-client_request_id
{ "client_request_id": "<user-provided-id>", "dataframe_records": [...] }

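In other words, the caller adds a top-level `client_request_id` field to the scoring payload itself. A small sketch of assembling that payload (the id here is just a generated UUID; any caller-side correlation id works):

```python
import json
import uuid

def scoring_payload(records, client_request_id=None):
    """Build a serving request body, optionally tagged with a request id."""
    payload = {"dataframe_records": records}
    if client_request_id is not None:
        # Lands in the inference table's client_request_id column
        payload["client_request_id"] = client_request_id
    return json.dumps(payload)

body = scoring_payload([{"x": 1}], client_request_id=str(uuid.uuid4()))
```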
Shumi8
by New Contributor II
  • 3261 Views
  • 4 replies
  • 1 kudos

Databricks MLflow Error: Timed out while evaluating the model.

Hi everyone, I am using Databricks and MLflow to create a model and then register it as a serving endpoint. Sometimes the model takes more than 2 minutes to run, and after 2 minutes it gives a timeout error: Timed out while evaluating the model. Verify...

Latest Reply
Aman-Patkar
New Contributor II
  • 1 kudos

Hi, did you get a solution for this timeout issue?

3 More Replies
Djay101
by New Contributor
  • 370 Views
  • 0 replies
  • 0 kudos

How do we log MLflow models without DBFS?

Hi Databricks Team, we are planning a UC Migration for a customer who currently has around 500 experiments, each with multiple runs. These experiments are registered, and MLflow is logging to DBFS locations. However, we have not found any documentation...

dcunningham1
by New Contributor
  • 448 Views
  • 0 replies
  • 0 kudos

Possible to use `params` argument of `mlflow.pyfunc.PythonModel` deployed to Databricks endpoint?

I'm deploying a custom model using the `mlflow.pyfunc.PythonModel` class as described here. My model uses the `params` argument in the `predict` method to let the user choose some aspects of the model at inference time. For example: class CustomM...

yorabhir
by New Contributor III
  • 426 Views
  • 0 replies
  • 0 kudos

ModuleNotFoundError: No module named 'model_train' when using mlflow.sklearn.load_model

Hello, I have multiple versions of a model registered in the model registry. When I try to load any version other than model version 1 with mlflow.sklearn.load_model(f"models:/{model_name}/{model_version}"), I get ModuleNotFoundError: No module...

TSchmidt
by New Contributor
  • 594 Views
  • 0 replies
  • 0 kudos

Large-scale YOLO inference

I have 50 million images sitting on S3. I have a YOLOv8 model trained with Ultralytics and want to run inference on those images. I suspect I should be running inference using MLflow, but I am confused about how. I don't need to track experiments/traini...

NaeemS
by New Contributor III
  • 2993 Views
  • 8 replies
  • 0 kudos

Feature Store Model Serving endpoint

Hi, I am trying to deploy my model, which was logged by the featureStoreEngineering client, as a serving endpoint in Databricks. But I am facing the following error: The Databricks Lookup client from databricks-feature-lookup and Databricks Feature Store clie...

Latest Reply
robbe
New Contributor III
  • 0 kudos

Hi @damselfly20, unfortunately I can't help much with that, as I've never worked with RAGs. Are you sure it's the same error, though? @NaeemS's errors and mine seem to be Java related, and yours MLflow related.

7 More Replies
RobinK
by Contributor
  • 983 Views
  • 1 reply
  • 1 kudos

Resolved! Vectorsearch ConnectionResetError Max retries exceeded

Hi, we are serving a Unity Catalog langchain model with Databricks model serving. When I run the predict() function on the model in a notebook, I get the expected output. But when I query the served model, errors occur in the service logs: Error messag...

Latest Reply
RobinK
Contributor
  • 1 kudos

Downgrading langchain-community to version 0.2.4 solved my problem.
