Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Forum Posts

RobinK
by Contributor
  • 1076 Views
  • 1 reply
  • 1 kudos

Resolved! Vectorsearch ConnectionResetError Max retries exceeded

Hi, we are serving a Unity Catalog LangChain model with Databricks Model Serving. When I run the predict() function on the model in a notebook, I get the expected output. But when I query the served model, errors occur in the service logs: Error messag...

Latest Reply
RobinK
Contributor
  • 1 kudos

Downgrading langchain-community to version 0.2.4 solved my problem.
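A version conflict like this can also be avoided at logging time by pinning the dependency, so the serving container rebuilds with the exact version that worked in the notebook. A minimal, stdlib-only sketch (the package list is illustrative; in real code the resulting list would be passed as `pip_requirements` to `mlflow.pyfunc.log_model`):

```python
# Sketch: pin langchain-community so the serving environment
# reproduces the working version. The merged list would be passed to
# mlflow.pyfunc.log_model(..., pip_requirements=pinned); mlflow itself
# is not imported here to keep the sketch self-contained.

def pin_requirement(requirements, package, version):
    """Return requirements with `package` pinned to `version`,
    replacing any existing entry for that package."""
    kept = [r for r in requirements if r.split("==")[0].strip() != package]
    return kept + [f"{package}=={version}"]

base = ["mlflow", "langchain", "langchain-community"]
pinned = pin_requirement(base, "langchain-community", "0.2.4")
print(pinned)  # langchain-community now pinned to 0.2.4
```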

rahuja
by Contributor
  • 575 Views
  • 1 reply
  • 0 kudos

Create Databricks Dashboards on MLFlow Metrics

Hello, currently we have multiple ML models running in production that log metrics and other metadata to MLflow. I wanted to ask: is it possible to build Databricks dashboards on top of this data, and can this data be somehow avail...

Latest Reply
rahuja
Contributor
  • 0 kudos

Hello @Retired_mod, thanks for responding. I think you are talking about using the Python API, but we don't want that. Since MLflow also stores metrics in an SQL table, is it possible to expose those tables as part of our metastore and build da...
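One approach worth sketching: pull the metrics out through MLflow's API and land them in a table that dashboards can query (`mlflow.search_runs` returns a pandas DataFrame that can be written out as a Delta table). The stdlib-only sketch below flattens the JSON shape returned by the REST endpoint `/api/2.0/mlflow/runs/search`; the sample payload is illustrative, not real run data:

```python
# Sketch: turn the JSON returned by MLflow's REST endpoint
# /api/2.0/mlflow/runs/search into flat rows that a Databricks
# dashboard (or a Delta table) can consume. The payload below is a
# made-up example following the documented response shape.

sample_response = {
    "runs": [
        {
            "info": {"run_id": "abc123", "experiment_id": "1"},
            "data": {
                "metrics": [
                    {"key": "rmse", "value": 0.42, "timestamp": 1700000000000, "step": 0},
                    {"key": "r2", "value": 0.87, "timestamp": 1700000000000, "step": 0},
                ]
            },
        }
    ]
}

def runs_to_rows(response):
    """Flatten a runs/search response into one row per (run, metric)."""
    rows = []
    for run in response.get("runs", []):
        info = run.get("info", {})
        for metric in run.get("data", {}).get("metrics", []):
            rows.append({
                "run_id": info.get("run_id"),
                "experiment_id": info.get("experiment_id"),
                "metric": metric["key"],
                "value": metric["value"],
                "timestamp": metric["timestamp"],
            })
    return rows

rows = runs_to_rows(sample_response)
# In a notebook, rows could then be written with
# spark.createDataFrame(rows).write.saveAsTable("ml_metrics")
```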

adrianna2942842
by New Contributor III
  • 958 Views
  • 1 reply
  • 0 kudos

Deployment with model serving failed after entering "DEPLOYMENT_READY" state

Hi, I was trying to update the config for an endpoint by adding a new version of an entity (version 7). The new model entered the "DEPLOYMENT_READY" state, but the deployment failed with a timed-out exception. I didn't get any other exception in Build or Se...

Latest Reply
Kumaran
Databricks Employee
  • 0 kudos

Hi @adrianna2942842, Thank you for contacting the Databricks community. May I know how you are loading the model?

simranisanewbie
by New Contributor
  • 1010 Views
  • 0 replies
  • 0 kudos

Pyspark custom Transformer class -AttributeError: 'DummyMod' object has no attribute 'MyTransformer'

I am trying to create a custom transformer as a stage in my pipeline. A few of the transformations I am doing via SparkNLP and the next few using MLlib. To pass the result of SparkNLP transformation at a stage to the next MLlib transformation, I need...

rasgaard
by New Contributor
  • 1422 Views
  • 1 reply
  • 0 kudos

Model Serving Endpoints - Build configuration and Interactive access

Hi there, I have used Databricks Model Serving Endpoints to serve a model which depends on some config files and a custom library. The library has been included by logging the model with the `code_path` argument in `mlflow.pyfunc.log_model` and it...

Latest Reply
robbe
New Contributor III
  • 0 kudos

Hi @rasgaard, one way to achieve that without inspecting the container is to use MLflow artifacts. Artifacts allow you to log files together with your models and reference them inside the endpoint. For example, let's assume that you need to include a ...
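The pattern described here is: pass an `artifacts={...}` mapping to `mlflow.pyfunc.log_model`, then read the resolved local path from `context.artifacts` in `load_context`. Below is a stdlib-only sketch of that flow using a stub context object so it runs outside MLflow; in real code the class would subclass `mlflow.pyfunc.PythonModel`, and MLflow would supply the context with artifact paths inside the serving container:

```python
import json
import os
import tempfile

# Sketch of MLflow's pyfunc artifacts pattern. In real code:
#   mlflow.pyfunc.log_model(..., python_model=ConfigModel(),
#                           artifacts={"config": "config.json"})
# and ConfigModel would subclass mlflow.pyfunc.PythonModel. Here a
# stub context stands in so the sketch is runnable anywhere.

class ConfigModel:
    def load_context(self, context):
        # MLflow resolves artifact names to local paths in the serving
        # container; we read our config file from that path.
        with open(context.artifacts["config"]) as f:
            self.config = json.load(f)

    def predict(self, context, model_input):
        # Illustrative behavior: scale inputs by a configured factor.
        return [x * self.config["scale"] for x in model_input]

class StubContext:
    """Stands in for mlflow's PythonModelContext in this sketch."""
    def __init__(self, artifacts):
        self.artifacts = artifacts

# Demonstrate the flow with a temporary config file.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "config.json")
    with open(path, "w") as f:
        json.dump({"scale": 2}, f)
    model = ConfigModel()
    model.load_context(StubContext({"config": path}))
    print(model.predict(None, [1, 2, 3]))  # → [2, 4, 6]
```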

John22
by New Contributor
  • 1654 Views
  • 1 reply
  • 0 kudos

EasyOcr Endpoint not accepting inputs

Hi all! I am trying to create an endpoint for Easy OCR. I was able to create the experiment using a wrapper class with the code below:  # import libraries import mlflow import mlflow.pyfunc import cloudpickle import cv2 import re import easyocr impo...

Latest Reply
Kumaran
Databricks Employee
  • 0 kudos

Hi @John22, Thank you for posting your question on the Databricks community. First, are you able to infer the output within the notebook itself? Which cloud are you on, AWS or Azure?

amal15
by New Contributor II
  • 1478 Views
  • 1 reply
  • 0 kudos

error: not found: type XGBoostEstimator

error: not found: type XGBoostEstimator Spark & Scala  

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

@amal15 - can you please add the import below to your import statements and see if it works? ml.dmlc.xgboost4j.scala.spark.XGBoostEstimator

BogdanV
by New Contributor III
  • 2918 Views
  • 1 reply
  • 0 kudos

Resolved! Query ML Endpoint with R and Curl

I am trying to get a prediction by querying the ML Endpoint on Azure Databricks with R. I'm not sure what the format of the expected data is. Is there any other problem with this code? Thanks!!!

Latest Reply
BogdanV
New Contributor III
  • 0 kudos

Hi Kaniz, I was able to find the solution. You should post this in the examples shown when you click "Query Endpoint": you only have code for Browser, Curl, Python, and SQL; you should add a tab for R. Here is the solution: library(httr)url <- "https://adb-********...
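For completeness, the same request can be sketched in Python with only the standard library; the `dataframe_split` shape is the documented scoring format for Databricks model-serving endpoints, while the column names, values, URL, and token below are placeholders:

```python
import json

# Sketch: the JSON body and headers a Databricks model-serving
# endpoint expects (dataframe_split format). The R solution in this
# thread builds the same request with httr; values are placeholders.

def scoring_request(columns, rows, token):
    """Build the scoring payload and auth headers for an endpoint."""
    payload = {"dataframe_split": {"columns": columns, "data": rows}}
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return json.dumps(payload), headers

body, headers = scoring_request(["feature1", "feature2"], [[1.0, 2.0]], "<token>")
print(body)
# Send with e.g. urllib.request.Request(url, data=body.encode(),
# headers=headers, method="POST") against the endpoint's invocations URL.
```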

prafull
by New Contributor
  • 1520 Views
  • 0 replies
  • 0 kudos

Create Serving Endpoint with JAVA Runtime

Hello, I'm trying to create a custom serving endpoint, using the artifacts argument while logging the run/model to save .jar files. These files are called when calling .predict. A Java runtime 8 or higher is required to run the jar file; not sure how to ...

Machine Learning
ML
mlflow
model
python
GKH
by New Contributor II
  • 2495 Views
  • 1 reply
  • 1 kudos

Errors using Dolly Deployed as a REST API

We have deployed Dolly (https://huggingface.co/databricks/dolly-v2-3b) as a REST API endpoint on our infrastructure. The notebook we used to do this is included in the text below my question. The Databricks infra used had the following config - (13.2...

Latest Reply
marcelo2108
Contributor
  • 1 kudos

I had a similar problem when I used HuggingFacePipeline(pipeline=generate_text) with langchain. It worked for me when I used HuggingFaceHub instead. I used the same dolly-3b model.

yhyhy3
by New Contributor III
  • 1152 Views
  • 1 reply
  • 0 kudos

Foundation Model APIs HIPAA compliance

I saw that the Foundation Model API is not HIPAA compliant. Is there a timeline in which we could expect it to be HIPAA compliant? I work for a healthcare company with a BAA with Databricks.

Latest Reply
saikumar246
Databricks Employee
  • 0 kudos

Hi @yhyhy3, Foundation Model API's HIPAA certification: AWS ETA March 2024; Azure ETA August 2024. HIPAA certification is essentially having a third-party audit report for HIPAA. That is not the date that a HIPAA product offering may/will necessari...

prafull
by New Contributor
  • 1315 Views
  • 0 replies
  • 0 kudos

How to use mlflow to log a composite estimator (multiple pipes) and then deploy it as rest endpoint

Hello, I am trying to deploy a composite estimator as a single model, by logging the run with mlflow and registering the model. Can anyone help with how this can be done? This estimator contains different chains - text: data - tfidf - svm - svm.decision_funct...

Machine Learning
ML
mlflow
model
python
tessaickx
by New Contributor III
  • 3258 Views
  • 2 replies
  • 0 kudos

Serving endpoints: model server failed to load the model: the file bash was not found: unknown

While trying to create a serving endpoint with my custom model, I get a "Failed" state: Model server failed to load the model. Please see service logs for more information. The service logs show the following: Container failed with: failed to create con...

Latest Reply
ravi-malipeddi
New Contributor II
  • 0 kudos

I have faced a similar issue and still haven't found the right solution. In my case, below is the error trace I found in the service logs. Not sure where the issue could be: "An error occurred while loading the model. You haven't configured the CLI yet!...

1 more reply
YanivShani
by New Contributor
  • 2023 Views
  • 1 reply
  • 0 kudos

inference table not working

Hi, I'm trying to enable an inference table for my llama_2_7b_hf serving endpoint; however, I'm getting the following error: "Inference tables are currently not available with accelerated inference." Anyone have an idea how to overcome this issue? C...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

From the information you provided, it seems like you are trying to enable inference tables for an existing endpoint. However, the error message suggests that this feature may not be supported with accelerated inference. If you have previously disabled...
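For context, inference tables are enabled through the endpoint's configuration rather than per request. A sketch of the relevant config payload (field names follow the serving-endpoints REST API's `auto_capture_config` block as I understand it; the entity, catalog, and schema names are placeholders, and as the error above indicates, accelerated inference may still reject the feature regardless of the config):

```python
# Sketch: config block used to enable inference tables when creating
# or updating a serving endpoint via the REST API
# (PUT /api/2.0/serving-endpoints/{name}/config). All names below are
# placeholders; the auto_capture_config field names are assumptions
# based on the serving-endpoints API.

endpoint_config = {
    "served_entities": [
        {
            "entity_name": "main.models.llama_2_7b_hf",  # placeholder
            "entity_version": "1",
            "workload_size": "Small",
        }
    ],
    "auto_capture_config": {
        "catalog_name": "main",            # Unity Catalog catalog (placeholder)
        "schema_name": "inference_logs",   # schema that receives the table
        "table_name_prefix": "llama_2_7b_hf",
    },
}
```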

MaKarenina
by New Contributor
  • 1446 Views
  • 0 replies
  • 0 kudos

ML Flow until January 24

Hi! When I was creating a new endpoint I got this alert: "CREATE A MODEL SERVING ENDPOINT TO SERVE YOUR MODEL BEHIND A REST API INTERFACE. YOU CAN STILL USE LEGACY MLFLOW MODEL SERVING UNTIL JANUARY 2024." I don't understand if my legacy MLflow Model ...
