Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.
Data + AI Summit 2024 - Data Science & Machine Learning

Forum Posts

by roman_belkin, New Contributor II
  • 234 Views
  • 2 replies
  • 0 kudos

Gemini through Mosaic Gateway

I am trying to configure the Gemini Vertex API in Databricks. In simple Python code, everything works fine, which indicates that I have correctly set up the API and credentials. Error message: {"error_code":"INVALID_PARAMETER_VALUE","message":"INVALI...

Latest Reply
roman_belkin
New Contributor II
  • 0 kudos

No, it seems they gave up 

1 More Replies
by yopbibo, Contributor II
  • 1974 Views
  • 3 replies
  • 5 kudos

Deploy an ML model, trained and registered in Databricks, to AKS

Hi, I can train and register an ML model in my Databricks workspace. Then, to deploy it on AKS, I need to register the model in Azure ML and then deploy to AKS. Is it possible to skip the Azure ML step? I would like to deploy directly into my AKS instance...

Latest Reply
sidharthpradhan
New Contributor
  • 5 kudos

Is this still the case? Can't we serve the model in Databricks? I am new to this, so I am just wondering about the capabilities.

2 More Replies
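For readers with the same question: Databricks Model Serving can host a Unity Catalog–registered model directly, without the Azure ML step. A minimal sketch using the MLflow deployments client; the endpoint name, model name, and version below are placeholders:

```python
# Minimal sketch: serve a Unity Catalog-registered model with Databricks Model Serving.
# Endpoint name, model name, version, and workload size are placeholders to adapt.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")

client.create_endpoint(
    name="my-model-endpoint",  # hypothetical endpoint name
    config={
        "served_entities": [
            {
                "entity_name": "main.default.my_model",  # placeholder UC model name
                "entity_version": "1",
                "workload_size": "Small",
                "scale_to_zero_enabled": True,
            }
        ]
    },
)
```

The endpoint then exposes a REST API, so services running on AKS (or anywhere else) can call the model over HTTPS without hosting the container themselves.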
by damselfly20, New Contributor III
  • 118 Views
  • 1 replies
  • 0 kudos

Resolved! Serving Endpoint: Container Image Creation Fails

For my RAG use case, I've registered my langchain chain as a model to Unity Catalog. When I'm trying to serve the model, container image creation fails with the following error in the build log:[...] #16 178.1 Downloading langchain_core-0.3.17-py3-no...

Latest Reply
damselfly20
New Contributor III
  • 0 kudos

I was able to solve the problem by adding python-snappy==0.7.3 to the requirements.

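For anyone hitting the same build failure: one way to make that fix stick is to declare the extra dependency when logging the model, so the serving container installs it during the image build. A minimal sketch, assuming `chain` is the already-built LangChain chain and the Unity Catalog name is a placeholder:

```python
import mlflow

# `chain` is assumed to be the LangChain chain already built for the RAG use case.
with mlflow.start_run():
    mlflow.langchain.log_model(
        lc_model=chain,
        artifact_path="model",
        registered_model_name="main.default.rag_chain",   # placeholder UC name
        extra_pip_requirements=["python-snappy==0.7.3"],  # the fix from this thread
    )
```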
by damselfly20, New Contributor III
  • 90 Views
  • 2 replies
  • 1 kudos

Endpoint creation without scale-to-zero

Hi, I've got a question about deploying an endpoint for Llama 3.1 8b. The following code should create the endpoint without scale-to-zero. The endpoint is being created, but with scale-to-zero, although scale_to_zero_enabled is set to False. Instead ...

Latest Reply
damselfly20
New Contributor III
  • 1 kudos

Thanks for the reply @Walter_C. This didn't quite work, since it used a CPU and didn't consider the max_provisioned_throughput, but I finally got it to work like this: from mlflow.deployments import get_deploy_client client = get_deploy_client("data...

1 More Replies
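A rough sketch of the working approach described in the reply, using the MLflow deployments client with scale-to-zero disabled and a provisioned-throughput ceiling; the entity name, version, and throughput value are placeholders to check against the chunk sizes your model supports:

```python
# Hedged sketch: create the endpoint with scale-to-zero disabled and a
# max provisioned throughput, as described in the reply above.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")

client.create_endpoint(
    name="llama-3-1-8b",  # hypothetical endpoint name
    config={
        "served_entities": [
            {
                "entity_name": "system.ai.meta_llama_v3_1_8b_instruct",  # placeholder
                "entity_version": "1",
                "max_provisioned_throughput": 9500,   # placeholder value
                "scale_to_zero_enabled": False,
            }
        ]
    },
)
```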
by xgbeast, New Contributor
  • 619 Views
  • 1 replies
  • 0 kudos

What's the recommended way to scale XGBoost/LGBM to datasets that don't fit in memory?

I'm looking to scale xgboost to large datasets which won't fit in memory on a single large EC2 instance (billions to tens of billions of rows scale). I also require many of the bells & whistles of regular in-memory xgboost/lightgbm, including: Th...

Latest Reply
michaelthwan
New Contributor
  • 0 kudos

Very insightful write-up, thanks. I wish somebody experienced in large-scale xgboost/lightgbm usage would share more. I've encountered a similar problem.

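One commonly recommended route on Databricks is the Spark estimator bundled with xgboost 1.7+, which distributes training across the cluster so the dataset never has to fit in a single node's memory. A minimal sketch with placeholder table and column names; `spark` is the notebook's ambient session:

```python
# Hedged sketch: distributed XGBoost training with the xgboost.spark estimator,
# so the data stays in Spark instead of being collected onto one machine.
from pyspark.ml.feature import VectorAssembler
from xgboost.spark import SparkXGBRegressor

df = spark.table("main.default.training_data")  # placeholder table with f1, f2, f3, label
assembler = VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features")
train_df = assembler.transform(df)

regressor = SparkXGBRegressor(
    features_col="features",
    label_col="label",
    num_workers=8,        # spread training across 8 Spark tasks
    tree_method="hist",
)
model = regressor.fit(train_df)
```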
by NielsMH, New Contributor III
  • 148 Views
  • 1 replies
  • 0 kudos

spark_session invocation from executor side error when using SparkXGBRegressor and FE client

Hi, I have created a model and pipeline using xgboost.spark's SparkXGBRegressor and pyspark.ml's Pipeline instance. However, I run into "RuntimeError: _get_spark_session should not be invoked from executor side." when I try to save the predictions i...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

The error you're encountering is due to attempting to access the Spark session on the executor side, which is not allowed in Spark's distributed computing model. This typically happens when trying to use Spark-specific functionality within a UDF or d...

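To make the driver/executor split concrete, here is a hedged illustration of the pattern the reply describes; this is not the poster's exact code, and the table and column names are placeholders:

```python
# Keep SparkSession-dependent calls (reading tables, fitting, transforming, writing)
# on the driver and let Spark distribute the work itself. `spark` is the notebook session.
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from xgboost.spark import SparkXGBRegressor

df = spark.table("main.default.features")   # placeholder table with f1, f2, label

pipeline = Pipeline(stages=[
    VectorAssembler(inputCols=["f1", "f2"], outputCol="features"),
    SparkXGBRegressor(features_col="features", label_col="label"),
])
model = pipeline.fit(df)

# Driver-side: transform() and the DataFrame writer go through the SparkSession correctly.
predictions = model.transform(df)
predictions.write.mode("overwrite").saveAsTable("main.default.predictions")  # placeholder

# The RuntimeError typically appears when session-dependent calls (spark.table,
# spark.createDataFrame, feature-store writes, ...) run inside a UDF or
# rdd.foreach/mapPartitions, because that code executes on executors, not the driver.
```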
by cmilligan, Contributor II
  • 5366 Views
  • 5 replies
  • 2 kudos

Issue: "Multi-column In predicates are not supported in the DELETE condition"

I'm trying to delete rows from a table with the same date or id as records in another table. I'm using the below query and get the error 'Multi-column In predicates are not supported in the DELETE condition'. delete from cost_model.cm_dispatch_consol...

Latest Reply
thisisthemurph
New Contributor
  • 2 kudos

I seem to get this error on some DeltaTables and not others:df.createOrReplaceTempView("channels_to_delete") spark.sql(""" delete from lake.something.earnings where TenantId = :tenantId and ChannelId = in ( select ChannelId ...

4 More Replies
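A common workaround for this limitation is to rewrite the multi-column IN as a correlated EXISTS (or as a MERGE ... WHEN MATCHED THEN DELETE), which the DELETE condition does accept. A hedged sketch with placeholder table and column names:

```python
# Hedged sketch: delete rows that match another table on two columns using EXISTS,
# instead of the unsupported multi-column IN predicate. Names are placeholders.
spark.sql("""
    DELETE FROM main.default.cm_dispatch_consolidated AS t
    WHERE EXISTS (
        SELECT 1
        FROM main.default.records_to_remove AS r
        WHERE r.dispatch_date = t.dispatch_date
          AND r.dispatch_id   = t.dispatch_id
    )
""")
```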
by amirA, New Contributor II
  • 843 Views
  • 3 replies
  • 1 kudos

Resolved! Extracting Topics From Text Data Using PySpark

Hi everyone, I tried to follow the same steps in Topic from Text on similar data as an example. However, when I try to fit the model with the data I get this error: IllegalArgumentException: requirement failed: Column features must be of type equal to one of t...

Latest Reply
filipniziol
Contributor
  • 1 kudos

Hi @amirA, the LDA model expects the features column to be of type Vector from the pyspark.ml.linalg module, specifically either a SparseVector or DenseVector, whereas you have provided a Row type. You need to convert your Row object to a SparseVector. Che...

2 More Replies
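To illustrate the reply, here is a minimal sketch in which CountVectorizer produces the SparseVector-typed features column that LDA expects; the toy data and column names are placeholders:

```python
# Hedged sketch: build a Vector-typed `features` column for LDA with CountVectorizer.
from pyspark.ml.feature import CountVectorizer
from pyspark.ml.clustering import LDA

docs = spark.createDataFrame(
    [(0, ["spark", "ml", "topic"]), (1, ["delta", "lake", "table"])],
    ["id", "tokens"],
)

# CountVectorizer emits a SparseVector, which is the column type LDA requires.
cv_model = CountVectorizer(inputCol="tokens", outputCol="features", vocabSize=1000).fit(docs)
vectorized = cv_model.transform(docs)

lda_model = LDA(k=2, maxIter=10, featuresCol="features").fit(vectorized)
```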
by ukaplan, New Contributor III
  • 2497 Views
  • 11 replies
  • 1 kudos

Serving Endpoint Container Image Creation Fails

Hello, I trained a model using MLflow and saved the model as an artifact. I can load the model from a notebook and it works as expected (i.e. I can load the model using its URI). However, when I want to deploy it using Databricks endpoints, container...

Latest Reply
damselfly20
New Contributor III
  • 1 kudos

@ivan_calvo The problem still exists. Surely there has to be some other option than downgrading the ML cluster to DBR 14.3 LTS ML?

10 More Replies
by Swappatil2506, New Contributor II
  • 192 Views
  • 2 replies
  • 0 kudos

I want to develop an automated lead allocation system to prospect sales representatives.

I want to develop an automated lead allocation system that assigns prospects to sales representatives. Please suggest a suitable solution, along with any relevant links if available.

Latest Reply
Swappatil2506
New Contributor II
  • 0 kudos

Hi jamesl, my use case is about matching a prospective sales agent to a customer entering a retail store: when a customer enters the store, based on the inputs provided and on whether the customer is existing or new, I want to create a rea...

1 More Replies
by llmnerd, New Contributor
  • 131 Views
  • 0 replies
  • 0 kudos

UDF LLM Databricks pickle error

Hi there, I am trying to parallelize a text extraction via the Databricks foundation model. Any pointers to suggestions or examples are welcome. The code and error are below. model = "databricks-meta-llama-3-1-70b-instruct" temperature=0.0 max_tokens=1024 sch...

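Pickle errors in cases like this usually come from capturing an unpicklable client object in the UDF closure. One hedged workaround is to construct the deployments client inside a pandas UDF so only plain strings are serialized; the response shape and the availability of workspace credentials on the executors are assumptions to verify for your setup:

```python
# Hedged sketch: build the client inside the pandas UDF so nothing unpicklable
# is captured in the closure. The endpoint name comes from the post above.
import pandas as pd
from pyspark.sql.functions import pandas_udf
from pyspark.sql.types import StringType

@pandas_udf(StringType())
def extract_text(prompts: pd.Series) -> pd.Series:
    # Import and construct the client on the executor, not on the driver.
    # Assumes Databricks credentials are available to executor processes.
    from mlflow.deployments import get_deploy_client
    client = get_deploy_client("databricks")

    def call_llm(prompt: str) -> str:
        response = client.predict(
            endpoint="databricks-meta-llama-3-1-70b-instruct",
            inputs={
                "messages": [{"role": "user", "content": prompt}],
                "temperature": 0.0,
                "max_tokens": 1024,
            },
        )
        # Assumes a chat-completions-style response shape.
        return response["choices"][0]["message"]["content"]

    return prompts.apply(call_llm)

df = spark.createDataFrame([("Summarize: Databricks is a data platform.",)], ["prompt"])
df.withColumn("extracted", extract_text("prompt")).show(truncate=False)
```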
by kishan_, New Contributor II
  • 245 Views
  • 2 replies
  • 1 kudos

Facing issues with passing memory checkpointer in langgraph agents

Hi, I am trying to create a simple langgraph agent in Databricks. The agent also uses a langgraph memory checkpointer, which enables storing the state of the graph. This is working fine when I try it in a Databricks notebook, but when I tried to log t...

Machine Learning
langgraph
mlflow
Latest Reply
morenoj11
New Contributor II
  • 1 kudos

I saw that you can compile the model without the checkpointer, register it in MLflow, and then, after loading, assign the checkpointer after compilation.```import mlflow mlflow.models.set_model(build_graph())with mlflow.start_run() as run_id:model_info = mlflow.langch...

1 More Replies
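A hedged sketch of that workaround, following the model-from-code pattern in the reply and re-attaching an in-memory checkpointer after loading; the file name, module name, and the `checkpointer` attribute usage are assumptions to verify against your langgraph/MLflow versions:

```python
# Hedged sketch of the reply's approach.
#
# --- agent.py (logged as code) ------------------------------------------------
#   import mlflow
#   from my_agent import build_graph            # hypothetical module with the graph builder
#   mlflow.models.set_model(build_graph().compile())   # compiled WITHOUT a checkpointer
# ------------------------------------------------------------------------------

import mlflow
from langgraph.checkpoint.memory import MemorySaver

with mlflow.start_run():
    model_info = mlflow.langchain.log_model(lc_model="agent.py", artifact_path="agent")

loaded = mlflow.langchain.load_model(model_info.model_uri)

# Re-attach the in-memory checkpointer after loading; assumes the loaded object is
# the compiled graph and exposes a `checkpointer` attribute.
loaded.checkpointer = MemorySaver()
```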
by hawa, New Contributor II
  • 136 Views
  • 1 replies
  • 0 kudos

Problem serving a langchain model on Databricks

Hi, I've encountered a problem serving a langchain model I just created successfully on Databricks. I was using the following code to set up a model in Unity Catalog: from mlflow.models import infer_signature import mlflow import langchain mlflow.set_r...

Latest Reply
hawa
New Contributor II
  • 0 kudos

I suspect the issue is coming from this small error I got: "Got error: Must specify a chain Type in config." I used chain_type="stuff" when building the langchain, but I'm not sure how to fix it.

by Steven_Roy, New Contributor II
  • 226 Views
  • 1 replies
  • 0 kudos

Just Passed Databricks-Machine-Learning-Professional exam

Hi guys, I have the ML Professional exam scheduled for later this month, and while I can find many resources, practice exams, and posts related to the ML Associate exam, I'm having trouble finding the same for the Professional exam. Anyone happen to hav...

Latest Reply
jinn
New Contributor II
  • 0 kudos

I got 90% on the Databricks-Machine-Learning-Professional exam. I am pleased with my results and thankful to this site; it provides premium-quality service and has all the resources available.

by javeed, New Contributor
  • 132 Views
  • 0 replies
  • 0 kudos

Convert the TensorFlow dataset to numpy tuples

Hello everyone, here is the sequence of steps I have followed: 1. I have used petastorm to convert the Spark dataframe to a tf.dataset: import numpy as np # Read the Petastorm dataset and convert it to TensorFlow Dataset with converter.make_tf_dataset() as...

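For the conversion itself, a minimal sketch assuming `converter` is the petastorm SparkDatasetConverter from the post; the batch size and field names are placeholders for your columns:

```python
# Hedged sketch: turn batches from the tf.data.Dataset into plain NumPy tuples.
# num_epochs=1 keeps the dataset finite so the comprehension terminates.
with converter.make_tf_dataset(batch_size=256, num_epochs=1) as dataset:
    numpy_tuples = [
        (batch.features, batch.label)   # placeholder field names; already NumPy arrays here
        for batch in dataset.as_numpy_iterator()
    ]
```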
