Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Forum Posts

antonionuzzo
by New Contributor
  • 142 Views
  • 1 reply
  • 0 kudos

Resolved! Exploring Serverless Features in Databricks for ML Use Cases

Hello, I need to develop some ML use cases. I would like to understand whether the serverless functionality unlocks any additional features or whether it is mandatory for certain capabilities. Thank you!

Machine Learning
machine learning
serverless
Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

Serverless functionality in Databricks is not mandatory for utilizing machine learning (ML) capabilities. However, it does unlock specific benefits and features that can enhance certain workflows. Here’s how serverless compute can add value, based on...

moseb
by New Contributor
  • 84 Views
  • 0 replies
  • 0 kudos

Problem with ipywidgets and plotly on Databricks

Hi everyone, I am encountering a problem when using ipywidgets with plotly on Databricks. I am trying to pass interactive arguments to a function and then plot with plotly. When I do the following: `def f(m, b): plt.figure(2); x = np.linspace(-10,...`
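For readers landing on this unanswered post, here is a minimal sketch of the pattern it appears to describe: driving a Plotly figure from ipywidgets controls. The function body and slider ranges are assumptions, since the original code is truncated and mixes in matplotlib calls.

```python
import numpy as np
import plotly.graph_objects as go
from ipywidgets import interact

def f(m, b):
    # Plot y = m*x + b with Plotly (assumed intent of the truncated snippet).
    x = np.linspace(-10, 10, num=200)
    fig = go.Figure(go.Scatter(x=x, y=m * x + b, mode="lines"))
    fig.show()

# Slider ranges are illustrative only.
interact(f, m=(-2.0, 2.0), b=(-3.0, 3.0, 0.5))
```

Rendering behavior can differ across Databricks runtimes, so treat this only as a starting point for reproducing the issue.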

hawa
by New Contributor II
  • 1755 Views
  • 3 replies
  • 2 kudos

Problem serving a langchain model on Databricks

Hi, I've encountered a problem serving a langchain model I just created successfully on Databricks. I was using the following code to set up a model in Unity Catalog: `from mlflow.models import infer_signature; import mlflow; import langchain; mlflow.set_r...`

Latest Reply
hawa
New Contributor II
  • 2 kudos

I suspect the issue is coming from this small error I got: "Must specify a chain Type in config." I used chain_type="stuff" when building the langchain, but I'm not sure how to fix it.
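For readers hitting the same message, a minimal sketch of making the chain type explicit when constructing the chain; the `FakeListLLM` is a stand-in, not part of the original thread, and import paths vary across LangChain versions (newer releases move these modules into `langchain_community`).

```python
from langchain.chains.question_answering import load_qa_chain
from langchain.llms.fake import FakeListLLM

# Stand-in LLM so the sketch runs without credentials (assumption, not the poster's model).
llm = FakeListLLM(responses=["stub answer"])

# Passing chain_type explicitly ("stuff", "map_reduce", "refine", or "map_rerank")
# ensures the combine-documents chain carries a chain type in its serialized config,
# which is what the "Must specify a chain Type in config" error refers to.
chain = load_qa_chain(llm, chain_type="stuff")
```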

2 More Replies
Mado
by Valued Contributor II
  • 5300 Views
  • 1 reply
  • 4 kudos

Error when reading Excel file: "org.apache.poi.ooxml.POIXMLException: Strict OOXML isn't currently supported, please see bug #57699"

Hi, I want to read an Excel "xlsx" file. The Excel file has several sheets and a multi-row header. The original file format was "xlsm" and I changed the extension to "xlsx". I tried the following code: `filepath_xlsx = "dbfs:/FileStore/Sample_Excel/data.xl...`

Latest Reply
Eag_le
New Contributor II
  • 4 kudos

Copying the data into a new file solved my issue. The problem was likely related to the file's metadata.
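A sketch of that workaround, assuming the original read used the spark-excel (`com.crealytics.spark.excel`) data source, which is where POI errors typically come from, and that the library is installed on the cluster; the sheet address and file path are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Re-save the workbook from Excel as a regular .xlsx (not "Strict Open XML Spreadsheet"),
# upload that copy, and read it instead of the renamed .xlsm file.
df = (
    spark.read.format("com.crealytics.spark.excel")
    .option("dataAddress", "'Sheet1'!A1")  # placeholder sheet/range
    .option("header", "true")
    .load("dbfs:/FileStore/Sample_Excel/data_resaved.xlsx")  # placeholder path
)
df.show()
```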

dcunningham1
by New Contributor III
  • 2453 Views
  • 6 replies
  • 4 kudos

Possible to use `params` argument of `mlflow.pyfunc.PythonModel` deployed to Databricks endpoint?

I'm deploying a custom model using the `mlflow.pyfunc.PythonModel` class as described here. My model uses the `params` argument in the `predict` method to allow the user to choose some aspects of the model at inference time. For example: `class CustomM...`

Latest Reply
KAdamatzky
New Contributor III
  • 4 kudos

That's great to hear, @umesh_gattem! Are you able to provide some example code for your predict function? I have tried specifying the params as part of the data_json request, but they are not being recognised by the model when it is called this way. Fo...
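For reference, a minimal sketch of a `PythonModel` whose `predict` accepts `params`, and the shape of a scoring request that passes them. The parameter names and values are illustrative assumptions, and `params` must also be declared in the model signature when the model is logged (recent MLflow versions).

```python
import mlflow


class CustomModel(mlflow.pyfunc.PythonModel):
    def predict(self, context, model_input, params=None):
        # Read an inference-time option from params, with a default.
        scale = (params or {}).get("scale", 1.0)
        return [x * scale for x in model_input["value"]]


# A scoring request can carry params at the top level of the JSON payload, e.g.:
# {
#   "dataframe_split": {"columns": ["value"], "data": [[1.0], [2.0]]},
#   "params": {"scale": 2.0}
# }
```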

5 More Replies
imgaboy
by New Contributor III
  • 13867 Views
  • 9 replies
  • 6 kudos

Spark with LSTM

I am still lost on combining Spark and deep learning models. If I have a (2D) time series that I want to use for, e.g., an LSTM model, I first convert it to a 3D array and then pass it to the model. This is normally done in memory with NumPy. But what hap...

Latest Reply
JohnyBe
New Contributor II
  • 6 kudos

Same problem as @imgaboy here. Was the solution to save our inputs into a table after formatting them so they are ready to feed the LSTM, and then turn 2D into 3D via a data generator?
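As a sketch of the 2D-to-3D step discussed in this thread (independent of how the 2D data is stored or loaded), here is a simple Keras `Sequence` that slices a 2D (timesteps, features) array into 3D (batch, window, features) windows; the window and batch sizes are arbitrary.

```python
import numpy as np
from tensorflow import keras


class WindowGenerator(keras.utils.Sequence):
    """Yield (batch, window, features) slices from a 2D (time, features) array."""

    def __init__(self, data_2d, window=20, batch_size=32):
        self.data = data_2d
        self.window = window
        self.batch_size = batch_size
        self.starts = np.arange(len(data_2d) - window)

    def __len__(self):
        return int(np.ceil(len(self.starts) / self.batch_size))

    def __getitem__(self, idx):
        batch_starts = self.starts[idx * self.batch_size:(idx + 1) * self.batch_size]
        x = np.stack([self.data[s:s + self.window] for s in batch_starts])
        y = np.stack([self.data[s + self.window] for s in batch_starts])
        return x, y


# Example: 1,000 timesteps with 4 features, fed to an LSTM in batches
# without materializing the full 3D array up front.
gen = WindowGenerator(np.random.rand(1000, 4), window=20, batch_size=32)
x_batch, y_batch = gen[0]
print(x_batch.shape, y_batch.shape)  # (32, 20, 4) (32, 4)
```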

8 More Replies
dkxxx-rc
by Contributor
  • 1740 Views
  • 6 replies
  • 3 kudos

Resolved! Nested runs don't group correctly in MLflow

How do I get MLflow child runs to appear as children of their parent run in the MLflow GUI, if I'm choosing my own experiment location instead of letting everything be written to the default experiment location? If I run the standard tutorial (https:/...

[Attached screenshot: dkxxxrc_0-1736289524445.png]
Latest Reply
dkxxx-rc
Contributor
  • 3 kudos

OK, here's more info about what's wrong, and a solution. I used additional parameter logging to determine that no matter how I adjust the parameters of the inner call to `mlflow.start_run()`, the `experiment_id` parameter of the child runs differs f...
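In other words, the children group under the parent when both runs are created in the same experiment. A minimal sketch of that idea, passing the experiment explicitly to both `mlflow.start_run()` calls; the experiment path is hypothetical.

```python
import mlflow

# Hypothetical workspace path; replace with your own experiment location.
experiment = mlflow.set_experiment("/Users/someone@example.com/my-experiment")
exp_id = experiment.experiment_id

with mlflow.start_run(run_name="parent", experiment_id=exp_id):
    mlflow.log_param("stage", "parent")
    with mlflow.start_run(run_name="child", nested=True, experiment_id=exp_id):
        mlflow.log_param("stage", "child")
```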

5 More Replies
sagarb
by New Contributor II
  • 659 Views
  • 2 replies
  • 0 kudos

GitHub Actions workflow cannot find the Databricks Unity Catalog and its tables

Context: Running the train_model_py.py file stored in Databricks through GitHub Actions. The notebook reads the Unity Catalog tables for pre-processing and works fine when run through the Databricks UI. However, it gives an error when run through Git...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @sagarb, It sounds like a permission issue or setup issue... what is the error you are hitting?

1 More Replies
MLOperator
by New Contributor II
  • 1203 Views
  • 1 reply
  • 0 kudos

Resolved! Custom model serving using Databricks Asset Bundles

I am using MLflow to register a custom model (Python model) in Unity Catalog, and a Databricks Asset Bundle to create a serving endpoint for that custom model. I was able to create the serving endpoint using DABs, but I want to deploy the model by using ...

Latest Reply
koji_kawamura
Databricks Employee
  • 0 kudos

Hi @MLOperator, since model_serving_endpoints only accepts a version number for a served entity, I think that is not possible. However, the get-by-alias model version API can be used to retrieve a version number from a model alias name. Then the model name...
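A sketch of that lookup with the MLflow client, assuming a Unity Catalog model name and an alias such as `champion` (both placeholders); the returned version number can then be plugged into the bundle's `model_serving_endpoints` definition.

```python
from mlflow.tracking import MlflowClient

# Point the client at the Unity Catalog model registry.
client = MlflowClient(registry_uri="databricks-uc")

# Placeholder three-level model name and alias.
mv = client.get_model_version_by_alias("main.ml_schema.my_custom_model", "champion")
print(mv.version)  # version number to use for the served entity
```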

javeed
by New Contributor
  • 1328 Views
  • 1 reply
  • 0 kudos

Convert a TensorFlow dataset to NumPy tuples

Hello everyone, here is the sequence of steps I have followed: 1. I have used Petastorm to convert the Spark DataFrame to a tf.dataset: `import numpy as np; # Read the Petastorm dataset and convert it to a TensorFlow Dataset; with converter.make_tf_dataset() as...`

Latest Reply
Ismael-K
Databricks Employee
  • 0 kudos

The error occurs because make_tf_dataset() returns an inferred_schema_view object, which is a Petastorm wrapper representing the dataset schema. This object does not have a .numpy() attribute, so calling batch.numpy() will throw the AttributeError.  ...
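Building on that explanation, a sketch of converting individual fields of each batch to NumPy, assuming `converter` is the `make_spark_converter(...)` object from the earlier step and that the dataset has columns named `features` and `label` (both assumptions).

```python
# converter = make_spark_converter(spark_df)  # from the earlier step in the thread
with converter.make_tf_dataset(batch_size=64) as tf_dataset:
    for batch in tf_dataset.take(1):
        # Each batch is a namedtuple of tensors; convert fields individually.
        features = batch.features.numpy()  # assumed column name
        labels = batch.label.numpy()       # assumed column name
        print(features.shape, labels.shape)
```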

cmd0160
by New Contributor
  • 591 Views
  • 1 reply
  • 0 kudos

Interactive EDA task in a Job Workflow

I am trying to configure an interactive EDA task as part of a job workflow. I'd like to be able to trigger a workflow, perform some basic analysis, and then proceed to a subsequent task. I haven't had any success freezing execution. Also, the job workflow...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @cmd0160, Freezing job execution to perform interactive tasks directly within a job workflow is not natively supported in Databricks. The job workflow UI and the notebook UI serve different purposes, and the interactive capabilities you find in...

