Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Forum Posts

mbejarano89
by New Contributor III
  • 3780 Views
  • 6 replies
  • 1 kudos

Run a Databricks notebook from another notebook with ipywidget

I am trying to run a notebook from another notebook using dbutils.notebook.run, as follows: import ipywidgets as widgets; from ipywidgets import interact; from ipywidgets import Box; button = widgets.Button(description='Run model'); out = widgets.Output()...

Latest Reply
Sreekanth_N
New Contributor II
  • 1 kudos

As far as I can see, the PySpark stream does not support this setContext; ideally there should be an alternative approach. Please suggest an approach where a PySpark stream internally calls another notebook in parallel.

5 More Replies
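For readers landing on this thread: a minimal sketch of the pattern the original post describes, wiring an ipywidgets button to dbutils.notebook.run. The child-notebook path `/Shared/train_model` and the 600-second timeout are placeholders, and `dbutils` only exists inside a Databricks notebook, so the handler takes the run function as an argument.

```python
def make_run_callback(run_fn, notebook_path, timeout_s=600, params=None):
    """Return a click handler that launches the child notebook.
    On Databricks, pass dbutils.notebook.run as run_fn."""
    def on_click(_button=None):
        return run_fn(notebook_path, timeout_s, params or {})
    return on_click

def build_run_button(run_fn, notebook_path="/Shared/train_model"):
    # Imported lazily so make_run_callback stays testable anywhere.
    import ipywidgets as widgets
    button = widgets.Button(description="Run model")
    out = widgets.Output()
    button.on_click(make_run_callback(run_fn, notebook_path))
    return button, out
```

In a Databricks cell you would call `build_run_button(dbutils.notebook.run)` and display the returned widgets.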
AmanJain1008
by New Contributor
  • 1474 Views
  • 1 replies
  • 0 kudos

Mlflow Error in Databricks notebooks

Getting this error in the experiments tab of a Databricks notebook: "There was an error loading the runs. The experiment resource may no longer exist or you no longer have permission to access it." Here is the code I am using: mlflow.tensorflow.autolog() with m...

Latest Reply
Kumaran
Databricks Employee
  • 0 kudos

Hi @AmanJain1008, Thank you for posting your question in the Databricks Community. Could you kindly check whether you are able to reproduce the issue with the code examples below: # Import Libraries import pandas as pd import numpy as np import mlflow ...

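The reply's code is truncated above; here is a compact reconstruction of the autologging pattern it describes. The experiment path helper and the `/Users/...` convention are illustrative, not from the thread; setting the experiment explicitly is a common way to avoid the "experiment resource may no longer exist" error when the notebook's default experiment was deleted.

```python
def experiment_path(user_email, name):
    """Per-user experiment path convention on Databricks (illustrative)."""
    return f"/Users/{user_email}/{name}"

def train_with_autolog(build_model, x, y, experiment, epochs=2):
    """Train a Keras model under mlflow.tensorflow.autolog().
    Requires mlflow and tensorflow on the cluster."""
    import mlflow
    import mlflow.tensorflow
    mlflow.set_experiment(experiment)  # pin the experiment before logging
    mlflow.tensorflow.autolog()
    with mlflow.start_run() as run:
        build_model().fit(x, y, epochs=epochs, verbose=0)
        return run.info.run_id
```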
JefferyReichman
by New Contributor III
  • 4858 Views
  • 4 replies
  • 2 kudos

Resolved! How to load data using Sparklyr

New to Databricks and an R user, I am trying to figure out how to load a Hive table via sparklyr. The path to the file is https://databricks.xxx.xx.gov/#table/xxx_mydata/mydata_etl (from right-clicking on the table). I tried data_tbl <- tb...

Latest Reply
Kumaran
Databricks Employee
  • 2 kudos

Hi @JefferyReichman, Not sure that I completely understood your last question about "where I can read up on this for getting started". However, you can start by running this code in a Databricks Community Edition notebook. For more details: Link

3 More Replies
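Since the thread's code is truncated, a sketch of the key idea, shown in PySpark to match the rest of this page (the sparklyr analog is in the comment): the `#table/...` UI link is not a file path — the table is addressed by schema and name.

```python
def qualified_name(database, table):
    # The '#table/xxx_mydata/mydata_etl' fragment of the UI link maps to
    # database 'xxx_mydata' and table 'mydata_etl' -- read the table by
    # name, not by the https:// URL.
    return f"{database}.{table}"

def load_table(spark, database, table):
    # PySpark equivalent of the sparklyr call:
    #   data_tbl <- dplyr::tbl(sc, dbplyr::in_schema("xxx_mydata", "mydata_etl"))
    return spark.table(qualified_name(database, table))
```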
shan_chandra
by Databricks Employee
  • 6171 Views
  • 1 replies
  • 1 kudos

Resolved! Importing TensorFlow is giving an error when running ML model

Error stack trace: TypeError: Descriptors cannot not be created directly. If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0. If you cannot immediately regenerate your protos, some o...

Latest Reply
shan_chandra
Databricks Employee
  • 1 kudos

Please find the resolution below: install a protobuf version from the 3.20 line on the cluster, i.e. pin protobuf==3.20.1 in the cluster libraries. Reference: https://github.com/tensorflow/tensorflow/issues/60320

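A small helper to sanity-check whether a cluster needs the pin before reinstalling. The `(3, 20)` floor is an assumption based on the fix in this thread, not an official compatibility matrix.

```python
def protobuf_needs_pin(version, floor=(3, 20)):
    """True if the installed protobuf predates the 3.20 line
    that resolves the TensorFlow descriptor error above."""
    major, minor, *_ = (int(part) for part in version.split("."))
    return (major, minor) < floor

# On the cluster you would check the installed copy, e.g.:
# import google.protobuf
# if protobuf_needs_pin(google.protobuf.__version__):
#     ...  # pin protobuf==3.20.1 in the cluster libraries UI
```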
Ariane
by New Contributor II
  • 1393 Views
  • 3 replies
  • 0 kudos

Databricks Feature stores

After exploring the feature store and how it works, I have some concerns: 1. With each data refresh there is a possibility that feature values change. Does the Databricks Feature Store allow altering the feature table in case the feature values have c...

Latest Reply
Kumaran
Databricks Employee
  • 0 kudos

Hello @Ariane, Could you check the same by downloading the ebook The Comprehensive Guide to Feature Stores here?

2 More Replies
imgaboy
by New Contributor III
  • 8928 Views
  • 8 replies
  • 5 kudos

Spark with LSTM

I am still lost on Spark and deep learning models. If I have a (2D) time series that I want to use for e.g. an LSTM model, I first convert it to a 3D array and then pass it to the model. This is normally done in memory with NumPy. But what hap...

Latest Reply
__paolo__
New Contributor II
  • 5 kudos

Hi! I guess you've already solved this issue (your question was posted more than a year ago), but maybe you could be interested in reading https://learn.microsoft.com/en-gb/azure/databricks/machine-learning/train-model/dl-best-practices There are s...

7 More Replies
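A pure-Python sketch of the 2D-to-3D windowing step the question describes (function and variable names are mine). In practice you would `np.stack` the result, or stream the windows with Petastorm/TFRecords once they no longer fit in driver memory, as the linked best-practices page discusses.

```python
def to_windows(series, window):
    """Turn a 2D time series (a list of per-timestep feature rows)
    into overlapping (window, features) samples -- the 3D layout
    an LSTM expects as (samples, timesteps, features)."""
    n = len(series) - window + 1
    if n <= 0:
        raise ValueError("series is shorter than the window")
    return [series[i:i + window] for i in range(n)]
```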
Himanshu_Rana
by New Contributor
  • 2178 Views
  • 1 replies
  • 0 kudos

AI

AI is trending today more than any other technology, and we know it can grow vast so that humans benefit from it, for example in EVs, smart homes, highly optimized PCs, and robotics, which is growing rapidly because of the boom in AI.

Latest Reply
JesseFlores
New Contributor II
  • 0 kudos

Yes, I agree with you.

vysakhthek
by New Contributor
  • 3101 Views
  • 2 replies
  • 0 kudos

Resolved! Inquiry About Free Voucher or 75% off Voucher Availability

I am interested in the Databricks Machine Learning Associate certification examination. Are there any ongoing event vouchers, discounts, or free voucher opportunities available for the Databricks Machine Learning Associate examination? I would greatly appreciate...

Machine Learning
Events
machine learning
voucher
Latest Reply
youssefmrini
Databricks Employee
  • 0 kudos

Indeed, you will have a 50% discount.

1 More Replies
manupmanoos
by New Contributor III
  • 5049 Views
  • 5 replies
  • 4 kudos

Resolved! How can I save a keras model from a python notebook in databricks to an s3 bucket?

I have a trained model in a Databricks Python notebook. How can I save it to an S3 bucket?

Latest Reply
Kumaran
Databricks Employee
  • 4 kudos

Hi @manupmanoos, Please check the code below on how to load the saved model back from the S3 bucket: import boto3 import os from keras.models import load_model # Set credentials and create S3 client aws_access_key_id = dbutils.secrets.get(scope="<scope...

4 More Replies
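The accepted reply shows the load direction; here is a hedged sketch of the save direction it implies. The key-naming helper and parameter names are mine; it assumes boto3 and Keras on the cluster, with credentials normally pulled from dbutils.secrets rather than hard-coded.

```python
def s3_key(prefix, model_name, suffix="keras"):
    """Object key for the saved model; the naming scheme is illustrative."""
    return f"{prefix.strip('/')}/{model_name}.{suffix}"

def save_model_to_s3(model, bucket, key, aws_access_key_id, aws_secret_access_key):
    """Save a Keras model to a local temp file, then upload it to S3
    (the inverse of the load shown in the reply above)."""
    import os
    import tempfile
    import boto3
    with tempfile.TemporaryDirectory() as tmp:
        local_path = os.path.join(tmp, os.path.basename(key))
        model.save(local_path)  # Keras writes the artifact locally first
        s3 = boto3.client(
            "s3",
            aws_access_key_id=aws_access_key_id,
            aws_secret_access_key=aws_secret_access_key,
        )
        s3.upload_file(local_path, bucket, key)
```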
jcapplefields88
by New Contributor II
  • 1913 Views
  • 1 replies
  • 1 kudos

Expose low latency APIs from Deltalake for mobile apps and microservices

My company is using Delta Lake to extract customer insights and run batch scoring with ML models. I need to expose this data to some microservices through gRPC and REST APIs. How can I do this? I'm thinking of building Spark pipelines to extract the data, stor...

Latest Reply
stefnhuy
New Contributor III
  • 1 kudos

Hey everyone! It's awesome that your company is utilizing Delta Lake for extracting customer insights and running batch scoring with ML models. I can totally relate to the excitement and challenges of dealing with data integration for microservices and...

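The pattern the question hints at (Spark job writes scored Delta records into a low-latency key-value store; the API serves reads from that store) can be sketched as below. All names are mine; `store` stands in for any mapping-style service such as a Redis client, so request latency never touches the lakehouse directly.

```python
def make_lookup(store):
    """Core of a REST/gRPC handler: the batch Spark pipeline populates
    `store` from Delta; the API only ever reads the precomputed record."""
    def get_insights(customer_id):
        record = store.get(customer_id)
        if record is None:
            return {"status": 404, "error": "unknown customer"}
        return {"status": 200, "data": record}
    return get_insights
```

Wrapping `get_insights` in a FastAPI route or a gRPC servicer method is then a thin layer over this function.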
aishashok
by New Contributor
  • 859 Views
  • 1 replies
  • 0 kudos

ML for personal use

Will I be able to use the new Lakehouse products, like IQ, for personal use, such as portfolios and websites?

Latest Reply
Kumaran
Databricks Employee
  • 0 kudos

Hi @aishashok, Thank you for posting your question in the Databricks community. Yes, Databricks' new Lakehouse products like Databricks SQL Analytics, SQL Runtime, and Delta Lake can be used for a variety of data engineering and analytics use cases, in...

Rajaniesh
by New Contributor III
  • 6308 Views
  • 2 replies
  • 3 kudos

Databricks assistant not enabling

Hi, I have gone through the Databricks Assistant article by Databricks: https://docs.databricks.com/notebooks/notebook-assistant-faq.html It clearly states: Q: How do I enable Databricks Assistant? An account administrator must enable Databricks Assis...

Latest Reply
Kumaran
Databricks Employee
  • 3 kudos

Hi @Rajaniesh, Databricks Assistant is now live. Please check the blog below for more details: More_details

1 More Replies
SOlivero
by New Contributor III
  • 3931 Views
  • 3 replies
  • 3 kudos

Load a pyfunc model logged with Feature Store

Hi, I'm using the Databricks Feature Store to register a custom model using a model wrapper as follows: # Log custom model to MLflow fs.log_model( artifact_path="model", model=production_model, flavor=mlflow.pyfunc, training_set=training_s...

Latest Reply
Kumaran
Databricks Employee
  • 3 kudos

Hi @SOlivero, Make sure that the model was in fact saved with the provided URI. The latest keyword will retrieve the latest version of the registered model when mlflow.pyfunc.load_model('models:/model_name/latest') is executed, not the highest version....

2 More Replies
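The model-registry URI scheme the reply refers to can be sketched like this. The `model_uri` helper is mine; the `models:/<name>/<version-or-stage>` format itself is standard MLflow.

```python
def model_uri(name, version="latest"):
    """Registry URI. Note: 'latest' resolves to the newest registered
    version, which is not necessarily the one staged as Production."""
    return f"models:/{name}/{version}"

def load_pyfunc(name, version="latest"):
    import mlflow.pyfunc  # requires mlflow on the cluster
    return mlflow.pyfunc.load_model(model_uri(name, version))
```

Passing a stage name (e.g. `"Production"`) instead of `"latest"` pins loads to the promoted version rather than the most recently registered one.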
EmirHodzic
by New Contributor II
  • 1780 Views
  • 2 replies
  • 3 kudos

Resolved! Hyperopt Ray integration

Hello, Is there a way to integrate Hyperopt with Ray parallelisation? I have a simulation framework which I want to optimise, and each simulation run is set up to be a Ray process; however, I am calling one simulation run in the objective function. Thi...

Latest Reply
Kumaran
Databricks Employee
  • 3 kudos

Hi @EmirHodzic, Thank you for posting your question in the Databricks community. You can use Ray Tune, a tuning library that integrates with Ray, to parallelize your Hyperopt trials across multiple nodes. Here's a link to the documentation for HyperOpt...

1 More Replies
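A sketch of the Ray Tune + HyperOpt combination the reply recommends, assuming `ray[tune]` and `hyperopt` are installed on the cluster. The quadratic objective is a stand-in for one simulation run; swap in your Ray-based simulation there.

```python
def simulation_loss(config):
    """Stand-in for a single simulation run; replace with your
    Ray-backed simulation. The quadratic is purely illustrative."""
    return (config["x"] - 2.0) ** 2

def tune_simulation(num_samples=20):
    # Lazy imports so the objective above stays testable without Ray.
    from ray import tune
    from ray.tune.search.hyperopt import HyperOptSearch
    tuner = tune.Tuner(
        lambda cfg: {"loss": simulation_loss(cfg)},
        tune_config=tune.TuneConfig(
            search_alg=HyperOptSearch(metric="loss", mode="min"),
            num_samples=num_samples,  # trials run in parallel across the cluster
        ),
        param_space={"x": tune.uniform(-5.0, 5.0)},
    )
    return tuner.fit()
```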
