Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Forum Posts

Saeid_H
by Contributor
  • 12790 Views
  • 7 replies
  • 8 kudos

What are the practical advantages of Feature Store compared to Delta Lake?

Could someone explain the practical advantages of using a feature store vs. Delta Lake? Apparently they both work in the same manner and the feature store does not provide additional value. However, based on the documentation on the Databricks page, ...

Latest Reply
Anonymous
Not applicable
  • 8 kudos

Hi @Saeid Hedayati, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answer...

6 More Replies
philnally
by New Contributor II
  • 1657 Views
  • 3 replies
  • 0 kudos

SAP engineer wants to convert

I am looking for direction and input: I am an SAP data architect, with around 20 years of exposure to the data model itself, plus I've installed the SAP Data Warehouse some 6 or 7 times. I am certified in SAP's newest analytics tool, SAP Analytics ...

Latest Reply
philnally
New Contributor II
  • 0 kudos

Good points. I appreciate your comments. Mike McNally

2 More Replies
Saeid_H
by Contributor
  • 4999 Views
  • 5 replies
  • 5 kudos

How to change the feature store delta table default path on DBFS?

Hi everyone, would it be possible to change the default storage path of the feature store, during and/or after creation? If you could also provide the Python script for that, I would appreciate it. The current default path is: "dbfs/user/hive/warehouse...

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Saeid Hedayati, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us s...

4 More Replies
youssefmrini
by Databricks Employee
  • 1201 Views
  • 1 reply
  • 0 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 0 kudos

TorchDistributor is an open-source module in PySpark that helps users do distributed training with PyTorch on their Spark clusters; it lets you launch PyTorch training jobs as Spark jobs. With Databricks Runtime 13.0 ML and above, you can perform d...

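The launch pattern described above can be sketched as follows. The `TorchDistributor` import and arguments match the PySpark 3.4+ / Databricks Runtime 13.0 ML API, but the training function body here is a stand-in, and the distributor call itself is kept in comments because it requires a Spark cluster:

```python
# Sketch of launching a PyTorch training loop with TorchDistributor.
# Assumptions: pyspark >= 3.4 / Databricks Runtime 13.0 ML+; train()
# is a placeholder for a real PyTorch training loop.
def train(learning_rate):
    # A real implementation would build the model, DataLoader, and
    # optimizer here and run the epoch loop, returning final metrics.
    return {"learning_rate": learning_rate, "status": "done"}

# On a Databricks ML cluster the function is launched as a Spark job:
# from pyspark.ml.torch.distributor import TorchDistributor
# result = TorchDistributor(num_processes=2, local_mode=False, use_gpu=True).run(train, 1e-3)

# Called locally here only for illustration:
result = train(1e-3)
```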
DebK
by New Contributor III
  • 5810 Views
  • 6 replies
  • 6 kudos

Resolved! MLflow is throwing an error for the shape of the input

I am running the code for prediction, which will take the model from the MLflow deployment. I copied the code from the example given by the MLflow experiment tab: import mlflow; logged_model = 'runs:/id/model'; # Load model as a PyFuncModel; loaded_model = ml...
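A common cause of this kind of shape error is passing a single flat feature vector where the pyfunc model expects a 2-D batch (rows × features). A minimal sketch, with the mlflow calls kept in comments because the run ID in the excerpt is a placeholder:

```python
# Sketch: pyfunc models generally expect 2-D input (rows x features).
# import mlflow
# loaded_model = mlflow.pyfunc.load_model('runs:/id/model')  # 'id' is a placeholder run ID

def as_batch(x, n_features):
    # Wrap a single flat feature vector into a one-row batch and
    # sanity-check the feature count before calling predict().
    if x and not isinstance(x[0], (list, tuple)):
        x = [x]
    if any(len(row) != n_features for row in x):
        raise ValueError("feature count mismatch")
    return [list(row) for row in x]

batch = as_batch([0.1, 0.2, 0.3], n_features=3)
# predictions = loaded_model.predict(pd.DataFrame(batch))
```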

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Koushik Deb, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers y...

5 More Replies
Hariprasad94
by New Contributor II
  • 5090 Views
  • 3 replies
  • 0 kudos

How to call a python function from displayHTML javascript code?

python - How to use IPython.notebook.kernel.execute in Azure databricks? - Stack Overflow. In a standard Jupyter notebook, we could use IPython.notebook.kernel.execute to call a Python function; in Azure Databricks, IPython seems to be not exposed in brow...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@hariprasad T: In Azure Databricks, which is a cloud-based service for Apache Spark and big data processing, the notebook environment does not expose IPython directly in the browser DOM global scope as is done in standard Jupyter notebooks. Howev...

2 More Replies
jk1
by New Contributor II
  • 3635 Views
  • 4 replies
  • 3 kudos

Your workspace is not currently supported for model serving because your workspace region does not match your control plane region.

Getting an error message when creating an API from Databricks for MRV (Media Rights Valuation): Your workspace is not currently supported for model serving because your workspace region does not match your control plane region. See https://docs.databricks.com/...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @jk vadivel, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers yo...

3 More Replies
pol7451
by New Contributor
  • 994 Views
  • 2 replies
  • 0 kudos

Automating model history with multiple downstream elements

Hey, we have two models, A and B. Model A is fed from raw data that is first cleaned, enriched, and forecasted. The results from model A are fed into model B. The processes for cleaning, enriching, forecasting, model A, and model B are all under ver...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @polly halton, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...

1 More Replies
129464
by New Contributor
  • 1189 Views
  • 1 reply
  • 0 kudos

Partial Dependency Plots from AutoML

I have built an AutoML model, and I would like to add partial dependence plots for the final/best model. I am trying to add this to the generated code, but not having any luck. This is the code I attempted: from sklearn.inspection import Part...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@brandon.vaughn: It looks like you are trying to use the PartialDependenceDisplay class from the sklearn.inspection module to create partial dependence plots for your AutoML model. Here are some suggestions to ensure that the code wo...

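For context, partial dependence itself is simple to state: sweep one feature over a grid and average the model's predictions at each grid value. A toy, library-free sketch of that computation (in scikit-learn 1.0+, `PartialDependenceDisplay.from_estimator` computes and plots this for you):

```python
# Toy sketch of what a partial dependence curve computes:
# for each grid value v, set feature j to v in every row of X
# and average the model's predictions.
def partial_dependence(predict, X, feature_idx, grid):
    curve = []
    for v in grid:
        rows = [list(row) for row in X]
        for row in rows:
            row[feature_idx] = v
        curve.append(sum(predict(r) for r in rows) / len(rows))
    return curve

# Toy "model": prediction = 2*x0 + x1
curve = partial_dependence(lambda r: 2 * r[0] + r[1],
                           [[0.0, 1.0], [0.0, 3.0]],
                           feature_idx=0,
                           grid=[0.0, 1.0])
# curve == [2.0, 4.0]
```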
rendorHaevyn
by New Contributor III
  • 2575 Views
  • 4 replies
  • 0 kudos

Resolved! History of code executed on Data Science & Engineering service clusters

I want to be able to view a listing of any or all of the following: when notebooks were attached to / detached from a DS&E cluster; when notebook code was executed on a DS&E cluster; what notebook-specific cell code was executed on a DS&E cluster. Is th...

Latest Reply
Atanu
Databricks Employee
  • 0 kudos

From the UI, the best way to check is version control: https://docs.databricks.com/notebooks/notebooks-code.html#version-control. By the way, does this help: https://www.databricks.com/blog/2022/11/02/monitoring-notebook-command-logs-static-analysis-tools.ht...

3 More Replies
Saeid_H
by Contributor
  • 11328 Views
  • 5 replies
  • 4 kudos

Register mlflow custom model, which has pickle files

Dear community, I basically want to store 2 pickle files during training and model registry with my Keras model, so that when I access the model from another workspace (using mlflow.set_registry_uri()), these files can be accessed as well. The ...

Latest Reply
arzex
New Contributor II
  • 4 kudos

Content creation training

4 More Replies
Erik_S
by New Contributor II
  • 2765 Views
  • 3 replies
  • 1 kudos

Can I run a custom function that contains a trained ML model or access an API endpoint from within a SQL query in the SQL workspace?

I have a dashboard and I'd like the ability to take the data from a query and then predict a result from a trained ML model within the dashboard. I was thinking I could possibly embed the trained model within a library that I then import to the SQL w...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Erik Shilts: Yes, it is possible to use a trained ML model in a dashboard in Databricks. Here are a few approaches you could consider. Embed the model in a Python library and call it from SQL: you can train your ML model in Python and then save it a...

2 More Replies
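The first approach in the reply, wrapping the model behind a function that SQL can call, can be sketched like this. The `spark.udf.register` call is real Spark API, but the predictor body and all names are illustrative stand-ins:

```python
# Sketch of the "call a Python model from SQL" approach.
# predict_one() is a hypothetical stand-in for loaded_model.predict(...).
def predict_one(features):
    # Replace this body with the real model call; here it just
    # sums the features so the sketch is self-contained.
    return float(sum(features))

# On Databricks / Spark you would register it for use in SQL:
# spark.udf.register("predict_udf", predict_one, "double")
# -- then in the SQL editor:
# -- SELECT predict_udf(array(col1, col2)) AS score FROM my_table

score = predict_one([0.5, 1.5])
# score == 2.0
```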
Orianh
by Valued Contributor II
  • 1523 Views
  • 2 replies
  • 0 kudos

TF SummaryWriter flush() doesn't send any buffered data to storage.

Hey guys, I'm training a TF model in Databricks and logging to TensorBoard using SummaryWriter. At the end of each epoch, SummaryWriter.flush() is called, which should send any buffered data to storage, but I can't see the TensorBoard files while th...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @orian hindi, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so w...

1 More Replies
Eero_H
by New Contributor
  • 2972 Views
  • 2 replies
  • 1 kudos

Is there a way to change the default artifact store path on Databricks Mlflow?

I have cloud storage mounted to Databricks and I would like to store all of the model artifacts there without specifying it when creating a new experiment. Is there a way to configure the Databricks workspace to save all of the model artifacts to a ...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Eero Hiltunen, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedbac...

1 More Replies
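One hedged sketch of an answer to the question above: MLflow does not document a workspace-wide default override, but `mlflow.create_experiment` accepts an `artifact_location` argument (real MLflow API), so the mounted path can be supplied per experiment at creation time. The mount path and experiment name below are illustrative:

```python
# Sketch: set the artifact store per experiment via artifact_location.
# The mount path and experiment name are hypothetical placeholders.
mount_root = "dbfs:/mnt/my-artifact-store"   # illustrative mounted storage
experiment_name = "demo-experiment"
artifact_location = f"{mount_root}/{experiment_name}"

# On Databricks you would then create the experiment like this:
# import mlflow
# exp_id = mlflow.create_experiment(experiment_name, artifact_location=artifact_location)
```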
Kaan
by New Contributor
  • 2714 Views
  • 1 reply
  • 1 kudos

Resolved! Using databricks in multi-cloud, and querying data from the same instance.

I'm looking for a good product to use across two clouds at once for data engineering, data modeling, and governance. I currently have a GCP platform, but most of my data and future data goes through Azure and is currently transferred to GCS/BQ. Cu...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Karl Andrén: Databricks is a great option for data engineering, data modeling, and governance across multiple clouds. It supports integrations with multiple cloud providers, including Azure, AWS, and GCP, and provides a unified interface to access ...

