Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Forum Posts

Sujitha
by Databricks Employee
  • 982 Views
  • 0 replies
  • 2 kudos

Don’t miss out! Data + AI Summit early bird pricing ends soon

Don’t miss out! Data + AI Summit early bird pricing ends soon. Register by February 28 to take advantage of our early bird discount. Join thousands of data engineers, data scientists and data analysts from around the world at this year’s Data + AI Summ...

_CV
by New Contributor III
  • 3314 Views
  • 3 replies
  • 3 kudos

Resolved! I'm no longer able to import MLflow from PyPI on automated clusters

Starting yesterday afternoon, my job clusters across different workspaces started throwing an error when installing the MLflow library from PyPI during cluster initialization and startup. I'm using an Azure Databricks automated job cluster (details below)...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Chris Valley Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thank...

2 More Replies
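The accepted answer is not visible in the preview above. A common mitigation when a PyPI library suddenly fails to install on automated job clusters is to pin it to a known-good version instead of letting the cluster resolve the latest release. The Python sketch below is an assumption along those lines, not the fix confirmed in this thread; it updates a job's task libraries through the Jobs API 2.1, and the workspace URL, token, job ID, and mlflow version are placeholders.

import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
TOKEN = "dapi..."                                             # placeholder personal access token
JOB_ID = 123                                                  # placeholder job ID
headers = {"Authorization": f"Bearer {TOKEN}"}

# Fetch the current job settings (Jobs API 2.1).
settings = requests.get(
    f"{HOST}/api/2.1/jobs/get", headers=headers, params={"job_id": JOB_ID}
).json()["settings"]

# Replace any floating mlflow dependency with a pinned, known-good version (illustrative).
for task in settings.get("tasks", []):
    libs = [lib for lib in task.get("libraries", [])
            if not lib.get("pypi", {}).get("package", "").startswith("mlflow")]
    libs.append({"pypi": {"package": "mlflow==2.1.1"}})
    task["libraries"] = libs

# Push the updated settings back to the job.
requests.post(f"{HOST}/api/2.1/jobs/reset", headers=headers,
              json={"job_id": JOB_ID, "new_settings": settings}).raise_for_status()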
Charley
by New Contributor II
  • 6944 Views
  • 1 replies
  • 1 kudos

error status 400 calling serving model endpoint invocation using personal access token on Azure Databricks

Hi all, I've deployed a model, moved it to production and served it (mlflow), but when testing it in the python notebook I get a 400 error. Code/details below:

import os
import requests
import json
import pandas as pd
import numpy as np
# Create two record...

Latest Reply
nakany
New Contributor II
  • 1 kudos

data_json in the score_model function should be defined as follows:

ds_dict = {"dataframe_split": dataset.to_dict(orient='split')} if isinstance(dataset, pd.DataFrame) else create_tf_serving_json(dataset)

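For context on the 400 error and the reply above: Databricks serving endpoints backed by MLflow 2.x expect the dataframe_split wrapper rather than the older records payload. The sketch below is a minimal scoring call under that assumption; the workspace URL, model name/stage, token variable, and feature column names are placeholders, and create_tf_serving_json mirrors the helper referenced in the reply.

import os
import requests
import pandas as pd

def create_tf_serving_json(data):
    # Fallback for non-DataFrame inputs (a dict of arrays or a plain array).
    return {"inputs": {k: v.tolist() for k, v in data.items()} if isinstance(data, dict) else data.tolist()}

def score_model(dataset):
    # Placeholder endpoint: https://<workspace-url>/model/<model-name>/<stage-or-version>/invocations
    url = "https://adb-1234567890123456.7.azuredatabricks.net/model/my-model/Production/invocations"
    headers = {
        "Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}",
        "Content-Type": "application/json",
    }
    # MLflow 2.x scoring protocol: wrap a DataFrame in "dataframe_split".
    ds_dict = (
        {"dataframe_split": dataset.to_dict(orient="split")}
        if isinstance(dataset, pd.DataFrame)
        else create_tf_serving_json(dataset)
    )
    response = requests.post(url, headers=headers, json=ds_dict)
    response.raise_for_status()  # a 400 here usually points to a payload-format mismatch
    return response.json()

# Two illustrative records with made-up feature names.
print(score_model(pd.DataFrame([{"f1": 1.0, "f2": 2.0}, {"f1": 3.0, "f2": 4.0}])))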
Sujitha
by Databricks Employee
  • 2057 Views
  • 5 replies
  • 1 kudos

Latest Blog Posts January 13 - 20

Latest Blog Posts January 13 - 20: Did you get a chance to look at the most recent blog posts? Here is some happening content from the past week that is worth the read. What’s New With SQL User-Defined Functions: In this blog, we describe several enhanc...

Latest Reply
Chaitanya_Raju
Honored Contributor
  • 1 kudos

Thanks @Sujitha Ramamoorthy for sharing with the community; these are worth reading and insightful.

4 More Replies
Mrinmoy207
by New Contributor II
  • 2096 Views
  • 2 replies
  • 2 kudos

Number of epochs/epoch loss widget not visible while training the model

I am training an N-BEATS forecasting model using the darts library. After I define all my hyperparameters and execute the code to fit my model, having set the 'verbose' parameter to true according to the documentation to show the progress of the train...

[Attached screenshots of the fit() call and its output]
Latest Reply
LandanG
Databricks Employee
  • 2 kudos

Hi @Mrinmoy Gupta, what happens when you detach the notebook from the cluster (and optionally clear the state) and then rerun the code? I've seen this happen once and it was solved by re-running the code.

1 More Replies
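As background to the thread above, how darts surfaces epoch and loss progress depends on the library version: older releases accept verbose=True on fit(), while newer ones route progress reporting through PyTorch Lightning trainer options. The sketch below is illustrative only (synthetic data, arbitrary hyperparameters) and is not taken from the thread.

import numpy as np
import pandas as pd
from darts import TimeSeries
from darts.models import NBEATSModel

# Synthetic daily series, purely for illustration.
idx = pd.date_range("2022-01-01", periods=200, freq="D")
series = TimeSeries.from_series(pd.Series(np.sin(np.arange(200) / 10.0), index=idx))

model = NBEATSModel(
    input_chunk_length=24,
    output_chunk_length=12,
    n_epochs=5,
    # Recent darts versions control the epoch/loss progress bar via PyTorch
    # Lightning trainer kwargs rather than (or in addition to) verbose=True.
    pl_trainer_kwargs={"enable_progress_bar": True},
)
model.fit(series)  # older darts releases: model.fit(series, verbose=True)
forecast = model.predict(n=12)
print(forecast.values()[:3])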
NSRBX
by Contributor
  • 3188 Views
  • 2 replies
  • 4 kudos

Feature Store - Feature Lookup Engine with join on partial key and Filter

Hello, I am working with the FeatureLookup engine functions. However, we have some feature tables at a more detailed granularity level than the input dataframe. Please find an example: table A has unique keys on two features, numero_p and numero_s. So while performing F...

Latest Reply
Debayan
Databricks Employee
  • 4 kudos

Hi @SERET Nathalie, I can check internally on the ask here. In the meantime, please let us know if this helps: https://docs.databricks.com/machine-learning/feature-store/feature-tables.html and https://docs.databricks.com/machine-learning/feature-store/i...

1 More Replies
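For reference on the question above, FeatureLookup does accept a composite lookup_key, which covers a join on two columns such as numero_p and numero_s; row-level filtering is not part of FeatureLookup itself, so pre-filtering the source into a dedicated feature table is a common workaround (that workaround is an assumption, not something confirmed in this thread). Table, feature, and label names below are placeholders.

from databricks.feature_store import FeatureStoreClient, FeatureLookup

fs = FeatureStoreClient()

# Placeholder input dataframe containing the numero_p / numero_s join keys.
input_df = spark.table("my_schema.input_view")

# Join feature table A on both columns of its composite key.
lookups = [
    FeatureLookup(
        table_name="my_schema.table_a",            # placeholder feature table
        lookup_key=["numero_p", "numero_s"],       # composite key from the post
        feature_names=["feature_1", "feature_2"],  # placeholder feature columns
    )
]

training_set = fs.create_training_set(
    df=input_df,
    feature_lookups=lookups,
    label="label",  # placeholder label column
)
training_df = training_set.load_df()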
THIAM_HUATTAN
by Valued Contributor
  • 2651 Views
  • 7 replies
  • 6 kudos

Why this Databricks ML code gets stuck?

I could not paste the code here because some words are not allowed, so I had to paste it elsewhere. Below is OK: https://justpaste.it/8xcr9 But the one below gets stuck: https://justpaste.it/8nydt and it keeps looping and running...

Latest Reply
Vidula
Honored Contributor
  • 6 kudos

Hey @THIAM HUAT TAN​ Hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you....

6 More Replies
Direo
by Contributor II
  • 1784 Views
  • 2 replies
  • 1 kudos

Is it possible to load MLflow artifacts and models from a local directory to Databricks DBFS?

I have been working locally and created a few models and now I want to move those to databricks/DBFS. Is it possible to do that?

Latest Reply
Vivian_Wilfred
Databricks Employee
  • 1 kudos

Hi @Direo Direo, can you check these docs and see if they help: https://docs.databricks.com/applications/mlflow/access-hosted-tracking-server.html#access-the-mlflow-tracking-server-from-outside-databricks and https://docs.databricks.com/applications/mlflow...

1 More Replies
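Expanding on the docs linked in the reply, the usual pattern for moving locally trained MLflow models into a workspace is to point the local MLflow client at the Databricks tracking server and model registry and re-log the artifacts, rather than copying files into DBFS by hand. A minimal sketch under those assumptions; the experiment path, local model directory, and registered model name are placeholders.

import mlflow

# Authentication is assumed to be configured already, e.g. via
# `databricks configure --token` or DATABRICKS_HOST / DATABRICKS_TOKEN.
mlflow.set_tracking_uri("databricks")  # hosted MLflow tracking server
mlflow.set_registry_uri("databricks")  # workspace model registry
mlflow.set_experiment("/Users/me@example.com/local-models")  # placeholder workspace path

# "./my_local_model" is a placeholder directory produced locally by
# mlflow.<flavor>.save_model(...); it contains the MLmodel file and artifacts.
with mlflow.start_run() as run:
    mlflow.log_artifacts("./my_local_model", artifact_path="model")

# Optionally register the uploaded model in the workspace registry.
mlflow.register_model(f"runs:/{run.info.run_id}/model", "my_registered_model")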
Dhara
by New Contributor III
  • 20191 Views
  • 9 replies
  • 5 kudos

Access multiple .mdb files using Python

Hi, I wanted to access multiple .mdb Access files which are stored in Azure Data Lake Storage (ADLS) or on the Databricks File System using Python. Is it possible to guide me on how I can achieve it? It would be great if you can share some code snippets ...

Latest Reply
User16764241763
Honored Contributor
  • 5 kudos

@Dhara Mandal Can you please try the below?

# cmd 1
%pip install pandas_access

# cmd 2
import pandas_access as mdb

db_filename = '/dbfs/FileStore/Campaign_Template.mdb'

# Listing the tables.
for tbl in mdb.list_tables(db_filename):
    print(tbl)
...

8 More Replies
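The reply above is truncated; pandas_access can also read each table into a DataFrame, so looping over several .mdb files copied to DBFS is straightforward. The sketch below assumes that pattern; note that pandas_access shells out to the mdbtools utilities, which must be available on the cluster, and the paths are placeholders.

import glob
import pandas_access as mdb  # wraps the mdbtools CLI, which must be installed on the cluster

# Placeholder location: .mdb files copied from ADLS to DBFS, read via the /dbfs mount.
mdb_files = glob.glob("/dbfs/FileStore/mdb_files/*.mdb")

frames = {}
for db_filename in mdb_files:
    for tbl in mdb.list_tables(db_filename):
        # Read each Access table into a pandas DataFrame.
        frames[(db_filename, tbl)] = mdb.read_table(db_filename, tbl)

for (path, tbl), df in frames.items():
    print(path, tbl, df.shape)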
Nachappa
by New Contributor III
  • 5578 Views
  • 4 replies
  • 6 kudos

Data model tool to connect to Databricks or Data lake?

Hi Everyone, for data modeling documentation (dimensional / ER diagram), is there any tool available which can connect to Databricks / a data lake and read the table structure directly, and also update the structure of the table whenever there is an addition ...

Latest Reply
Nachappa
New Contributor III
  • 6 kudos

Hi @Kaniz Fatma, @Prabakar Ammeappin: Thanks for the reply and information. Yes, I am able to connect DBeaver to Databricks using JDBC and the provided link (sorry for the delay in the update, as I had to try the Trial version of Enterprise D...

3 More Replies
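As a side note to the DBeaver/JDBC answer above, any modeling tool or small script can read the table structure over the same SQL endpoint. The sketch below uses the databricks-sql-connector for Python to list tables and describe their columns; the hostname, HTTP path, token, and schema name are placeholders.

from databricks import sql  # pip install databricks-sql-connector

# Placeholders: copy these from the cluster or SQL warehouse "Connection details" tab.
connection = sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abcdef1234567890",
    access_token="dapi...",
)

with connection.cursor() as cursor:
    cursor.execute("SHOW TABLES IN my_schema")  # placeholder schema
    tables = [(row[0], row[1]) for row in cursor.fetchall()]  # (database, tableName)
    for database, table in tables:
        cursor.execute(f"DESCRIBE TABLE {database}.{table}")
        columns = [(row[0], row[1]) for row in cursor.fetchall()]  # (col_name, data_type)
        print(f"{database}.{table}: {columns}")

connection.close()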
orion216
by New Contributor II
  • 14621 Views
  • 5 replies
  • 2 kudos

Resolved! Keep long-running notebook alive when closing browser

I am working with Azure Databricks jupyter notebooks and have time-consuming jobs (complex queries, model training, loops over many items, etc.). Every time I close the browser (or step away for a long time) on some running notebook, even before the c...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hey @Eric P Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best? If not, please tell us so we can help you. Thanks!

4 More Replies
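The accepted answer is not shown in the preview above. One common way to keep long-running notebook work alive regardless of the browser session is to submit it as a one-time job run instead of running it interactively; that is an assumption here, not necessarily the solution marked best in the thread. The sketch uses the Jobs API 2.1 runs/submit endpoint, with placeholder host, token, notebook path, and cluster ID.

import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
TOKEN = "dapi..."                                             # placeholder personal access token

payload = {
    "run_name": "long-running-training",
    "tasks": [
        {
            "task_key": "train",
            "notebook_task": {"notebook_path": "/Users/me@example.com/train_model"},  # placeholder
            "existing_cluster_id": "0123-456789-abcdefgh",  # placeholder cluster ID
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/runs/submit",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
# The run keeps going on the cluster even after the browser tab is closed.
print("Run ID:", resp.json()["run_id"])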
Dhara
by New Contributor III
  • 2249 Views
  • 2 replies
  • 0 kudos

Access multiple .mdb files using Python

Hi, I wanted to access multiple .mdb Access files which are stored in Azure Data Lake Storage (ADLS) or on the Databricks File System using Python. Can you please help me by guiding me on how I can do it? It would be great if you can share some code snippet...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

https://community.databricks.com/s/question/0D58Y00008rCmBySAK/access-multiple-mdb-files-using-python

1 More Replies
jhonw901227
by New Contributor II
  • 1029 Views
  • 0 replies
  • 0 kudos

MLflow Model Serving on Azure Databricks

I know that the documentation about model serving says: The cluster is maintained as long as serving is enabled, even if no active model version exists. To terminate the serving cluster, disable model serving for the registered model. The cluster is...

Vijeth
by New Contributor II
  • 4256 Views
  • 1 replies
  • 2 kudos

Resolved! How to deploy or create an mlflow model as a docker image with a REST API endpoint within Databricks?

Is it possible to create an mlflow model as a docker image with a REST API endpoint and use it for inference within Databricks, or to host the image in Azure Container Instances?

Latest Reply
BilalAslamDbrx
Databricks Employee
  • 2 kudos

@Vijeth Moudgalya, hey there, we are definitely interested in making model serving easier and simpler on Databricks. There are some useful product features coming down the line - contact me at bilal dot aslam at databricks dot com if you are intere...

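For readers landing on this thread, MLflow itself can package a registered model into a Docker image that exposes the standard /invocations REST endpoint; whether that image is then run in Azure Container Instances or elsewhere is a separate deployment choice, and the exact options vary by MLflow version. A hedged sketch; the model URI and image name are placeholders.

import mlflow.models

# Build a Docker image that serves the model behind the standard /invocations endpoint.
# CLI equivalent: mlflow models build-docker -m <model_uri> -n <image_name>
mlflow.models.build_docker(
    model_uri="models:/my_model/Production",  # placeholder registry URI
    name="my-model-serving",                  # placeholder image name
)

# The image can then be run locally or pushed to a container registry, e.g.:
#   docker run -p 5001:8080 my-model-serving
# and scored with POST http://localhost:5001/invocations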
Labels