Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Forum Posts

yopbibo
by Contributor II
  • 2404 Views
  • 3 replies
  • 5 kudos

Deploy an ML model, trained and registered in Databricks, to AKS

Hi, I can train and register an ML model in my Databricks workspace. Then, to deploy it on AKS, I need to register the model in Azure ML and then deploy to AKS. Is it possible to skip the Azure ML step? I would like to deploy directly into my AKS instance...

Latest Reply
sidharthpradhan
New Contributor II
  • 5 kudos

Is this still the case? Can't we serve the model in Databricks? I am new to this, so I am just wondering about the capabilities.

2 More Replies
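A minimal sketch (not from the thread) of one way to skip the Azure ML step: package the registered MLflow model as a Docker image and run that image on AKS directly. The model and image names below are placeholders.

# Hedged sketch: build a serving image from a Databricks/MLflow-registered model,
# then deploy that image to AKS with plain Kubernetes manifests (no Azure ML involved).
import mlflow

model_uri = "models:/my_aks_model/Production"   # hypothetical registered model name/stage

# Builds a Docker image that serves the model over REST on port 8080.
# CLI equivalent: mlflow models build-docker -m models:/my_aks_model/Production -n my-model-image
mlflow.models.build_docker(model_uri=model_uri, name="my-model-image")

# Next steps (outside this snippet): push "my-model-image" to a registry AKS can pull from
# (e.g. ACR), then create a Kubernetes Deployment/Service that exposes port 8080.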
Badarla
by New Contributor
  • 4060 Views
  • 2 replies
  • 1 kudos

Customize mail notification from Databricks workflow

Hi all, can we customize the mail subject and body that we receive from an Azure Databricks workflow when a job fails? Kindly help me if this is possible. Thanks, Moshe

Latest Reply
np75
New Contributor II
  • 1 kudos

I have three workspaces, and the alerts sent by the running jobs do not reference the workspace. For example, if I run a job in the dev environment, I get an alert that looks as if the job had been executed in prod. This is a huge issue for our admins....

1 More Replies
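The built-in workflow e-mails are not customizable, so one hedged workaround (a sketch, not an official feature) is a notebook task that runs on failure and sends its own message naming the workspace. The SMTP host, addresses, and the workspace-URL conf key are assumptions here.

# Hedged sketch: send a custom failure e-mail that names the workspace,
# from a notebook task configured to run when the main task fails.
import smtplib
from email.message import EmailMessage

# Assumption: this Spark conf holds the workspace URL on Azure Databricks clusters.
workspace_url = spark.conf.get("spark.databricks.workspaceUrl", "unknown-workspace")

msg = EmailMessage()
msg["Subject"] = f"[{workspace_url}] Databricks workflow failed"   # custom subject
msg["From"] = "jobs@example.com"                                   # hypothetical sender
msg["To"] = "admins@example.com"                                   # hypothetical recipients
msg.set_content(f"A workflow failed in workspace {workspace_url}. See the job run page for details.")

with smtplib.SMTP("smtp.example.com", 587) as server:              # hypothetical SMTP relay
    server.starttls()
    server.send_message(msg)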
db_noob
by New Contributor II
  • 3818 Views
  • 4 replies
  • 8 kudos

Azure - Databricks - account storage gen 2

Hello everyone, I am really new to Databricks; I just passed my Apache developer certification on it. I also have a certification in data engineering with Azure. Some fancy words here, but I only started doing real deep work on them as I started a person...

Latest Reply
Debayan
Databricks Employee
  • 8 kudos

Hi, if we go by the error, "Invalid configuration value detected for fs.azure.account.key": the storage account access key cannot be used to access data via the abfss protocol. Please refer to https://learn.microsoft.com/en-us/azure/databricks/storage/azu...

3 More Replies
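For context, a minimal sketch of the configuration that error message points at, assuming access via a storage account key kept in a secret scope (all names below are placeholders).

# Hedged sketch: set the account key for abfss access, then list the container.
storage_account = "mystorageaccount"   # hypothetical storage account name
account_key = dbutils.secrets.get(scope="my-scope", key="storage-account-key")

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    account_key,
)

display(dbutils.fs.ls(f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/"))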
Saurabh707344
by New Contributor III
  • 3024 Views
  • 3 replies
  • 2 kudos

Resolved! ML use-case feasibility for Databricks ML vs. AWS SageMaker/Azure ML

What complexity of ML models is feasible to build in Databricks ML, and beyond what point do we have to rely on AWS SageMaker or Azure ML? Do we have a clear segregation around this by ML use case?

Latest Reply
corvo
New Contributor II
  • 2 kudos

In Databricks, your use case can be solved with the notebooks provided in Databricks. There is no direct dependency on AWS SageMaker. All the model training and deployment that can be done in SageMaker is supported via Databricks as well.

2 More Replies
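As a rough illustration of that claim (a sketch, not from the thread): training, logging, and registering a model can stay entirely inside Databricks with MLflow, assuming an ML runtime with scikit-learn available.

# Hedged sketch: train a model and let MLflow autologging capture params, metrics, and the model.
import mlflow
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

mlflow.sklearn.autolog()

X, y = load_diabetes(return_X_y=True)
with mlflow.start_run() as run:
    RandomForestRegressor(n_estimators=100).fit(X, y)

# The autologged model can then be registered in the Databricks Model Registry:
mlflow.register_model(f"runs:/{run.info.run_id}/model", "my_demo_model")  # hypothetical name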
Mado
by Valued Contributor II
  • 5696 Views
  • 2 replies
  • 1 kudos

Error "Invalid configuration value detected for fs.azure.account.key" when listing files stored in an Azure Storage account using "dbutils.fs.ls"

I get the following error when listing the files stored in an Azure Storage account using the "dbutils.fs.ls" command in Databricks: Failure to initialize configuration for storage account AAAAA.dfs.core.windows.net: Invalid configuration value dete...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Mohammad Saber, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

1 More Replies
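When storage account keys are disallowed (which is often what this error means), one hedged alternative is OAuth with a service principal; every name and ID below is a placeholder.

# Hedged sketch: per-account OAuth configuration for abfss access with a service principal.
storage_account = "mystorageaccount"   # hypothetical
suffix = f"{storage_account}.dfs.core.windows.net"

client_id = dbutils.secrets.get(scope="my-scope", key="sp-client-id")
client_secret = dbutils.secrets.get(scope="my-scope", key="sp-client-secret")
tenant_id = dbutils.secrets.get(scope="my-scope", key="tenant-id")

spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{suffix}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{suffix}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{suffix}", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{suffix}",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

dbutils.fs.ls(f"abfss://mycontainer@{suffix}/")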
PGrover
by New Contributor II
  • 1565 Views
  • 1 reply
  • 2 kudos

Connecting to Synapse database using AzureCliCredential token in Spark

I want to connect to my Azure Synapse database using Spark. I can do this in pyodbc, no problem, but that is not what I want. Here is how I get my credentials: credential = AzureCliCredential() databaseToken = credential.get_token('https://database.window...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Patrick Grover, we haven't heard from you since the last response from @Kaniz Fatma, and I was checking back to see if her suggestions helped you. Or else, if you have any solution, please share it with the community, as it can be helpful to ot...

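One hedged way to finish that approach (assuming the Microsoft SQL Spark connector is installed on the cluster; the server, database, and table names are placeholders) is to pass the CLI token as the connector's accessToken option.

# Hedged sketch: read from Synapse over JDBC using an Azure CLI token instead of a password.
from azure.identity import AzureCliCredential

credential = AzureCliCredential()
token = credential.get_token("https://database.windows.net/.default").token

df = (
    spark.read.format("com.microsoft.sqlserver.jdbc.spark")  # assumes the SQL Spark connector
    .option("url", "jdbc:sqlserver://<server>.sql.azuresynapse.net:1433;database=<db>")
    .option("dbtable", "dbo.my_table")        # hypothetical table
    .option("accessToken", token)             # token obtained via AzureCliCredential
    .load()
)
df.show()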
ChrisS
by New Contributor III
  • 11957 Views
  • 14 replies
  • 5 kudos

Resolved! How do you get data from Azure Data Lake Gen 2 Mounted or Imported and Exported from Databricks?

The example that Databricks gives is not helpful and does not tell me exactly what I need to do. I am new to this and not sure what I need to do in Azure to get this done. I just need to be able to pull data from and write data to the data containers. Be...

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Chris Sarrico, hope all is well! Just wanted to check in whether you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Than...

13 More Replies
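For readers landing here, a minimal sketch of the usual OAuth mount, assuming a service principal that has access to the container; every name and ID below is a placeholder.

# Hedged sketch: mount an ADLS Gen2 container so it can be read and written like a local path.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": dbutils.secrets.get("my-scope", "sp-client-id"),
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("my-scope", "sp-client-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",  # placeholder tenant
}

dbutils.fs.mount(
    source="abfss://mycontainer@mystorageaccount.dfs.core.windows.net/",
    mount_point="/mnt/mydata",
    extra_configs=configs,
)

# Read and write through the mount point:
df = spark.read.csv("/mnt/mydata/input.csv", header=True)
df.write.mode("overwrite").parquet("/mnt/mydata/output/")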
aranyics
by New Contributor
  • 924 Views
  • 1 reply
  • 1 kudos

Is it possible to start a Databricks AutoML experiment remotely? (Azure Databricks)

Currently I am using Azure Machine Learning Studio for my work and would like to compare the performance of the Azure and Databricks AutoML algorithms. Is it possible to write a notebook in Azure to start the AutoML algorithm in Databricks? My data is found...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Csaba Aranyi, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

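A hedged sketch of one pattern: keep the AutoML call in a Databricks notebook and trigger that notebook as a job from Azure. The table name, job ID, and workspace host below are placeholders.

# Hedged sketch, part 1: a Databricks notebook cell that runs AutoML (ML runtime assumed).
from databricks import automl

df = spark.table("my_schema.training_data")               # hypothetical training table
summary = automl.classify(dataset=df, target_col="label", timeout_minutes=30)
print(summary.best_trial.model_path)

# Part 2 (runs anywhere, e.g. from Azure): trigger the job that wraps the notebook above.
import requests

resp = requests.post(
    "https://<databricks-instance>/api/2.1/jobs/run-now",  # placeholder workspace URL
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={"job_id": 12345},                                # hypothetical job ID
)
resp.raise_for_status()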
Saurabh707344
by New Contributor III
  • 1246 Views
  • 1 reply
  • 1 kudos

Resolved! Comparative study of Azure Databricks MLOps capabilities in conjunction with Azure DevOps, Git, and Jenkins

Looking for a comparative study of the capabilities of the tool combinations below. In what situation should I use which combination for an MLOps project? a) Azure Databricks ML b) Azure Databricks ML + Azure DevOps + Git c) Azure Databricks ML + Jenkin...

Latest Reply
shyam_9
Databricks Employee
  • 1 kudos

Hi @saurabh707344, you can use Azure Databricks ML when you're in the initial stages and developing some POCs. The other tools you mentioned would be used, based on your use case, once you move some of the models to production and are actively developing and ...

isaac_gritz
by Databricks Employee
  • 4494 Views
  • 1 reply
  • 3 kudos

Resolved! Pricing on Databricks

How Pricing Works on Databricks: I highly recommend checking out this blog post on how Databricks pricing works from my colleague @MENDELSOHN CHAN. Databricks has a consumption-based pricing model, so you pay only for the compute you use. For interactive...

Latest Reply
Meag
New Contributor III
  • 3 kudos

I read the blog you shared; it helps. Thanks for sharing.

Anonymous
by Not applicable
  • 1257 Views
  • 2 replies
  • 3 kudos

www.databricks.com

Hello Dolly: Democratizing the magic of ChatGPT with open models. Databricks has just released a groundbreaking new blog post exploring Dolly, an open-source language model with the potential to transform the way we interact with technology. From cha...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Let's get candid! Let me know your initial thoughts about LLM models, ChatGPT, and Dolly.

1 More Replies
Hariprasad94
by New Contributor II
  • 5126 Views
  • 3 replies
  • 0 kudos

How to call a Python function from displayHTML JavaScript code?

python - How to use IPython.notebook.kernel.execute in Azure databricks? - Stack Overflow. In a standard Jupyter notebook, we could use IPython.notebook.kernel.execute to call a Python function; in Azure Databricks, IPython seems not to be exposed in the brow...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@hariprasad T: In Azure Databricks, which is a cloud-based service for Apache Spark and big data processing, the notebook environment does not expose IPython directly in the browser DOM global scope as is done in standard Jupyter notebooks. Howev...

2 More Replies
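To illustrate the one direction that does work (a sketch, not from the thread): Python values can be serialized into the HTML that displayHTML renders, even though the JavaScript cannot call back into the Python kernel.

# Hedged sketch: pass data from Python into displayHTML-rendered JavaScript via JSON.
import json

payload = json.dumps({"greeting": "hello from Python", "rows": 42})

displayHTML(f"""
<div id="out"></div>
<script>
  // These values were rendered into the page by Python; there is no live kernel bridge.
  const data = {payload};
  document.getElementById("out").innerText = data.greeting + " (" + data.rows + " rows)";
</script>
""")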
Kaan
by New Contributor
  • 2753 Views
  • 1 reply
  • 1 kudos

Resolved! Using Databricks in multi-cloud and querying data from the same instance.

I'm looking for a good product to use across two clouds at once for data engineering, data modeling, and governance. I currently have a GCP platform, but most of my data and future data goes through Azure and is then transferred to GCS/BQ. Cu...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Karl Andrén: Databricks is a great option for data engineering, data modeling, and governance across multiple clouds. It supports integrations with multiple cloud providers, including Azure, AWS, and GCP, and provides a unified interface to access ...

User16461610613
by New Contributor II
  • 2205 Views
  • 1 reply
  • 2 kudos

Free Databricks Training on AWS, Azure, or Google Cloud

Free Databricks Training on AWS, Azure, or Google Cloud. Good news! You can now access free, in-depth Databricks training on AWS, Azure, or Google Cloud. Our on-demand training series walks through how to: Streamline data ingest and management to build ...

Latest Reply
jose_gonzalez
Databricks Employee
  • 2 kudos

Thank you for sharing this!!

Charley
by New Contributor II
  • 6740 Views
  • 1 reply
  • 1 kudos

Error status 400 when calling a serving model endpoint invocation using a personal access token on Azure Databricks

Hi all, I've deployed a model, moved it to production, and served it (MLflow), but when testing it in the Python notebook I get a 400 error. Code/details below: import os; import requests; import json; import pandas as pd; import numpy as np # Create two record...

Latest Reply
nakany
New Contributor II
  • 1 kudos

data_json in the score_model function should be defined as follows:
ds_dict = {"dataframe_split": dataset.to_dict(orient='split')} if isinstance(dataset, pd.DataFrame) else create_tf_serving_json(dataset)

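Putting that fix in context, a hedged sketch of a full scoring call. The endpoint URL, token variable, and helper are placeholders patterned on the standard serving example, not the poster's exact code.

# Hedged sketch: query a served MLflow model using the dataframe_split payload format.
import json
import os

import pandas as pd
import requests


def create_tf_serving_json(data):
    # Hypothetical helper mirroring the one in the standard serving example notebook.
    return {"inputs": {name: data[name].tolist() for name in data.keys()}
            if isinstance(data, dict) else data.tolist()}


def score_model(dataset):
    url = "https://<databricks-instance>/model/<model-name>/<version>/invocations"  # placeholder
    headers = {
        "Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}",
        "Content-Type": "application/json",
    }
    ds_dict = ({"dataframe_split": dataset.to_dict(orient="split")}
               if isinstance(dataset, pd.DataFrame)
               else create_tf_serving_json(dataset))
    response = requests.post(url, headers=headers, data=json.dumps(ds_dict))
    response.raise_for_status()        # a 400 here usually means a malformed payload
    return response.json()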