Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Forum Posts

Sujitha
by Databricks Employee
  • 3863 Views
  • 3 replies
  • 2 kudos

Resolved! Hello Community Users, We recently announced a new Large Language Models (LLM) program, the first of its kind on edX! Learn how to develop production-ready LLM applications and dive into the theory behind foundation models. Taught by industry experts...

Latest Reply
APadmanabhan
Databricks Employee
  • 2 kudos

Hi @163050, you could download the DBC file from the course; we already have the LLM course in the Customer Academy.

2 More Replies
BenLambert
by Contributor
  • 1107 Views
  • 1 reply
  • 1 kudos

When should you use directory listing vs. file notification?

We are using Delta Live Tables for running ingestion pipelines and have come across the two options for Auto Loader, "file notification" vs "directory listing"; this is reflected in the option cloudFiles.useIncrementalListing. We are wondering what ...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Bennett Lambert: The choice between using "file notification" vs "directory listing" for Auto Loader in Delta Live Tables depends on your specific use case and requirements. Here are some general guidelines: use file notification if you need real...

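To make the two modes in the reply concrete, here is a minimal, hypothetical sketch of configuring each in a DLT Python pipeline; the source path, file format, and table names are assumptions, not part of the original thread:

```python
import dlt

# Hypothetical landing location; replace with your own path.
SOURCE_PATH = "s3://my-bucket/landing/events/"

@dlt.table(comment="Raw events ingested with directory listing mode")
def raw_events_directory_listing():
    # Directory listing mode (the default): Auto Loader lists the input directory.
    # cloudFiles.useIncrementalListing lets it list only newly arrived, lexically
    # ordered files instead of the full directory on every micro-batch.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.useIncrementalListing", "auto")
        .load(SOURCE_PATH)
    )

@dlt.table(comment="Raw events ingested with file notification mode")
def raw_events_file_notification():
    # File notification mode: Auto Loader subscribes to cloud storage events
    # rather than listing the directory; useful for very large directories or
    # when lower-latency discovery of new files matters.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.useNotifications", "true")
        .load(SOURCE_PATH)
    )
```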
vittal
by New Contributor
  • 1157 Views
  • 1 reply
  • 0 kudos

Getting errors in a DLT pipeline while using an ML model

I am getting the following error when I try to run ML models in a Delta Live Tables pipeline: File "/local_disk0/.ephemeral_nfs/repl_tmp_data/ReplId-55c61-9b898-2c4b6-d/mlflow/envs/virtualenv_envs/mlflow-888f8c9b966409e6bddca3894244b4df9d1f94c1/lib/pyth...

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

@Vittal Pai - In general, please follow the steps below for the MLflow CLI error. Step 1: set up an API token and create secrets as mentioned in the document at https://docs.databricks.com/machine-learning/manage-model-lifecycle/multiple-workspaces.h...

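To make the cross-workspace setup the reply points to more concrete, here is a rough sketch; the secret scope name, prefix, run ID, and model name are hypothetical placeholders, and the linked document remains the authoritative reference:

```python
import mlflow

# Hypothetical secret scope and prefix, created for example with the legacy CLI:
#   databricks secrets create-scope --scope modelregistry
#   databricks secrets put --scope modelregistry --key modelregistry-host
#   databricks secrets put --scope modelregistry --key modelregistry-token
# The scope must hold <prefix>-host and <prefix>-token for the remote workspace.

# Point the MLflow client at the remote Databricks model registry
# using the databricks://<scope>:<prefix> URI form.
mlflow.set_registry_uri("databricks://modelregistry:modelregistry")

# Register a model version in the remote registry
# (run ID and model name below are placeholders).
mlflow.register_model("runs:/<run_id>/model", "my_remote_model")
```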
Sujitha
by Databricks Employee
  • 1137 Views
  • 2 replies
  • 6 kudos

Weekly Release Notes Recap: Here's a quick recap of the latest release notes updates from the past week. Databricks platform release notes, December 5 - 16, 2022. Databricks JDBC driver 2.6.32: version 2.6.32 of the Databricks JDBC driver (download and M...

Latest Reply
Harun
Honored Contributor
  • 6 kudos

Thanks for sharing @Sujitha Ramamoorthy

1 More Reply
User16752245767
by Contributor
  • 919 Views
  • 0 replies
  • 5 kudos

youtu.be

I'm Avi, a Solutions Architect at Databricks working at the intersection of Data Engineering and Machine Learning. Streaming data processing has moved from niche to mainstream, and deploying machine learning models in such data streams opens up a mult...

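As a loose illustration of the topic of the video (scoring ML models on streaming data), here is a hypothetical sketch using MLflow's spark_udf on a streaming DataFrame; the model URI, source table, feature columns, and output paths are all placeholder assumptions:

```python
import mlflow.pyfunc
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical registered model and streaming source table.
MODEL_URI = "models:/my_streaming_model/Production"

# Wrap the MLflow model as a Spark UDF so it can score each micro-batch.
predict_udf = mlflow.pyfunc.spark_udf(spark, model_uri=MODEL_URI)

# Read a stream from an existing Delta table and append a prediction column.
events = spark.readStream.table("my_catalog.my_schema.raw_events")
scored = events.withColumn(
    "prediction", predict_udf("feature_1", "feature_2")  # placeholder feature columns
)

# Write the scored stream out (checkpoint and target paths are placeholders).
query = (
    scored.writeStream.format("delta")
    .option("checkpointLocation", "s3://my-bucket/checkpoints/scored_events/")
    .start("s3://my-bucket/tables/scored_events/")
)
```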
vaver_3
by New Contributor III
  • 15314 Views
  • 1 reply
  • 5 kudos

Resolved! Ingest a .csv file with spaces in column names into a streaming table using Delta Live Tables

How do I ingest a .csv file with spaces in column names into a streaming table using Delta Live Tables? All of the fields should be read using the default behavior for .csv files with the DLT Auto Loader - as strings. Running the pipeline gives me an error about in...

Latest Reply
vaver_3
New Contributor III
  • 5 kudos

After additional googling on "withColumnRenamed", I was able to replace all spaces in column names with "_" all at once by using select and alias instead: @dlt.view( comment="" ) def vw_raw(): return ( spark.readStream.format("cloudF...

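The code snippet in the accepted answer is cut off above; as a hedged reconstruction of the select/alias pattern it describes, here is a sketch in which the source path, CSV options, and view name are assumptions rather than the poster's exact code:

```python
import dlt
from pyspark.sql import functions as F

# Hypothetical landing location for the .csv files.
SOURCE_PATH = "s3://my-bucket/landing/csv/"

@dlt.view(comment="Raw CSV with spaces in column names replaced by underscores")
def vw_raw():
    df = (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .option("header", "true")
        .load(SOURCE_PATH)
    )
    # Rename every column in one select: back-tick the original name (which may
    # contain spaces) and alias it to the same name with spaces replaced by "_".
    return df.select(
        [F.col(f"`{c}`").alias(c.replace(" ", "_")) for c in df.columns]
    )
```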