Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Enzo_Bahrami
by New Contributor III
  • 2654 Views
  • 2 replies
  • 0 kudos

Resolved! Input File Path from Autoloader in Delta Live Tables

Hello everyone! I was wondering if there is any way to get the subdirectories in which a file resides while loading it with Auto Loader in DLT. For example: def customer(): return ( spark.readStream.format('cloudfiles') .option('clou...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Parsa Bahraminejad, we haven't heard from you since the last response from @Vigneshraja Palaniraj, and I was checking back to see if their suggestions helped you. Otherwise, if you have found a solution, please share it with the community, as it can be...

1 More Replies
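For readers landing on this thread: one common way to recover the source path when using Auto Loader inside DLT is the file metadata column. A minimal sketch, assuming a JSON source under a hypothetical /mnt/raw/customers/ path (the table and column names are illustrative, not from the original post):

import dlt
from pyspark.sql.functions import col

@dlt.table
def customer():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")       # hypothetical source format
        .load("/mnt/raw/customers/")               # hypothetical source path
        # _metadata.file_path carries the full path of the file each record
        # came from, so the subdirectory can be parsed out of it downstream.
        .withColumn("source_file", col("_metadata.file_path"))
    )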
goldentown
by New Contributor III
  • 2948 Views
  • 1 reply
  • 2 kudos

Resolved! The Jupyter notebook doesn't update imports after updating the .py file

Please help. Here's an example: I have one .py file and one .ipynb. The .py file contains the function test, but after adding a new function test1, it doesn't appear in the .ipynb, even after re-running the .py file and reimporting it in the .ipynb. How...

Latest Reply
goldentown
New Contributor III
  • 2 kudos

%load_ext autoreload
%autoreload 2

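For context, a small usage sketch of the autoreload magics above; the module and function names are hypothetical:

%load_ext autoreload
%autoreload 2

import mylib      # hypothetical .py module next to the notebook
mylib.test()      # edits saved to mylib.py, including newly added
mylib.test1()     # functions such as test1, are picked up on the next call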
andreiten
by New Contributor II
  • 3893 Views
  • 1 reply
  • 3 kudos

Is there any example or guideline on how to pass JSON parameters to a pipeline in a Databricks workflow?

I used this source https://docs.databricks.com/workflows/jobs/jobs.html#:~:text=You%20can%20use%20Run%20Now,different%20values%20for%20existing%20parameters.&text=next%20to%20Run%20Now%20and,on%20the%20type%20of%20task. But there is no example of how...

Latest Reply
UmaMahesh1
Honored Contributor III
  • 3 kudos

Hi @Andre Ten, that's exactly how you specify the JSON parameters in a Databricks workflow. I have been doing it in the same format and it works for me. I removed the parameters as they are a bit sensitive, but I hope you get the point. Cheers.

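For anyone looking for a concrete shape: a minimal sketch of triggering a job run with JSON-style parameters via the Jobs 2.1 run-now endpoint. The workspace URL, token, job ID, and parameter names are hypothetical; notebook_params is a flat string-to-string map, so nested structures are passed as serialized JSON.

import json
import requests

workspace_url = "https://<your-workspace>.cloud.databricks.com"  # hypothetical
token = "<personal-access-token>"                                 # hypothetical

payload = {
    "job_id": 123,  # hypothetical job ID
    "notebook_params": {
        # Values must be strings, so nested JSON is serialized here.
        "config": json.dumps({"env": "dev", "retries": 3}),
    },
}

resp = requests.post(
    f"{workspace_url}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
print(resp.json())

Inside the notebook task the value can then be read back with dbutils.widgets.get("config") and json.loads.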
marco_almeida
by New Contributor II
  • 1208 Views
  • 2 replies
  • 2 kudos

I can't import a library like the example

I read this article and created a notebook to use as a library, but when I tried to import it in another notebook I received this error: No module named 'lib.lib_test' No module named 'lib.lib_*****'

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hey @Marco Antônio de Almeida Fernandes, hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love...

1 More Replies
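In case it helps others hitting the same error: a minimal sketch of one way to make shared code importable, assuming the shared code is a regular .py file (not a notebook); the repo path, module, and function names below are hypothetical.

import sys

# Make the folder that contains the lib/ package importable if it is not
# already on sys.path (Repos usually add the repo root automatically).
sys.path.append("/Workspace/Repos/<user>/<repo>")

from lib.lib_test import my_function  # hypothetical module and function
my_function()

If lib_test is itself a notebook rather than a .py file, the usual pattern is %run ./lib/lib_test instead of a Python import.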
merca
by Valued Contributor II
  • 7645 Views
  • 12 replies
  • 4 kudos

Value array {{QUERY_RESULT_ROWS}} in Databricks SQL alerts custom template

Please include in the documentation an example of how to incorporate the `QUERY_RESULT_ROWS` variable in a custom template.

Latest Reply
jose_gonzalez
Moderator
  • 4 kudos

Hi @Merca Ovnerud, here is the docs link: https://docs.databricks.com/sql/user/alerts/index.html. Please let me know if this helps or if you still have more follow-up questions.

11 More Replies
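For anyone else looking for a starting point, a rough sketch of a custom alert template body; {{QUERY_RESULT_ROWS}} is the variable named in the question above, and the other variable names are illustrative and should be checked against the alerts documentation:

Subject: Alert "{{ALERT_NAME}}" is {{ALERT_STATUS}}
Body:
The alert condition was met with value {{QUERY_RESULT_VALUE}}.
Rows returned by the query:
{{QUERY_RESULT_ROWS}}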
sarvesh
by Contributor III
  • 5071 Views
  • 4 replies
  • 7 kudos

Resolved! Can we use spark-stream to read/write data from mysql? I can't find an example.

If someone can link me to an example where a stream is used to read from or write to MySQL, please do.

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 7 kudos

Writing (as a sink) is possible without problems via foreachBatch. I use it in production: the stream auto-loads CSVs from the data lake and writes each batch to SQL with foreachBatch (inside the foreachBatch function you have a temporary DataFrame with the records and just use w...

3 More Replies
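A minimal sketch of the foreachBatch pattern described above; the paths, table name, and credentials are hypothetical, and the MySQL JDBC driver must be available on the cluster.

def write_to_mysql(batch_df, batch_id):
    # Each micro-batch arrives as a normal DataFrame, so the plain JDBC
    # batch writer can be used as the sink.
    (batch_df.write
        .format("jdbc")
        .option("url", "jdbc:mysql://db-host:3306/mydb")   # hypothetical
        .option("dbtable", "customers")                    # hypothetical
        .option("user", "db_user")                         # hypothetical
        .option("password", "db_password")                 # hypothetical
        .mode("append")
        .save())

(spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "/mnt/schemas/customers")  # hypothetical
    .load("/mnt/raw/customers/")                                    # hypothetical
    .writeStream
    .foreachBatch(write_to_mysql)
    .option("checkpointLocation", "/mnt/checkpoints/customers")     # hypothetical
    .start())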