- 1449 Views
- 2 replies
- 0 kudos
Hello everyone! I was wondering if there is any way to get the subdirectories in which the file resides while loading using Autoloader with DLT. For example:
def customer(): return ( spark.readStream.format('cloudfiles') .option('clou...
Latest Reply
Hi @Parsa Bahraminejad, we haven't heard from you since the last response from @Vigneshraja Palaniraj, and I was checking back to see if her suggestions helped you. Otherwise, if you have any solution, please share it with the community, as it can be...
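The thread ends without a posted solution. As a hedged sketch of one common approach: Auto Loader exposes the source file path via the `_metadata.file_path` column, from which the subdirectory can be pulled with `regexp_extract`. Paths, table names, and options below are illustrative, and the Databricks-only part is wrapped in a function so nothing executes outside a pipeline:

```python
import re

BASE = "/mnt/landing/customer/"  # hypothetical landing directory

def subdirectory_of(file_path: str, base: str = BASE):
    """Pure-Python equivalent of the regexp used below, for local sanity checks."""
    m = re.match(re.escape(base) + r"([^/]+)/", file_path)
    return m.group(1) if m else None

def define_customer_table(dlt, spark, F):
    """On Databricks: import dlt and pyspark.sql.functions as F, then call this."""
    @dlt.table
    def customer():
        src = (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "csv")
            .load(BASE)
        )
        # _metadata.file_path holds the full source path of each record's file
        return src.withColumn(
            "subdir",
            F.regexp_extract(
                F.col("_metadata.file_path"), re.escape(BASE) + r"([^/]+)/", 1
            ),
        )

print(subdirectory_of("/mnt/landing/customer/region=emea/part-0.csv"))  # region=emea
```

The same `_metadata` column also exposes `file_name` and `file_modification_time` if more than the subdirectory is needed.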
- 1815 Views
- 1 replies
- 2 kudos
Please help. Here's an example: I have one .py file and one .ipynb, and the .py file contains the test function, but after adding the new function test1, it doesn't appear in the .ipynb, even after re-running the .py file and reimporting it in the .ipynb. How...
Latest Reply
%load_ext autoreload
%autoreload 2
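The autoreload magics automate module reloading. The underlying issue is that a plain re-import is a no-op because the module is cached in `sys.modules`; `importlib.reload` (which `%autoreload` calls for you) re-executes the source. A self-contained demonstration (module and function names here are illustrative):

```python
import importlib
import sys
import tempfile
from pathlib import Path

tmp = Path(tempfile.mkdtemp())
(tmp / "mylib.py").write_text("def test():\n    return 'v1'\n")
sys.path.insert(0, str(tmp))

import mylib
assert mylib.test() == "v1"

# Edit the file on disk (simulates adding test1 to the .py file).
(tmp / "mylib.py").write_text(
    "def test():\n    return 'v1'\n\ndef test1():\n    return 'v2'\n"
)

import mylib  # re-import is a no-op: the module is cached in sys.modules
print(hasattr(mylib, "test1"))  # False

importlib.reload(mylib)  # re-executes the module source from disk
print(hasattr(mylib, "test1"))  # True
```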
- 2435 Views
- 1 replies
- 3 kudos
I used this source: https://docs.databricks.com/workflows/jobs/jobs.html. But there is no example of how...
Latest Reply
Hi @Andre Ten, that's exactly how you specify the JSON parameters in a Databricks workflow. I have been doing it in the same format and it works for me. I removed the parameters as they are a bit sensitive, but I hope you get the point. Cheers.
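Since the actual parameters were redacted above, here is a hedged sketch of what such a payload can look like when triggering a job with "Run now with different parameters" or the Jobs API. The job id and parameter names are illustrative, and the exact key depends on the task type (e.g. `notebook_params` for a notebook task, whose values the notebook reads via `dbutils.widgets.get`):

```python
import json

payload = {
    "job_id": 123,  # illustrative job id
    "notebook_params": {
        "env": "dev",
        "run_date": "2023-01-31",
    },
}

print(json.dumps(payload, indent=2))
```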
- 808 Views
- 2 replies
- 2 kudos
I read this article and created a notebook to use as a library, but when I tried to import it in another notebook I received this error: No module named 'lib.lib_test' No module named 'lib.lib_*****'
Latest Reply
Hey @Marco Antônio de Almeida Fernandes, hope all is well! Just wanted to check in to see if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love...
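For readers hitting the same error: the two usual causes of "No module named 'lib.lib_test'" are the package's parent folder not being on `sys.path`, and a missing `__init__.py`. A minimal reproduction of the working layout (paths and the `VALUE` constant here are illustrative; on Databricks the path added would typically be the repo root, e.g. under `/Workspace/Repos/`):

```python
import sys
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
pkg = root / "lib"
pkg.mkdir()
(pkg / "__init__.py").write_text("")             # marks 'lib' as a package
(pkg / "lib_test.py").write_text("VALUE = 42\n")

sys.path.insert(0, str(root))                    # make the parent folder importable

from lib import lib_test
print(lib_test.VALUE)  # 42
```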
by merca (Valued Contributor II)
- 4674 Views
- 12 replies
- 3 kudos
Please include in the documentation an example of how to incorporate the `QUERY_RESULT_ROWS` variable in the custom template.
Latest Reply
Hi @Merca Ovnerud, here is the docs link: https://docs.databricks.com/sql/user/alerts/index.html. Please let me know if this helps or if you still have follow-up questions.
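For orientation, a hedged sketch of a custom alert body using the built-in template variables, with `QUERY_RESULT_ROWS` interpolated alongside them (this is an assumption about placement, not a documented example; how `QUERY_RESULT_ROWS` is rendered depends on the alert destination and product version):

```
Alert "{{ALERT_NAME}}" changed status to {{ALERT_STATUS}}.
Query: {{QUERY_NAME}} ({{QUERY_URL}})

Result rows:
{{QUERY_RESULT_ROWS}}
```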
- 3232 Views
- 4 replies
- 7 kudos
If someone can link me an example where a stream is used to read from or write to MySQL, please do.
Latest Reply
Writing (as a sink) is possible without problems via foreachBatch. I use it in production: a stream auto-loads CSVs from the data lake and writes each batch to SQL via foreachBatch (inside the foreachBatch function you have a temporary DataFrame with the records, and you just use w...
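The foreachBatch sink pattern described in the reply can be sketched as follows. Host, credentials, paths, and table names are placeholders, and the Spark-dependent parts are wrapped in functions so nothing executes outside a cluster (in practice, read the user and password from a secret scope rather than hard-coding them):

```python
def mysql_jdbc_url(host: str, port: int, database: str) -> str:
    """Build a MySQL JDBC URL for the Spark JDBC writer."""
    return f"jdbc:mysql://{host}:{port}/{database}"

def write_batch_to_mysql(batch_df, batch_id: int):
    """foreachBatch callback: batch_df is an ordinary (static) DataFrame."""
    (batch_df.write
        .format("jdbc")
        .option("url", mysql_jdbc_url("db-host", 3306, "analytics"))
        .option("dbtable", "customer_events")
        .option("user", "DB_USER")          # placeholder: use a secret scope
        .option("password", "DB_PASSWORD")  # placeholder: use a secret scope
        .option("driver", "com.mysql.cj.jdbc.Driver")
        .mode("append")
        .save())

def start_stream(spark):
    """On Databricks: Auto Loader CSV source with a foreachBatch MySQL sink."""
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .load("/mnt/landing/events/")  # illustrative source path
        .writeStream
        .option("checkpointLocation", "/mnt/checkpoints/events")
        .foreachBatch(write_batch_to_mysql)
        .start()
    )

print(mysql_jdbc_url("db-host", 3306, "analytics"))
```

Note that foreachBatch gives at-least-once delivery by default; if duplicates matter, deduplicate on a key inside the callback or use an idempotent upsert.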