- 3524 Views
- 4 replies
- 1 kudos
Hello everyone! We are currently facing an issue with a stream that has not been updating with new data since the 20th of July. We've validated that the bronze table has data the silver table doesn't. Also, looking at the logs, the silver stream is running but writing 0 files...
Latest Reply
Also, the trigger is configured to run once, but when we start the job it never ends; it stays stuck in an endless loop.
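A minimal sketch of a run-once bronze-to-silver stream, assuming hypothetical table paths and checkpoint location; with `trigger(availableNow=True)` (or `once=True` on older runtimes) the query should process the available backlog and stop instead of running indefinitely:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical locations; replace with the real bronze/silver paths.
bronze_path = "/mnt/lake/bronze/events"
silver_path = "/mnt/lake/silver/events"
checkpoint_path = "/mnt/lake/checkpoints/silver_events"

# Stream the bronze Delta table.
bronze_df = spark.readStream.format("delta").load(bronze_path)

# availableNow=True processes everything currently available and then stops,
# so a scheduled job run terminates rather than looping.
query = (
    bronze_df.writeStream
    .format("delta")
    .option("checkpointLocation", checkpoint_path)
    .trigger(availableNow=True)
    .start(silver_path)
)

query.awaitTermination()
```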
by rlink • New Contributor II
- 3361 Views
- 3 replies
- 2 kudos
Hi everyone, I created a Data Science & Engineering notebook in Databricks to display some visualizations and also set up a schedule for the notebook to run every hour. I can see that the scheduled run is successful every hour, but the dashboard I crea...
Latest Reply
To schedule a dashboard to refresh at a specified interval, schedule the notebook that generates the dashboard graphs. PS: Check the #DAIS2023 talks.
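A minimal sketch of that pattern, assuming a hypothetical source table: the cell below produces the chart, and scheduling the notebook as an hourly job re-runs the cell so the attached notebook dashboard shows fresh data.

```python
# Hypothetical source table; each scheduled run re-executes this cell and
# regenerates the output that the dashboard tile is pinned to.
df = spark.table("analytics.hourly_sales")

# display() renders the chart in the notebook; the dashboard reuses this
# output, so it refreshes whenever the scheduled notebook run completes.
display(df.groupBy("hour").sum("revenue"))
```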
by Mado • Valued Contributor II
- 4164 Views
- 4 replies
- 3 kudos
Hi, I have a question about DLT tables. Assume that I have a streaming DLT pipeline which reads data from a Bronze table and applies transformations to the data. The pipeline mode is triggered. If I re-run the pipeline, does it append new data to the current tabl...
Latest Reply
@Mohammad Saber: In a Databricks Delta Live Tables (DLT) pipeline, when you re-run the pipeline in "append" mode, new data will be appended to the existing table. Delta Lake provides built-in support for handling duplicates through its "upsert" functionali...
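A minimal sketch of the upsert pattern mentioned in the reply, using the Delta Lake MERGE API with a hypothetical target table and key column; rows whose key already exists are updated and only genuinely new rows are inserted, so re-runs do not create duplicates.

```python
from delta.tables import DeltaTable

# Hypothetical target table and incoming batch.
target = DeltaTable.forName(spark, "silver.orders")
updates = spark.table("bronze.orders_latest")

# MERGE on the business key: update rows that already exist, insert the rest.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```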
- 3757 Views
- 7 replies
- 0 kudos
I want to fetch new data from the Kinesis source every minute. I'm using the "minFetchPeriod" option and specified 60s, but this doesn't seem to be working. Streaming query: spark \ .readStream \ .format("kinesis") \ .option("streamName", kinesis_stream_...
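For reference, a minimal sketch of a Kinesis reader with minFetchPeriod set, using hypothetical stream and region values; the option names follow the Databricks Kinesis connector.

```python
# Hypothetical stream name and region; minFetchPeriod controls how long a
# consumer waits before the next fetch, so "60s" aims at roughly one fetch
# per minute per shard.
kinesis_df = (
    spark.readStream
    .format("kinesis")
    .option("streamName", "my-kinesis-stream")
    .option("region", "us-east-1")
    .option("initialPosition", "latest")
    .option("minFetchPeriod", "60s")
    .load()
)
```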
Latest Reply
Hi @Pranathi Girish, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedb...
- 10710 Views
- 0 replies
- 0 kudos
I have a Delta Lake in ADLS that I sink data into through Spark Structured Streaming. We usually append new data from our data source to our Delta Lake, but there are some cases when we find errors in the data and need to reprocess everything. So what ...
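A minimal sketch of the append-only write described above, with hypothetical ADLS paths; because append mode never rewrites existing files, fixing bad historical data would need a separate step outside this stream.

```python
# Hypothetical ADLS paths.
source_path = "abfss://raw@myaccount.dfs.core.windows.net/events"
delta_path = "abfss://lake@myaccount.dfs.core.windows.net/delta/events"
checkpoint = "abfss://lake@myaccount.dfs.core.windows.net/checkpoints/events"

# Usual flow: stream new records from the source (assumed here to be a Delta
# location) and append them to the Delta table in ADLS.
(
    spark.readStream.format("delta").load(source_path)
    .writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", checkpoint)
    .start(delta_path)
)
```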