Tip: These steps are built out for AWS accounts and workspaces that are using Delta Lake. If you would like to learn more, watch this video and reach out to your Databricks sales representative for more information. Step 1: Create your own notebook or ...
by BigJay • New Contributor II
If I run some code, say for an ETL process that migrates data from bronze to silver storage, each cell reports num_affected_rows in a table format when it executes. I want to capture that and log it with my logger. Is it stored in a variable or syslogged som...
Latest Reply
Good one, Dan! I never thought of using the Delta API for this, but there you go.
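To make the Delta API approach concrete: on Databricks, `DeltaTable.forName(spark, "silver").history(1)` returns the latest commit, whose `operationMetrics` map carries row counts as strings. The sketch below only shows the parsing-and-logging step on a hypothetical history row (the dict shape and metric names are assumptions modeled on Delta's metrics), so it runs without a Spark cluster.

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("etl")

# Hypothetical row shaped like a Delta Lake history entry;
# operationMetrics values arrive as strings in the real API.
history_row = {
    "operation": "MERGE",
    "operationMetrics": {
        "numTargetRowsUpdated": "42",
        "numTargetRowsInserted": "7",
    },
}

def log_affected_rows(row):
    """Sum the row-count metrics of one commit and log the total."""
    metrics = row.get("operationMetrics", {})
    affected = sum(int(v) for k, v in metrics.items() if "Rows" in k)
    logger.info("%s affected %d rows", row["operation"], affected)
    return affected

log_affected_rows(history_row)  # logs: MERGE affected 49 rows
```

In a real notebook you would feed `history(1).collect()[0].asDict()` (or similar) into a helper like this instead of the hard-coded dict.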
I am a little confused about what to choose between Structured Streaming (trigger once) and ETL batch jobs. Can I get help here on what basis I should make my decision?
Latest Reply
In Structured Streaming, triggers are used to specify how often a streaming query should produce results. A RunOnce trigger will fire only once and then stop the query, effectively running it like a batch job. Now, if your source data is a strea...
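The reply above can be illustrated with a small pure-Python sketch (this is not the Spark API; all names here are hypothetical) of why a trigger-once streaming query behaves like an incremental batch job: the checkpoint remembers which inputs were already processed, so each scheduled run picks up only new data, whereas a plain batch job has no such memory unless you build it yourself.

```python
# Toy model of trigger-once semantics: the "checkpoint" set stands in for
# Structured Streaming's checkpoint directory.
def run_trigger_once(source_files, checkpoint):
    """Process only files not yet seen, then record them in the checkpoint."""
    new_files = [f for f in source_files if f not in checkpoint]
    checkpoint.update(new_files)
    return new_files

checkpoint = set()
# First run: everything in the source is new.
print(run_trigger_once(["a.json", "b.json"], checkpoint))
# Second run: only the file that arrived since the last run is processed.
print(run_trigger_once(["a.json", "b.json", "c.json"], checkpoint))
```

In real PySpark the equivalent would be a `writeStream` with `.trigger(once=True)` (or `availableNow=True` on newer runtimes) and a `checkpointLocation` option; the incremental bookkeeping shown above is what the checkpoint gives you for free.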