Data Factory logs into Databricks Delta table
07-18-2024 08:05 AM
Hi Databricks Community,
I am looking for a way to integrate Azure Data Factory pipeline logs with Databricks efficiently and at minimal cost. Currently, I have a dashboard that consumes data from a Delta table, and I would like to augment this table with logs from the Data Factory pipelines whose names start with "pip_copy".
My goal is more comprehensive, near real-time monitoring. To do this, I need to send these logs to Databricks, which will then insert them into the Delta table and dynamically update the dashboard; the sketch below shows the kind of append I have in mind.
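This is only a sketch: the landing path, the column names, and the target table are placeholders, and I am assuming the logs arrive as JSON with the usual ADF pipeline-run fields.

```python
from pyspark.sql import functions as F

# Placeholder landing location for exported Data Factory pipeline-run logs.
logs = spark.read.json("/mnt/adf-logs/pipelineruns/")

# Keep only runs of the copy pipelines; field names assume the ADF pipeline-run schema.
copy_runs = (
    logs
    .filter(F.col("pipelineName").startswith("pip_copy"))
    .select("pipelineName", "runId", "status", "runStart", "runEnd")
)

# Append to the Delta table that already feeds the dashboard (placeholder name).
copy_runs.write.format("delta").mode("append").saveAsTable("monitoring.pipeline_runs")
```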
Could anyone suggest an efficient approach to achieve this? I am looking for a solution that is both low-latency and cost-effective.
Thank you in advance for your help!
07-19-2024 09:55 AM
Hello, could you tell me how Databricks would be able to automatically retrieve events from the Event Hub without the cluster needing to be always on? If you could also share step-by-step screenshots of the Azure and Databricks configuration, that would help me a lot.
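From what I have read, a Structured Streaming job with an availableNow trigger, run on a schedule as a Databricks job, might avoid keeping a cluster on permanently, but I am not sure this is the right setup. A rough sketch of what I mean, reading the Event Hub through its Kafka-compatible endpoint (the namespace, hub name, secret scope, checkpoint path, and table name are all placeholders):

```python
from pyspark.sql import functions as F

# Placeholder names; the connection string is kept in a Databricks secret scope.
EH_NAMESPACE = "my-eh-namespace"
EH_NAME = "adf-pipeline-logs"
EH_CONN_STR = dbutils.secrets.get("my-scope", "eh-connection-string")

kafka_options = {
    "kafka.bootstrap.servers": f"{EH_NAMESPACE}.servicebus.windows.net:9093",
    "subscribe": EH_NAME,
    "kafka.security.protocol": "SASL_SSL",
    "kafka.sasl.mechanism": "PLAIN",
    # On Databricks the Kafka client is shaded, hence the kafkashaded prefix.
    "kafka.sasl.jaas.config": (
        "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="$ConnectionString" password="{EH_CONN_STR}";'
    ),
}

# Read whatever events have accumulated on the Event Hub.
raw = spark.readStream.format("kafka").options(**kafka_options).load()

# Diagnostic-log payloads arrive as JSON in the Kafka value column.
events = raw.select(F.col("value").cast("string").alias("body"))

# availableNow processes the backlog and then stops, so the cluster only runs
# for the duration of the scheduled job instead of staying on permanently.
(
    events.writeStream
    .option("checkpointLocation", "/mnt/adf-logs/_checkpoints/eventhub")
    .trigger(availableNow=True)
    .toTable("monitoring.adf_pipeline_logs")
)
```

The job could then be scheduled every few minutes with Databricks Workflows, which I assume keeps latency reasonably low without a permanently running cluster.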