Hi Databricks Community,
I am looking for a way to integrate Azure Data Factory pipeline logs with Databricks efficiently and at minimal cost. Currently, I have a dashboard that consumes data from a Delta table, and I would like to augment this table with logs from Data Factory pipelines whose names start with "pip_copy".
My goal is more comprehensive, near real-time monitoring. To get there, I need to send these logs to Databricks, insert them into the Delta table, and have the dashboard update dynamically.
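For context, here is a minimal sketch of the kind of ingestion I have in mind, assuming the factory's "PipelineRuns" diagnostic logs are routed to an Event Hub via an Azure Monitor diagnostic setting and read through the hub's Kafka-compatible endpoint. The namespace, hub name, secret scope, checkpoint path, target table, and the record fields are all placeholders or my best guess at the log schema, not something I have verified:

```python
# Sketch: stream ADF diagnostic logs from an Event Hub (Kafka endpoint) into a
# Delta table on Databricks. `spark` and `dbutils` are the objects provided by
# a Databricks notebook; everything in ALL CAPS is a placeholder.
from pyspark.sql import functions as F
from pyspark.sql.types import ArrayType, StringType, StructField, StructType

EH_NAMESPACE = "my-namespace"   # placeholder Event Hubs namespace
EH_NAME = "adf-logs"            # placeholder event hub (Kafka topic) name
EH_CONN_STR = dbutils.secrets.get("my-scope", "eh-connection-string")  # placeholder secret

# Azure Monitor wraps each batch of log records in a {"records": [...]} envelope;
# the field names below follow the ADF PipelineRuns schema as I understand it.
record_schema = StructType([
    StructField("pipelineName", StringType()),
    StructField("runId", StringType()),
    StructField("status", StringType()),
    StructField("start", StringType()),
    StructField("end", StringType()),
])
envelope_schema = StructType([StructField("records", ArrayType(record_schema))])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", f"{EH_NAMESPACE}.servicebus.windows.net:9093")
    .option("subscribe", EH_NAME)
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="$ConnectionString" password="{EH_CONN_STR}";',
    )
    .load()
)

# Unpack the envelope, keep only runs from pipelines named "pip_copy*",
# and append them to the Delta table backing the dashboard.
pipeline_runs = (
    raw.select(F.from_json(F.col("value").cast("string"), envelope_schema).alias("body"))
    .select(F.explode("body.records").alias("r"))
    .select("r.*")
    .where(F.col("pipelineName").startswith("pip_copy"))
)

(
    pipeline_runs.writeStream
    .option("checkpointLocation", "/tmp/checkpoints/adf_logs")  # placeholder path
    .toTable("monitoring.adf_pipeline_runs")                    # placeholder table
)
```

My concern with something like this is that a continuously running stream keeps a cluster up, which works against the low-cost goal, so I am open to entirely different approaches as well.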
Could anyone suggest an efficient approach to achieve this? I am looking for a solution that is both low-latency and cost-effective.
Thank you in advance for your help!