@Mohammad Saber:
Here's an overview of how you can set up a pipeline to send cluster metrics from Databricks to Power BI:
Configure the Databricks cluster to send its logs and metrics to an Azure Event Hub or an Azure Log Analytics workspace. The Databricks documentation describes how to enable diagnostic logging and route it to either destination.
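For custom metrics that the built-in log delivery does not cover, a notebook or job on the cluster can also push events to the Event Hub itself. Below is a minimal sketch using the azure-eventhub SDK; the connection string, hub name, and JSON field names (clusterId, metric, value, timestamp) are placeholder assumptions for illustration, not values from your workspace.

```python
import json
import time

# Placeholder Event Hub details -- substitute your own namespace, key, and hub name.
CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>"
EVENT_HUB_NAME = "databricks-metrics"  # hypothetical hub name

def format_metric_event(cluster_id: str, metric: str, value: float, ts: float) -> str:
    """Serialize one metric sample as the JSON payload the downstream job will parse."""
    return json.dumps({
        "clusterId": cluster_id,
        "metric": metric,
        "value": value,
        "timestamp": ts,
    })

def send_cluster_metric(cluster_id: str, metric: str, value: float) -> None:
    """Send a single metric event to the Event Hub."""
    # Imported here so the pure helper above works even without the SDK installed.
    from azure.eventhub import EventHubProducerClient, EventData  # pip install azure-eventhub

    producer = EventHubProducerClient.from_connection_string(
        conn_str=CONNECTION_STR, eventhub_name=EVENT_HUB_NAME
    )
    with producer:
        batch = producer.create_batch()
        batch.add(EventData(format_metric_event(cluster_id, metric, value, time.time())))
        producer.send_batch(batch)
```

A call such as `send_cluster_metric("0123-456789-abcd", "cpu_percent", 72.5)` would then emit one event per metric sample; in practice you would collect the values from the Spark REST API or the driver's system metrics.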
Create an Azure Stream Analytics job that reads the events from the Event Hub, transforms the data as needed, and sends the results to Power BI. (Note that Stream Analytics cannot use a Log Analytics workspace as an input, so choose the Event Hub route if you want this streaming path.) In the Azure portal you create the job and define its input, query, and output; the Azure Stream Analytics documentation walks through each step.
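For reference, the job's query might look like the sketch below. The input alias `databricks-metrics`, the output alias `powerbi-dataset`, and the field names are assumptions matching a hypothetical event payload, not names from your environment.

```sql
-- Hypothetical query: average each metric per cluster over a 1-minute tumbling window
SELECT
    clusterId,
    metric,
    AVG(value) AS avgValue,
    System.Timestamp() AS windowEnd
INTO [powerbi-dataset]
FROM [databricks-metrics] TIMESTAMP BY EventEnqueuedUtcTime
GROUP BY clusterId, metric, TumblingWindow(minute, 1)
```

Aggregating in the query keeps the event rate into Power BI low, which matters because streaming datasets have throughput limits.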
In Power BI, create a report that connects to the dataset produced by the Stream Analytics output and shows the desired metrics. You can build it in Power BI Desktop or directly in the Power BI service; the Power BI documentation covers report creation.
Once the pipeline is set up, data flows continuously from Databricks to Power BI, and you can monitor the cluster metrics in near real time.