Configuring DLT _delta_logs with Log Analytics Workspace on Job Clusters
12-18-2024 05:15 AM
Hi,
How do I send my DLT (Delta Live Tables) pipeline's _delta_logs to my Azure Log Analytics workspace? I'm running into issues because the pipeline runs on a job cluster, which doesn't let me specify a destination for the log files under Advanced Options when creating the cluster.
Additionally, when running the pipeline on the job cluster, I get errors with commands such as:
dbutils.secrets.get
This function is meant to retrieve the secrets used to configure logging. How can I make sure that my Log Analytics workspace ID and primary key, which are stored in my Azure Key Vault and exposed through a Databricks secret scope, are configured correctly?
Thank you!
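For context, once the workspace ID and primary key can be read from the secret scope, log records can be pushed to Log Analytics through its HTTP Data Collector API. A minimal sketch of the request signing, assuming hypothetical secret scope and key names:

```python
import base64
import hashlib
import hmac

# Assumption: in a Databricks notebook, the credentials would come from a
# Key Vault-backed secret scope (scope/key names below are hypothetical):
#   workspace_id = dbutils.secrets.get(scope="kv-scope", key="law-workspace-id")
#   shared_key   = dbutils.secrets.get(scope="kv-scope", key="law-primary-key")

def build_signature(shared_key: str, date: str, content_length: int) -> str:
    """Build the SharedKey signature for the Log Analytics Data Collector API.

    shared_key is the base64-encoded primary key of the workspace;
    date is an RFC 1123 timestamp, e.g. 'Mon, 16 Dec 2024 10:00:00 GMT'.
    """
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(shared_key),
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return base64.b64encode(digest).decode("utf-8")

def auth_header(workspace_id: str, shared_key: str, date: str, content_length: int) -> str:
    """Value for the Authorization header expected by the API."""
    return f"SharedKey {workspace_id}:{build_signature(shared_key, date, content_length)}"
```

The signed request is then POSTed to `https://<workspace-id>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01` with `x-ms-date` and a `Log-Type` header naming the custom log table.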
12-18-2024 05:26 AM
Hi @mkEngineer,
Since we need to make sure the secrets are being retrieved from Azure Key Vault, what error are you getting when you run the dbutils.secrets.get command?
12-19-2024 04:40 AM
Hi @Alberto_Umana ,
The error I received was related to cells not being connected to the DLT pipeline, as mentioned in my other post, "Cannot run a cell when connected to the pipeline Databricks." After browsing the web, I realized that some libraries had to be installed and configured in order to set up logging on a pipeline/job cluster. However, I'm unsure how to configure those libraries when editing the job compute: there is no "Libraries" tab on job compute like there is for all-purpose compute, for example. I'm currently comparing similar functions within Databricks because of the hassle of configuring Log Analytics.
https://learn.microsoft.com/en-us/azure/architecture/databricks-monitoring/application-logs
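Since DLT job clusters have no Libraries tab, notebook-scoped libraries are usually installed with %pip at the top of the pipeline notebook, while cluster-level setup, such as the monitoring init script from that guide, goes into the pipeline's JSON settings. A rough sketch, where the init script path, secret scope/key names, and environment variable names are all assumptions, not confirmed values:

```json
{
  "clusters": [
    {
      "label": "default",
      "init_scripts": [
        {
          "workspace": {
            "destination": "/Shared/init-scripts/spark-monitoring-init.sh"
          }
        }
      ],
      "spark_env_vars": {
        "LOG_ANALYTICS_WORKSPACE_ID": "{{secrets/kv-scope/law-workspace-id}}",
        "LOG_ANALYTICS_WORKSPACE_KEY": "{{secrets/kv-scope/law-primary-key}}"
      }
    }
  ]
}
```

The `{{secrets/<scope>/<key>}}` syntax lets the cluster resolve the Key Vault-backed secrets at startup instead of hardcoding them in the pipeline definition.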

