"message": " File <command-68719476741>, line 10\n log_analytics_pkey = dbutils.secrets.get(scope=\"ScopeLogAnalyticsPKey\", key=\"LogAnalyticsPKey\")\n ^\nSyntaxError: invalid syntax\n", "error_class": "_UNCLASSIFIED_PYTHON_COMMAND_ERROR"
It seems odd that this configuration has to be handled at the command-line level. Could you guide me further on how to set it up, given that it doesn't work in the notebook? Specifically, is there a way to configure the secrets directly in the pipeline's JSON settings or in the DLT UI Advanced Configuration?
"message": " File <command-68719476741>, line 10\n log_analytics_pkey = dbutils.secrets.get(scope=\"ScopeLogAnalyticsPKey\", key=\"LogAnalyticsPKey\")\n ^\nSyntaxError: invalid syntax\n", "error_class": "_UNCLASSIFIED_PYTHON_COMMAND_ERROR"
For example, could I pass my two secrets (the Log Analytics Workspace ID and the Log Analytics Primary Key, both stored in Key Vault) as key-value pairs under Advanced Configuration? How does the scope I just created come into play here? Or is that section only for secrets created in a secret scope via the CLI?
Simply put, can I use the Advanced Configuration (Key-Value pairs) to set these secrets and avoid reliance on code-based retrieval?
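To make the question concrete, this is roughly the setup I have in mind. I don't know whether secret references ({{secrets/scope/key}}) are actually resolved inside the pipeline's configuration block, so both the key names and the substitution syntax below are just my assumptions:

"configuration": {
  "mypipeline.log_analytics_workspace_id": "{{secrets/ScopeLogAnalyticsPKey/LogAnalyticsWorkspaceId}}",
  "mypipeline.log_analytics_primary_key": "{{secrets/ScopeLogAnalyticsPKey/LogAnalyticsPKey}}"
}

and then, in the notebook, reading the values back as Spark conf entries instead of calling dbutils.secrets.get:

# Read the key-value pairs defined under the pipeline's Advanced Configuration
# (the "configuration" block in the pipeline JSON). The "mypipeline.*" names
# are placeholders I made up for this example.
log_analytics_wsid = spark.conf.get("mypipeline.log_analytics_workspace_id")
log_analytics_pkey = spark.conf.get("mypipeline.log_analytics_primary_key")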
On another note, how can I verify that cluster logging is actually enabled? Besides checking the Logs and Metrics tabs in the DLT pipeline UI under Compute/Clusters, is there another way to confirm that logs and metrics are being captured? Those tabs are visible, but that alone doesn't seem enough to confirm that logging is working.
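For example, would something along these lines be a valid confirmation, assuming the pipeline cluster's log destination is set to a DBFS path (the root path below is just where I would point it)?

# List the configured cluster log destination to see whether driver/ and
# eventlog/ folders are actually being written. "dbfs:/cluster-logs" is an
# assumption: whatever destination is set under Logging / cluster_log_conf.
log_root = "dbfs:/cluster-logs"
for cluster_dir in dbutils.fs.ls(log_root):
    print(cluster_dir.path)
    for sub in dbutils.fs.ls(cluster_dir.path):
        print("  ", sub.path)  # expecting driver/, eventlog/, etc. if delivery works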
Also, I noticed in the Compute page under Advanced Options the setting:
"When a user runs a command on a cluster with Credential Passthrough enabled, that user's Azure Active Directory credentials will be automatically passed through to Spark, allowing them to access data in Azure Data Lake Storage Gen1 and Gen2 without having to manually specify their credentials."
However, I'm unable to change the destination from "None." Could this be related to Unity Catalog being enabled? If so, does Unity Catalog impose restrictions on credential passthrough or on how secrets are managed with Azure Key Vault?
Lastly, when running the notebooks for DLT, I noticed that a fourth tab briefly appears at the bottom of the page (next to DLT Graph, DLT Event Log, and DLT Query History) called Pipeline Logs, but it disappears after about a second.
I suspect my Azure Monitor setup is mostly correct, but the logs don't seem to be reaching the Log Analytics Workspace. Can you confirm whether the route for logs needs to be set explicitly somewhere else, or whether there's an issue with the configuration itself?
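In case it matters, the routing I'm relying on boils down to the Azure Monitor HTTP Data Collector API pattern, i.e. posting JSON to the workspace with a signature built from the primary key. A minimal connectivity check along these lines is what I would run to confirm the workspace ID and key are at least valid (the custom log type and key names are placeholders of mine):

import base64
import datetime
import hashlib
import hmac
import json
import requests

def post_to_log_analytics(workspace_id, primary_key, log_type, records):
    # Build the HMAC-SHA256 "SharedKey" signature used by the Data Collector API.
    body = json.dumps(records)
    rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    string_to_sign = (
        f"POST\n{len(body)}\napplication/json\n"
        f"x-ms-date:{rfc1123_date}\n/api/logs"
    )
    signature = base64.b64encode(
        hmac.new(
            base64.b64decode(primary_key),
            string_to_sign.encode("utf-8"),
            hashlib.sha256,
        ).digest()
    ).decode("utf-8")

    uri = f"https://{workspace_id}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"SharedKey {workspace_id}:{signature}",
        "Log-Type": log_type,       # custom table name, shows up as <log_type>_CL
        "x-ms-date": rfc1123_date,
    }
    response = requests.post(uri, data=body, headers=headers)
    return response.status_code     # 200 means the record was accepted

# Secret key names below are my own naming; "LogAnalyticsWorkspaceId" is a placeholder.
wsid = dbutils.secrets.get(scope="ScopeLogAnalyticsPKey", key="LogAnalyticsWorkspaceId")
pkey = dbutils.secrets.get(scope="ScopeLogAnalyticsPKey", key="LogAnalyticsPKey")
print(post_to_log_analytics(wsid, pkey, "DLTPipelineTest", [{"message": "connectivity check"}]))

If this returns 200 but nothing shows up in the workspace, I would assume the problem is in how the pipeline routes its logs rather than in the credentials themselves.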
Thanks again for your help!