02-22-2023 03:09 PM
Hi,
I have two questions about cluster logs.
Q1) It seems that I can only store cluster logs on DBFS. Is there any way to send the cluster logs to an Azure storage account?
Note that the workspace is not enabled for Unity Catalog (and my client does not intend to enable it).
Q2) We can query tables in notebooks running on the cluster (as opposed to the query editor in the SQL persona).
I am interested in table usage information, i.e. which user queried a specific table, and at what time.
Do the cluster logs contain table usage information?
Labels: Cluster logs, Databricks Table Usage
Accepted Solutions
02-23-2023 01:33 AM
Q1: If you mount your storage account on Databricks, it is available through DBFS (dbfs:/mnt/...), so cluster logs written to that path land in the storage account.
Q2: I think you need to enable the audit logs for that, although I am not sure.
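As a sketch of the Q1 suggestion: after mounting the storage account (for example with `dbutils.fs.mount` in a notebook; the container name and mount point below are hypothetical), you would point the cluster's log delivery at the mounted path. In the cluster's JSON configuration (Clusters API or the "Logging" tab in the cluster UI), the destination might look like this:

```json
{
  "cluster_log_conf": {
    "dbfs": {
      "destination": "dbfs:/mnt/clusterlogs"
    }
  }
}
```

Here `dbfs:/mnt/clusterlogs` is assumed to be a mount backed by your Azure storage container, so the driver and executor logs Databricks delivers there end up in the storage account.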
04-23-2023 10:31 PM
Hi @Mohammad Saber
Hope everything is going great.
Just wanted to check in on whether you were able to resolve your issue. If so, would you mind marking an answer as best so that other members can find the solution more quickly? If not, please let us know so we can help you further.
Cheers!