- 11356 Views
- 13 replies
- 35 kudos
I have a Databricks job running in Azure Databricks, and a similar job running in Databricks on GCP. I would like to compare the cost. If I assign a custom tag to the job cluster running in Azure Databricks, I can see the cost incurred by that job i...
Latest Reply
In Azure, you can use Cost Management to track the expenses incurred by your Databricks instances.
- 4899 Views
- 2 replies
- 4 kudos
Hi, is there a way to find out or monitor which users have used my cluster, for how long, and how many times, in an Azure Databricks workspace?
Latest Reply
Hello, you can activate audit logs (more specifically, cluster logs): https://learn.microsoft.com/en-us/azure/databricks/administration-guide/account-settings/azure-diagnostic-logs They can be very helpful for tracking all of these metrics.
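Once diagnostic logs are flowing, the per-user cluster activity asked about above can be tallied from the exported records. The sketch below assumes records shaped like the Azure Databricks diagnostic log schema (`category`, `operationName`, and an `identity` object with an `email`); check the field names against your actual export before relying on it.

```python
# Sketch: tally cluster-related events per user from exported Azure
# diagnostic log records. Field names (category, operationName, identity)
# are assumptions based on the diagnostic log schema; verify against a
# real export, where `identity` may arrive as a JSON string.
from collections import Counter

def count_cluster_events(records):
    """Count events in the 'clusters' log category, grouped by user email."""
    counts = Counter()
    for rec in records:
        if rec.get("category") == "clusters":
            user = rec.get("identity", {}).get("email", "unknown")
            counts[user] += 1
    return counts

# Illustrative records, not real log output.
sample = [
    {"category": "clusters",
     "operationName": "Microsoft.Databricks/clusters/start",
     "identity": {"email": "alice@example.com"}},
    {"category": "jobs",
     "identity": {"email": "bob@example.com"}},
    {"category": "clusters",
     "operationName": "Microsoft.Databricks/clusters/resize",
     "identity": {"email": "alice@example.com"}},
]
print(count_cluster_events(sample))
```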
by AP • New Contributor III
- 4604 Views
- 5 replies
- 3 kudos
So Databricks gives us a great toolkit in the form of OPTIMIZE and VACUUM. But in terms of operationalizing them, I am really confused about the best practice. Should we enable "optimized writes" by setting the following at a workspace level? spark.conf.set...
Latest Reply
@AKSHAY PALLERLA Just checking in to see if you got a solution to the issue you shared above. Let us know! Thanks to @Werner Stinckens for jumping in, as always!
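For readers landing on this thread: the session-level confs behind "optimized writes" can be sketched as below. This is not official guidance, just the current Databricks conf names; `spark` is assumed to be an active SparkSession on a Databricks cluster, and setting them in a session affects only that session (a cluster-wide default would go in the cluster's Spark config instead).

```python
# Sketch, not official guidance: session-level confs for Delta optimized
# writes and auto compaction on Databricks.
OPTIMIZE_WRITE_CONFS = {
    "spark.databricks.delta.optimizeWrite.enabled": "true",
    "spark.databricks.delta.autoCompact.enabled": "true",
}

def apply_confs(spark, confs=OPTIMIZE_WRITE_CONFS):
    # Apply each conf to the current SparkSession only.
    for key, value in confs.items():
        spark.conf.set(key, value)
```

Alternatively, optimized writes can be enabled per table via the `delta.autoOptimize.optimizeWrite` table property, which avoids changing behavior for every workload at once.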
- 2211 Views
- 1 replies
- 0 kudos
What timezone is the “timestamp” value in the Databricks Usage log? Is it UTC?
timestamp: 2020-12-01T00:59:59.000Z
I need to match this to the AWS Cost Explorer timezone for simplicity.
It's UTC. Please see "timestamp" under the Audit Log Schema: https://docs.databrick...
- 1680 Views
- 1 replies
- 1 kudos
Why do we need the ec2:CreateTags and ec2:DeleteTags permissions? Are they required? Are EC2 tags used internally as well?
Latest Reply
Yes, they are required; that is how Databricks tracks and tags resources. The tags identify the owner of clusters on the AWS side, and Databricks uses the tag information internally as well.
- 1296 Views
- 1 replies
- 0 kudos
If users are allowed to create clusters, how can an operations team prevent them from consuming excessive costs?
Latest Reply
Cluster policies can be used to constrain the node types available to users when creating clusters, the number of nodes they can use, and the maximum DBU consumption they are allowed. The following resources provide further information:...
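To make the reply concrete, a policy combining those three constraints might look like the sketch below. The attribute paths follow the documented cluster-policy definition format, but the specific node types and limits are illustrative assumptions, not recommendations; the dict is printed as the JSON you would paste into the policy editor.

```python
# Sketch of a cluster policy definition; values are illustrative only.
import json

POLICY = {
    # Restrict which instance types users may pick.
    "node_type_id": {
        "type": "allowlist",
        "values": ["Standard_DS3_v2", "Standard_DS4_v2"],
    },
    # Cap autoscaling at 10 workers.
    "autoscale.max_workers": {"type": "range", "maxValue": 10},
    # Cap the cluster's hourly DBU consumption.
    "dbus_per_hour": {"type": "range", "maxValue": 50},
}

print(json.dumps(POLICY, indent=2))  # paste into the cluster-policy JSON editor
```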