
Building a Custom Usage Dashboard using APIs for Job-Level Cost Insights

Sase
New Contributor II

Databricks does not provide an out-of-the-box cost breakdown for individual components such as jobs or compute, so we aim to build a custom usage dashboard on top of the APIs that shows the cost of each job run, whether it is orchestrated from Databricks, Azure Data Factory (ADF), or runs on serverless compute. If anyone has experience with a similar use case or has implemented such a solution, your insights would be greatly appreciated!

5 REPLIES

Isi
New Contributor II

Hey! 

Databricks recently introduced system tables that provide job cost analysis, which might help achieve your goal without building a custom solution from scratch. 

These tables give you insight into job-run costs within Databricks, and you may be able to correlate them with external orchestrators such as Azure Data Factory, or with serverless workloads, by combining job metadata with the billing data.

You can check out the official documentation here: Databricks System Tables - Jobs Cost
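
To give a sense of what those tables support, here is a minimal sketch of a per-job cost query (run in a Databricks notebook; it assumes the system billing schema is enabled and that you can read system.billing.usage and system.billing.list_prices):

```python
# Minimal sketch: estimated per-job list cost from Databricks system tables.
# Assumes access to system.billing.usage and system.billing.list_prices;
# column names follow the documented billing schema.
job_cost_df = spark.sql("""
    SELECT
        u.usage_metadata.job_id                    AS job_id,
        u.usage_date,
        SUM(u.usage_quantity)                      AS dbus,
        SUM(u.usage_quantity * lp.pricing.default) AS est_list_cost
    FROM system.billing.usage u
    JOIN system.billing.list_prices lp
        ON  u.sku_name = lp.sku_name
        AND u.usage_start_time >= lp.price_start_time
        AND (lp.price_end_time IS NULL
             OR u.usage_start_time < lp.price_end_time)
    WHERE u.usage_metadata.job_id IS NOT NULL
    GROUP BY u.usage_metadata.job_id, u.usage_date
    ORDER BY est_list_cost DESC
""")
display(job_cost_df)  # `display` is available in Databricks notebooks
```

Note this uses list prices rather than any negotiated rate, so treat the result as an estimate.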

I hope you find this helpful. 🙂

Sase
New Contributor II

Hey, thanks for sharing the link! I did check out the system tables in the official documentation, but they don't seem to include the cost of jobs running on SQL warehouses and all-purpose compute. I'm working on a dashboard to get a complete view of all costs, including those.

Isi
New Contributor II

Hey!

You're right; the system tables might not provide a full breakdown of costs. The approach you take to track costs also depends on the type of cluster you're using.

If you're using a Classic or Pro SQL warehouse, or all-purpose compute, you'll likely need to build a custom dashboard, for example in Grafana. It can pull the compute (VM) costs directly from your cloud provider (AWS, Azure, GCP) and combine them with Databricks-specific costs such as DBU usage and storage; a rough sketch of that correlation follows.
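
As an illustration, the snippet below joins Databricks DBU usage with a cloud billing export by cluster ID. The table name azure_cost_export and its usage_date/vm_cost columns are hypothetical placeholders for however you load your provider's export; the join relies on the ClusterId tag that Databricks applies to the VMs it launches.

```python
# Sketch: correlate cloud VM cost with Databricks DBU usage per cluster.
# `azure_cost_export` is a hypothetical table holding your provider's
# billing export, with a map-typed `tags` column plus `usage_date` and
# `vm_cost` columns; adjust to your actual export schema.
combined_df = spark.sql("""
    WITH dbu AS (
        SELECT usage_metadata.cluster_id AS cluster_id,
               usage_date,
               SUM(usage_quantity) AS dbus
        FROM system.billing.usage
        WHERE usage_metadata.cluster_id IS NOT NULL
        GROUP BY usage_metadata.cluster_id, usage_date
    ),
    vm AS (
        SELECT tags['ClusterId'] AS cluster_id,
               usage_date,
               SUM(vm_cost) AS vm_cost
        FROM azure_cost_export
        GROUP BY tags['ClusterId'], usage_date
    )
    SELECT d.cluster_id, d.usage_date, d.dbus, v.vm_cost
    FROM dbu d
    LEFT JOIN vm v
        ON v.cluster_id = d.cluster_id
       AND v.usage_date = d.usage_date
""")
combined_df.show()
```

Grafana can then read the resulting table through a SQL data source.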

Hope that helps! 🙂

Sase
New Contributor II

Thanks. Any recommendations on how Grafana can retrieve job, serverless, and SQL warehouse cost data from Azure? Are there specific APIs or other mechanisms available for extracting cost information?

Isi
New Contributor II

Hey,

Yes. I'm not an Azure expert, but the Databricks REST API can help you extract usage data for serverless resources, allowing you to integrate this information into custom dashboards or external tools like Grafana; a small sketch follows.
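
For example, here is a minimal sketch that lists recent job runs through the Jobs API so their durations can be joined against billing data (the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables are assumptions for this example):

```python
# Sketch: list recent completed job runs via the Databricks Jobs API 2.1,
# e.g. to correlate run durations with cost data in Grafana.
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-123.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # a personal access token

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {token}"},
    params={"completed_only": "true", "limit": 25},
)
resp.raise_for_status()

for run in resp.json().get("runs", []):
    # start_time / end_time are epoch milliseconds.
    duration_min = (run["end_time"] - run["start_time"]) / 60_000
    print(run["job_id"], run["run_id"], f"{duration_min:.1f} min")
```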

On the Azure side, costs related to Databricks will appear under the Databricks resource category in Azure Cost Management + Billing. However, Azure does not break down Databricks-managed workloads (such as serverless SQL warehouses) into specific line items, because Databricks abstracts the underlying infrastructure; you'll see an aggregate charge for Databricks usage.

You can configure Azure Cost Management to export aggregated Databricks costs to a storage account or a Log Analytics workspace. That exported data can then be combined with usage data from the Databricks API to build a more detailed view of costs; a sketch of reading such an export is below.
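
For instance, if the export lands in ADLS as CSV, something like this could pull the Databricks-related rows back into Spark (the path and column names are illustrative; actual columns depend on the export version you configure):

```python
# Sketch: read an Azure Cost Management export from ADLS and keep the
# Databricks-related rows for joining with Databricks usage data.
from pyspark.sql import functions as F

# Illustrative path; point this at your configured export location.
export_path = "abfss://exports@<storage-account>.dfs.core.windows.net/databricks-costs/*.csv"

cost_df = (
    spark.read.option("header", "true").csv(export_path)
    # Column names vary by export version; these are examples.
    .where(F.lower(F.col("ConsumedService")).contains("databricks"))
    .withColumn("cost", F.col("CostInBillingCurrency").cast("double"))
)
cost_df.groupBy("Date").agg(F.sum("cost").alias("daily_cost")).show()
```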

🙂
