Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How do I compare cost between Databricks on GCP and Azure Databricks?

Tahseen0354
Valued Contributor

I have a Databricks job running in Azure Databricks, and a similar job running in Databricks on GCP. I would like to compare their cost. If I assign a custom tag to the job cluster in Azure Databricks, I can see the cost incurred by that job in the Azure Cost Analysis dashboard. How do I see the same in GCP? What is a good way to do something like that?
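(For reference, a custom tag can be attached via `custom_tags` in the job's cluster spec. A minimal sketch using the Jobs 2.1 API, with placeholder host, token, notebook path, and node type; the same field exists on both clouds, and on GCP the custom tags should also propagate as labels on the underlying compute resources.)

```python
import requests

# Placeholders: substitute a real workspace URL and personal access token.
HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"

# Minimal job spec. `custom_tags` on the job cluster is what surfaces in
# the cloud provider's billing data; on GCP it should also propagate as
# labels on the underlying compute resources.
job_spec = {
    "name": "cost-comparison-job",
    "tasks": [{
        "task_key": "main",
        "notebook_task": {"notebook_path": "/Jobs/my_notebook"},  # placeholder
        "new_cluster": {
            "spark_version": "11.3.x-scala2.12",
            "node_type_id": "<node-type>",  # cloud-specific, e.g. Standard_DS3_v2 or n2-highmem-4
            "num_workers": 2,
            "custom_tags": {"cost_center": "gcp-vs-azure-test"},
        },
    }],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json())  # {"job_id": ...}
```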

15 REPLIES

Abishek
Valued Contributor

Hello @Md Tahseen Anam

For GCP, please check the links below for usage details:

View billable usage using the account console

https://docs.gcp.databricks.com/administration-guide/account-settings-gcp/usage.html

Analyze billable usage log data:

https://docs.gcp.databricks.com/administration-guide/account-settings/usage-analysis.html

GCP instance type pricing: https://www.databricks.com/product/gcp-pricing/instance-types
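The usage-analysis link above essentially amounts to loading the delivered billable-usage CSVs and aggregating DBUs per tag. A minimal sketch of that idea for a Databricks notebook (where `spark` is predefined); the path is a placeholder, and the column names (`dbus`, `sku`, `clusterCustomTags`) follow the documented CSV schema, so verify them against your actual files:

```python
from pyspark.sql import functions as F

# Placeholder: wherever your billable-usage CSVs are delivered or downloaded to.
usage_path = "/mnt/billing/usage/*.csv"

usage = (spark.read
         .option("header", "true")
         .option("escape", '"')
         .csv(usage_path)
         .withColumn("dbus", F.col("dbus").cast("double")))

# `clusterCustomTags` is a JSON string in the usage CSV, so extract the
# tag of interest before aggregating DBUs by tag and SKU.
(usage
 .withColumn("cost_center", F.get_json_object("clusterCustomTags", "$.cost_center"))
 .groupBy("cost_center", "sku")
 .agg(F.sum("dbus").alias("total_dbus"))
 .orderBy(F.desc("total_dbus"))
 .show(truncate=False))
```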

Tahseen0354
Valued Contributor

Thanks! I think the following link can be very useful:

Analyze billable usage log data:

https://docs.gcp.databricks.com/administration-guide/account-settings/usage-analysis.html

Is there a similar way in Azure to analyze the usage data from the usage log file? Can you share a similar link for Azure?

Abishek
Valued Contributor

Kaniz_Fatma
Community Manager

Hi @Md Tahseen Anam, We haven't heard from you since the last response from @Abishek Subramanian, and I was checking back to see if you have a resolution yet.

If you have any solution, please share it with the community as it can be helpful to others. Otherwise, we will respond with more details and try to help.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

Tahseen0354
Valued Contributor

Thank you for your reply. The links for Azure show the pricing details. I was looking for a way to analyze the cost associated with jobs and clusters in both Azure and GCP so that we can compare them easily. It seems the way we analyze cost in GCP is different from how we do it in Azure. It would be nice to know if we can use a cost log file to create a notebook dashboard for analyzing cost in Azure Databricks.
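As a rough sketch of that dashboard idea: Azure Cost Management can schedule a daily export of cost data as CSV to a storage account, and that file can be read in a notebook. The path, column names (`PreTaxCost`, `Tags`), and tag key below are assumptions; the actual schema depends on the export type you configure:

```python
from pyspark.sql import functions as F

# Placeholder: mount point for the storage account receiving the
# scheduled Azure Cost Management export.
cost_path = "/mnt/azure-cost-exports/*.csv"

cost = (spark.read
        .option("header", "true")
        .csv(cost_path)
        .withColumn("cost", F.col("PreTaxCost").cast("double")))  # column name is an assumption

# In usage-details exports the Tags column looks like '"k1": "v1","k2": "v2"',
# i.e. JSON without the outer braces, so wrap it before parsing. Verify
# against your export before relying on this.
per_tag = (cost
           .withColumn("tags_json", F.concat(F.lit("{"), F.col("Tags"), F.lit("}")))
           .withColumn("cost_center", F.get_json_object("tags_json", "$.cost_center"))
           .groupBy("cost_center")
           .agg(F.sum("cost").alias("total_cost")))

display(per_tag)  # render as a table or chart in the notebook dashboard
```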

Dicer
Valued Contributor

For Azure, you can go to Cost Management to check your expenses.

Tahseen0354
Valued Contributor

Thank you for your reply.

Kaniz_Fatma
Community Manager

Hi @Md Tahseen Anam, We haven't heard from you since the last response from @Cheuk Hin Christophe Poon, and I was checking back to see if you have a resolution yet.

If you have any solution, please share it with the community as it can be helpful to others. Otherwise, we will respond with more details and try to help.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

Tahseen0354
Valued Contributor

I do not have any solution other than those mentioned above; I have already explored them. What I was actually looking for is something like an auto-generated report showing the cost incurred by each individual job, user, and cluster in each workspace over a period of time. Imagine a new user who has just started using Databricks: how will that user know how much cost each of their activities in the workspace incurs from the very beginning, without help from an experienced user?

Dicer
Valued Contributor

@Md Tahseen Anam I haven't used GCP, but there are cost management tools in the Google Cloud console:

https://cloud.google.com/blog/topics/cost-management/cost-management-tools-in-google-cloud-console
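Building on that: if detailed billing export to BigQuery is enabled for the billing account, the per-resource labels (which is where Databricks custom tags should end up on GCP) can be queried directly. A sketch assuming the standard export table name and a `cost_center` label, both placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder: the standard billing export table for your billing account.
table = "my-project.billing_dataset.gcp_billing_export_v1_XXXXXX"

query = f"""
SELECT l.value AS cost_center,
       SUM(cost) AS total_cost
FROM `{table}`, UNNEST(labels) AS l
WHERE l.key = 'cost_center'
  AND usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY cost_center
ORDER BY total_cost DESC
"""

for row in client.query(query).result():
    print(row.cost_center, round(row.total_cost, 2))
```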

Hubert-Dudek
Esteemed Contributor III

Additionally, these are two completely different systems. Most importantly, Databricks is natively integrated in Azure, so in Azure you can take everything from Cost Management.

DBU costs are similar and can be compared easily. All other expenses (VMs, storage, etc.) need to be taken from the cost reports.
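In other words, the comparison per cloud reduces to DBUs × rate (from the usage logs) plus infrastructure cost (from each cloud's cost report). A trivial illustration with made-up numbers; real DBU rates depend on SKU and tier, see the pricing pages linked above:

```python
# All figures are placeholders for illustration only; look up real DBU
# rates (they vary by SKU/tier) and take infra costs from the cost reports.
AZURE_DBU_RATE = 0.30  # $/DBU (placeholder)
GCP_DBU_RATE = 0.30    # $/DBU (placeholder)

def total_cost(dbus: float, dbu_rate: float, infra_cost: float) -> float:
    """DBU spend (from usage logs) + VM/storage spend (from cost reports)."""
    return dbus * dbu_rate + infra_cost

azure = total_cost(dbus=120.0, dbu_rate=AZURE_DBU_RATE, infra_cost=41.50)
gcp = total_cost(dbus=118.0, dbu_rate=GCP_DBU_RATE, infra_cost=47.20)
print(f"Azure: ${azure:.2f}   GCP: ${gcp:.2f}")
```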

Anonymous
Not applicable

Hi @Md Tahseen Anam

Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

Anonymous
Not applicable

Hello,

Databricks on Google Cloud

Open. Built on open standards, open APIs and open infrastructure so you can access, process and analyze data on your terms.

Optimized. Deploy Databricks on Google Kubernetes Engine, the first Kubernetes-based Databricks runtime on any cloud, to get insights faster.

Integrated.

Can anyone post more regarding it?

Felix
New Contributor III

Hi @Md Tahseen Anam, have you had a look at https://databrickslabs.github.io/overwatch/gettingstarted/ before? As of now it's only available on AWS and Azure, but I think this is the kind of granularity you're looking for. In general, you need to factor in the cost of DBUs and cloud provider compute, unless you want to go really deep and consider secondary costs like storage, VNets, etc. as well. In your specific case I would recommend giving each user their own cluster (or personal compute, to keep costs low) and tagging these accordingly. You can then aggregate over the tags via the standard cost monitoring solutions on Azure and GCP. Once the serverless offering on the Databricks platform includes workflows and general compute as well, this will give you additional insight and opportunities to control cost.
