
Cost of each task in a Databricks workflow

Prashanth24
New Contributor III

Suppose we have 4 tasks (3 notebooks and 1 plain Python script) in a workflow. I would like to know the cost incurred by each task in the Databricks workflow. Please let me know if there is any way to find out these details.


4 REPLIES

Edthehead
Contributor

If the tasks share the same cluster, then no, you cannot differentiate the costs between the tasks. However, if you set up each task with its own job cluster and pass some custom tags, you can then differentiate and report the costs based on those tags. In the Azure portal you will be able to see the costs at the tag level.
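To make the custom-tag approach concrete, here is a minimal sketch of a Jobs API 2.1 payload in which each task runs on its own job cluster carrying a distinguishing custom tag. The job name, node type, Spark version, notebook paths, and tag names are illustrative assumptions, not values from this thread:

# Sketch of a Jobs API 2.1 payload (expressed as a Python dict) where each
# task gets its own job cluster with a "task_name" custom tag, so cloud cost
# reports can be broken down per task. Node types, Spark version, paths, and
# tag names are assumptions for illustration.
job_spec = {
    "name": "per-task-cost-demo",
    "job_clusters": [
        {
            "job_cluster_key": "ingest_cluster",
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 2,
                # Custom tags propagate to the underlying cloud VMs, so Azure
                # cost analysis can group charges by this key.
                "custom_tags": {"task_name": "ingest"},
            },
        },
        {
            "job_cluster_key": "transform_cluster",
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 4,
                "custom_tags": {"task_name": "transform"},
            },
        },
    ],
    "tasks": [
        {
            "task_key": "ingest",
            "job_cluster_key": "ingest_cluster",
            "notebook_task": {"notebook_path": "/Workspace/jobs/ingest"},
        },
        {
            "task_key": "transform",
            "job_cluster_key": "transform_cluster",
            "notebook_task": {"notebook_path": "/Workspace/jobs/transform"},
            "depends_on": [{"task_key": "ingest"}],
        },
    ],
}

On workspaces where billing system tables are enabled, the same custom tags also surface in the system.billing.usage table, which offers another way to report DBU usage per tag without going through the Azure portal.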

Thanks for the information. One more question specific to task cost: there are different types of tasks, such as Notebook, Python, Wheel, and Condition tasks. Some code, like reading XML, YAML, or JSON, only needs plain Python and does not need the distributed parallel processing of a notebook task. In this case, is there any price variation between the different task types, such as notebook or Python?

The charge is for the cluster and does not depend on what is running on it. You may be running a Python notebook on 1 driver + 5 worker nodes, but you will still be charged for all 6 nodes while they are running. If you want to reduce the cost, you can put such lightweight tasks on a separate, smaller cluster, but on the flip side, each individual cluster takes time to spin up. So the best approach would probably be to run all your tasks on the same job cluster with autoscaling enabled.
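For comparison, here is a minimal sketch of that shared-cluster setup: every task references one job cluster with autoscaling enabled, so only a single cluster spins up, and it grows only when a task actually needs more workers. Node type, worker counts, and file paths are again assumptions:

# Sketch of a single shared job cluster with autoscaling (Jobs API 2.1 shape,
# as a Python dict). All tasks reference the same job_cluster_key, avoiding
# repeated cluster spin-up; autoscale keeps the cluster small during light
# plain-Python tasks and grows it for heavier notebook tasks.
job_spec = {
    "name": "shared-autoscale-demo",
    "job_clusters": [
        {
            "job_cluster_key": "shared_cluster",
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "autoscale": {"min_workers": 1, "max_workers": 5},
            },
        }
    ],
    "tasks": [
        {
            # Plain-Python work (e.g. parsing XML/YAML/JSON) runs on the
            # driver; it is billed at the same cluster rate as any other task.
            "task_key": "parse_config",
            "job_cluster_key": "shared_cluster",
            "spark_python_task": {"python_file": "/Workspace/jobs/parse_config.py"},
        },
        {
            "task_key": "heavy_transform",
            "job_cluster_key": "shared_cluster",
            "notebook_task": {"notebook_path": "/Workspace/jobs/heavy_transform"},
            "depends_on": [{"task_key": "parse_config"}],
        },
    ],
}

Note that the choice of task type (notebook vs. plain Python) does not change the rate; the bill is driven entirely by the cluster's size and how long it runs.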

Kaniz_Fatma
Community Manager

Hi @Prashanth24, thank you for reaching out to our community! We're here to help you.

To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedback not only helps us assist you better but also benefits other community members who may have similar questions in the future.

If you found the answer helpful, consider giving it a kudo. If the response fully addresses your question, please mark it as the accepted solution. This will help us close the thread and ensure your question is resolved.

We appreciate your participation and are here to assist you further if you need it!
