07-31-2024 08:57 AM
Suppose we have 4 tasks (3 notebooks and 1 plain Python script) in a workflow. I would like to know the cost incurred for each task in the Databricks workflow. Please let me know if there is any way to find out these details.
08-01-2024 10:33 PM
If each of the tasks shares the same cluster, then no, you cannot differentiate the costs between the tasks. However, if you set up each task with its own job cluster and pass some custom tags, you can then differentiate/report the costs based on those tags. In the Azure portal you will be able to see the costs at a tag level.
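For illustration, here is a minimal sketch of what that setup could look like as a Jobs 2.1 API payload written as a Python dict: one job cluster per task, each carrying its own custom tags. The cluster sizes, node types, paths, and tag names are placeholder assumptions for the example, not values from this thread; check them against the Jobs API docs for your workspace.

```python
# Sketch only: two tasks, each on its own job cluster with its own custom tags,
# so the spend can be split by tag in Azure cost reporting. All names/sizes are placeholders.
job_settings = {
    "name": "cost-per-task-demo",
    "job_clusters": [
        {
            "job_cluster_key": "ingest_cluster",
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",      # placeholder DBR version
                "node_type_id": "Standard_DS3_v2",         # placeholder Azure VM type
                "num_workers": 2,
                "custom_tags": {"task_name": "ingest_notebook"},
            },
        },
        {
            "job_cluster_key": "parse_cluster",
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 1,                          # small cluster for plain Python parsing
                "custom_tags": {"task_name": "parse_config_files"},
            },
        },
    ],
    "tasks": [
        {
            "task_key": "ingest",
            "job_cluster_key": "ingest_cluster",
            "notebook_task": {"notebook_path": "/Workspace/demo/ingest"},
        },
        {
            "task_key": "parse_configs",
            "job_cluster_key": "parse_cluster",
            "spark_python_task": {"python_file": "dbfs:/scripts/parse_configs.py"},
            "depends_on": [{"task_key": "ingest"}],
        },
    ],
}
```

Assuming the custom tags are propagated to the underlying Azure VMs (which is the usual behavior for Azure Databricks clusters), grouping by the task_name tag in Azure Cost Management should then show the spend per task.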
08-05-2024 02:03 AM
Thanks for the information. One more question specific to task cost: there are different types of tasks, such as Notebook, Python, Wheel, Condition, etc. Some code, like reading XML, YAML, or JSON files, only needs plain Python and does not need the distributed parallel processing of a notebook task. In this case, is there any price variation between using different task types such as Notebook or Python?
08-05-2024 02:38 AM
The charge is by the cluster and does not depend on what is running on it. You may be running a Python notebook on 1 driver + 5 worker nodes, but you will still be charged for all 6 nodes that are running. If you want to reduce the cost, you could put such lightweight tasks on a separate, smaller cluster, but on the flip side each individual cluster takes time to spin up. So the best approach would probably be to have all your tasks run on the same job cluster config with autoscale on.
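To make the "1 driver + 5 workers = 6 billed nodes" point concrete, here is a rough back-of-the-envelope sketch. The DBU rating, DBU price, and VM price below are made-up placeholders, not real Databricks or Azure rates; look up the actual figures for your SKU and node type.

```python
# Rough cost estimate for one run, illustrating that every node in the cluster
# is billed while it is up, regardless of the task type running on it.
driver_nodes = 1
worker_nodes = 5
runtime_hours = 2.0
dbu_per_node_hour = 0.75   # placeholder DBU rating of the chosen VM type
dbu_price = 0.30           # placeholder $/DBU for the Jobs Compute SKU
vm_price_per_hour = 0.50   # placeholder $/hour Azure VM cost per node

nodes = driver_nodes + worker_nodes            # 6 nodes billed in total
dbu_cost = nodes * dbu_per_node_hour * runtime_hours * dbu_price
vm_cost = nodes * vm_price_per_hour * runtime_hours
print(f"DBU cost: ${dbu_cost:.2f}, VM cost: ${vm_cost:.2f}, total: ${dbu_cost + vm_cost:.2f}")
```

The same arithmetic applies whether the task is a Notebook, Wheel, or plain Python task, which is why the task type itself does not change the price; only the cluster size and runtime do.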
08-02-2024 05:19 AM
Hi @Prashanth24, Thank you for reaching out to our community! We're here to help you.
To ensure we provide you with the best support, could you please take a moment to review the responses and choose the one that best answers your question? Your feedback not only helps us assist you better but also benefits other community members who may have similar questions in the future.
If you found the answer helpful, consider giving it a kudo. If the response fully addresses your question, please mark it as the accepted solution. This will help us close the thread and ensure your question is resolved.
We appreciate your participation and are here to assist you further if you need it!