Hi @Raman_Unifeye ,
The thing is, Databricks pricing is based on your compute usage. Storage, networking, and related costs will vary depending on the services you choose and your cloud service provider.
I think you won't find such a tool because every workload is different. For example, processing a table that has hundreds of millions of rows can vary significantly between two data pipelines. In pipeline A, you may have very complex transformations, and the time spent computing them will greatly affect the DBU cost (compute usage).
Meanwhile, pipeline B may simply take the data and perform a straightforward insert without any transformations. The cost of such a pipeline will be much lower, even though the amount of data processed is similar.
What I'm trying to say is that you won't find a tool that can reliably estimate DBU cost based solely on data volume. By understanding your environments and transformations, you can try to estimate it yourself, but you won't find a generic solution that will accurately calculate it for you.
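If you want to do that estimate yourself, the basic arithmetic is just runtime × cluster DBU rate × price per DBU. Here's a minimal sketch; the function name and all the rates are hypothetical placeholders, so plug in the real DBU rates from your cluster config and the prices from the Databricks pricing page:

```python
# Back-of-the-envelope DBU cost estimate (illustrative only).
# All rates below are made-up placeholders -- check your cluster's actual
# DBU/hour rating and your contract's $/DBU price for real numbers.

def estimate_job_cost(runtime_hours: float, cluster_dbu_per_hour: float,
                      price_per_dbu: float) -> float:
    """Compute cost = total DBUs consumed * price per DBU."""
    total_dbus = runtime_hours * cluster_dbu_per_hour
    return total_dbus * price_per_dbu

# Pipeline A: heavy transformations, 3 hours on a larger cluster.
cost_a = estimate_job_cost(runtime_hours=3.0, cluster_dbu_per_hour=8.0,
                           price_per_dbu=0.15)

# Pipeline B: straightforward insert of similar data volume, 30 minutes
# on a small cluster.
cost_b = estimate_job_cost(runtime_hours=0.5, cluster_dbu_per_hour=2.0,
                           price_per_dbu=0.15)

print(f"Pipeline A: ${cost_a:.2f}")
print(f"Pipeline B: ${cost_b:.2f}")
```

This is exactly why data volume alone tells you so little: the two pipelines above could process the same table, yet differ in cost by an order of magnitude purely because of runtime and cluster size.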