Not that I know of.
Google (BigQuery) uses the number of bytes read to determine the cost.
Databricks uses DBUs. The number of DBUs spent depends not only on the number of bytes read (the more you read, the longer the job will probably run), but also on the type of VM used.
Then there is also autoscaling, which makes the price even harder to predict.
And the total cost is not only the DBUs, but also the provisioning cost of the VMs charged by the cloud provider.
So all of that makes the cost pretty hard to predict up front.
It would of course be very cool to have such a prediction.
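If you already know (or can guess) the cluster size, runtime, and the per-DBU and per-VM rates, you can at least do a back-of-the-envelope estimate yourself. Everything in the sketch below is a placeholder assumption (the rates, the DBU consumption per VM-hour, the cluster shape), not something Databricks or the cloud provider exposes as a prediction API:

```python
# Rough back-of-the-envelope estimate of a Databricks job's total cost.
# All rates and cluster parameters are placeholders -- look up the actual
# per-DBU price for your tier/workload type and the VM price for your
# cloud provider, region, and instance type.

def estimate_job_cost(
    runtime_hours: float,       # how long the job runs (autoscaling makes this vary)
    num_workers: int,           # average number of worker VMs while running
    dbu_per_vm_hour: float,     # DBU consumption rate of the chosen VM type
    price_per_dbu: float,       # $/DBU for your Databricks tier/workload
    vm_price_per_hour: float,   # $/hour the cloud provider charges per VM
) -> float:
    vm_hours = runtime_hours * (num_workers + 1)            # workers + driver
    dbu_cost = vm_hours * dbu_per_vm_hour * price_per_dbu   # Databricks' share
    vm_cost = vm_hours * vm_price_per_hour                  # cloud provider's share
    return dbu_cost + vm_cost

# Example: a 2-hour job on 4 workers, with made-up rates
print(estimate_job_cost(2.0, 4, dbu_per_vm_hour=1.5,
                        price_per_dbu=0.40, vm_price_per_hour=0.50))
```

The hard part in practice is that `runtime_hours` and `num_workers` are exactly the things autoscaling and data volume make unpredictable, which is why there is no bytes-read-style price estimate like BigQuery's.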