I've created a view with row-level access based on the CURRENT_RECIPIENT() function in the WHERE clause, and I have hundreds of clients as recipients that query this view.
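To illustrate the pattern (the catalog, schema, table, and column names below are placeholders, not my real objects), the view is defined roughly like this:

```sql
-- Simplified sketch of the shared view: each recipient only sees the rows
-- mapped to it via CURRENT_RECIPIENT(); all object names here are hypothetical.
CREATE OR REPLACE VIEW main.shared.orders_for_recipients AS
SELECT o.*
FROM main.raw.orders AS o
WHERE o.recipient_name = CURRENT_RECIPIENT();
```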
The problem is that when I modify this view with CREATE OR REPLACE and new SQL code, and a recipient then tries to get data from it, they receive old cached data from the DSFF layer's materialized internal views that were created during previous requests. I can see in the logs that Databricks only drops that specific materialized view after 8 hours:
__dsff_materialization_common_view_7383f853e81c4b3e9405acb3d360034c
I don't see these internal views in UC.
1. How can I refresh or drop this cache as soon as I run CREATE OR REPLACE on the view, so that recipients always get up-to-date data?
2. How can I estimate the cost of the serverless pipeline or cluster runs that are triggered when recipients pull the data? I don't see it in system.billing.usage: there is no specific sku_name for this serverless activity, and in fact there are no billing rows at all for these Delta Sharing serverless runs (a sketch of the query I mean is below, after the questions).
3. Which services should I be paying for: the serverless compute, or the data stored/cached somewhere in the Databricks plane? Is there any pricing for the resources associated with Delta Sharing views? I was not able to find it in the documentation.
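To clarify question 2, this is the kind of query I've been trying against the billing system table; I expected to see some serverless or Delta Sharing related sku_name for the windows when recipients were pulling data, but nothing relevant shows up (column names are from system.billing.usage; the sku_name pattern is just my guess):

```sql
-- Look for any serverless-related usage rows around the time recipients
-- queried the shared view; the LIKE pattern is an assumption on my side.
SELECT usage_date,
       sku_name,
       SUM(usage_quantity) AS total_dbus
FROM system.billing.usage
WHERE usage_date >= current_date() - INTERVAL 7 DAYS
  AND upper(sku_name) LIKE '%SERVERLESS%'
GROUP BY usage_date, sku_name
ORDER BY usage_date, sku_name;
```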