To recreate the issue:
P.S. Good to know: when dbt materializes a model as a table, it effectively runs "create or replace table table_name ..." under the hood.
- The following code will create an external table with row security:
create or replace table table_name using delta location 'path/to/table' with row filter func_name on (column)
- Rerunning the previous code will generate the error described above.
- Because the table now exists, running the same statement without the location clause succeeds:
create or replace table table_name using delta with row filter func_name on (column)
But the table's storage location and type (external) remain unchanged.
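The repro steps above can be written out as a single script (the table name, path, filter function, and column are the placeholders from the statements above):

```sql
-- 1) Create an external table with a row filter: succeeds the first time
create or replace table table_name
using delta
location 'path/to/table'
with row filter func_name on (column);

-- 2) Rerunning the exact same statement now fails with the error described above

-- 3) Dropping the LOCATION clause succeeds, but the table stays external
--    and keeps its original storage location
create or replace table table_name
using delta
with row filter func_name on (column);
```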
This still doesn't make sense to me, but I found a workaround:
Since I am using dbt, I simply added a project variable row_filter, defaulting to false.
I set this variable to true in the Databricks asset bundle task: dbt run ... --vars '{row_filter: true, ...}'. This way the variable can be set for each table separately.
Then I added the following Jinja code at the beginning of my dbt model (models/xxx.sql):
{% if var('row_filter', false) | string | lower == 'false' %}
{{ config(
    location_root=set_location_root(parameter1, parameter2...)
) }}
{% endif %}
* set_location_root() is a self-developed macro
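Put together, a model using this guard might look like the following sketch (the select is a placeholder body; set_location_root and its parameters are my own macro, as noted above):

```sql
-- models/xxx.sql
-- When row_filter is false (the default), pin the external location.
-- When row_filter is true, location_root is omitted, so the
-- CREATE OR REPLACE runs without a LOCATION clause and the existing
-- external table keeps its original storage location.
{% if var('row_filter', false) | string | lower == 'false' %}
{{ config(
    location_root=set_location_root(parameter1, parameter2...)
) }}
{% endif %}

select *          -- placeholder model body
from source_table -- placeholder source
```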
The workaround sets the location root only when no row filtering is applied; otherwise it skips it.
This does not affect the tables, since they were already created in the right location before the row filter was applied.
I would still appreciate receiving feedback and a better solution to this issue 🙂
Thanks!