Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Change Data Feed And Column Masks

mh177
New Contributor

Hi there,

Wondering if anyone can help me. I have a job set up to stream from one Change Data Feed-enabled Delta table to another Delta table, and it had been executing successfully. I then added column masks to the source table I am streaming from, and now get the following error:

 [UNSUPPORTED_FEATURE.TABLE_OPERATION] The feature is not supported: Table [source_table_name] does not support either micro-batch or continuous scan. Please check the current catalog and namespace to make sure the qualified table name is expected, and also check the catalog implementation which is configured by "spark.sql.catalog". SQLSTATE: 0A000

 

change_stream = spark.readStream.format("delta").option("readChangeFeed", "true")

(
    change_stream.table(source_table)
    .writeStream
    .option("checkpointLocation", checkpoints_location)
    .outputMode("append")
    .option("mergeSchema", False)
    .trigger(availableNow=True)
    .toTable(target_table)
)
 
1 REPLY

Brahmareddy
Honored Contributor II

Hi mh177,

How are you doing today? As per my understanding, it sounds like everything was working fine until you added column masks to your source table. The error you're seeing basically means that once a table has row- or column-level security policies (like masks) applied in Unity Catalog, it's no longer supported as a source for streaming reads, including Change Data Feed (CDF).

Right now, Databricks doesn't allow streaming from tables that have active security policies, because it can't guarantee consistent enforcement of those rules in streaming contexts. Unfortunately, there's no way to stream directly from a masked table at this time.

A common workaround is to maintain a copy of the data without the column mask (or expose it through a secured view) and stream from that instead, though I know that's not ideal. Hopefully, support for this improves in the future, but for now, you'll need to remove the mask or stream from a version of the data without security policies applied. Let me know if you want help setting up that workaround!
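A minimal sketch of that workaround, reusing the names from your job (`source_table`, `checkpoints_location`, `target_table`); the staging table name is hypothetical, and the refresh would need to run as a principal the masks don't apply to (e.g. the table owner) so the copied values are the real ones:

```python
# Hypothetical sketch, not a tested solution.
staging_table = f"{source_table}_unmasked"  # hypothetical name

# 1. Refresh an unmasked, CDF-enabled copy of the source table.
#    (A full CREATE OR REPLACE resets the change feed each run; an
#    incremental MERGE would keep the feed continuous. This is the
#    simplest form.)
spark.sql(f"""
    CREATE OR REPLACE TABLE {staging_table}
    TBLPROPERTIES (delta.enableChangeDataFeed = true)
    AS SELECT * FROM {source_table}
""")

# 2. Point the stream at the unmasked copy instead of the masked table.
#    A fresh checkpoint location is needed once the source changes.
(
    spark.readStream.format("delta")
    .option("readChangeFeed", "true")
    .table(staging_table)
    .writeStream
    .option("checkpointLocation", checkpoints_location)
    .outputMode("append")
    .trigger(availableNow=True)
    .toTable(target_table)
)
```

The trade-off is freshness: the stream only sees changes as often as the copy is refreshed.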

Regards,

Brahma
