Change Data Feed And Column Masks
3 weeks ago
Hi there,
Wondering if anyone can help me. I have a job set up to stream from one change data feed enabled Delta table to another Delta table, and it has been executing successfully. I then added column masks to the source table from which I am streaming, and now get the following error:
[UNSUPPORTED_FEATURE.TABLE_OPERATION] The feature is not supported: Table [source_table_name] does not support either micro-batch or continuous scan. Please check the current catalog and namespace to make sure the qualified table name is expected, and also check the catalog implementation which is configured by "spark.sql.catalog". SQLSTATE: 0A000
change_stream = spark.readStream.format("delta").option("readChangeFeed", "true")

(change_stream.table(source_table)
    .writeStream
    .option("checkpointLocation", checkpoints_location)
    .outputMode("append")
    .option("mergeSchema", False)
    .trigger(availableNow=True)
    .toTable(target_table))
3 weeks ago
Hi mh177,
How are you doing today? As I understand it, everything was working fine until you added column masks to your source table. The error you're seeing means that once a table has row- or column-level security policies (like masks) applied in Unity Catalog, it is no longer supported as a source for streaming reads, including Change Data Feed (CDF). Databricks currently doesn't allow streaming from tables with active security policies because it can't guarantee consistent enforcement of those rules in streaming contexts.

Unfortunately, there is no way to stream directly from a masked table right now. A common workaround is to create a secured view or a copy of the data without the column mask and stream from that instead, though I know that's not ideal. Hopefully support for this improves in the future, but for now you'll need to remove the mask or stream from a version of the data without security policies applied. Let me know if you want help setting up that workaround!
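If you go the route of dropping the mask on the source and protecting the target instead, a minimal sketch might look like the following. This is just an illustration: the column name "email" and mask function "mask_email" are hypothetical placeholders for whatever your table actually uses, while source_table, target_table, and checkpoints_location are the variables from your existing job.

# Hypothetical sketch: remove the column mask from the streaming source,
# then re-apply it on the target table, which is not being stream-read.
# "email" and "mask_email" are placeholders for your actual column and
# mask function.
spark.sql(f"ALTER TABLE {source_table} ALTER COLUMN email DROP MASK")

# The existing stream should then run again unchanged.
query = (spark.readStream.format("delta")
    .option("readChangeFeed", "true")
    .table(source_table)
    .writeStream
    .option("checkpointLocation", checkpoints_location)
    .outputMode("append")
    .trigger(availableNow=True)
    .toTable(target_table))

# With an availableNow trigger, wait for the backfill to finish.
query.awaitTermination()

# Keep readers of the target protected by masking there instead.
spark.sql(f"ALTER TABLE {target_table} ALTER COLUMN email SET MASK mask_email")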
Regards,
Brahma
2 weeks ago
Hi @Brahmareddy ,
Thanks for the reply. I'm just slightly confused, because the documentation, in the Supported features and formats section, states the following:
- Delta Lake change data feeds are supported if the schema is compatible with the row filters and column masks that apply to the target table.
Wondering if something potentially needs to be changed in my schema to resolve this.
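In case it helps anyone looking into this, here is how I've been listing the masks currently applied to the source table. This assumes the Unity Catalog information schema exposes a COLUMN_MASKS relation in your workspace; the catalog and schema names below are placeholders.

# Hypothetical check: list column masks on the source table via the
# Unity Catalog information schema. 'my_catalog' and 'my_schema' are
# placeholders for the table's actual catalog and schema.
masks = spark.sql("""
    SELECT *
    FROM system.information_schema.column_masks
    WHERE table_catalog = 'my_catalog'
      AND table_schema = 'my_schema'
      AND table_name = 'source_table_name'
""")
masks.show()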

