01-20-2026 08:24 PM
Hi,
We are seeing this error. Does anyone know how to fix it?
[RequestId=6cf95ee8-f312-47cd-846c-dcd87158c939 ErrorClass=INVALID_PARAMETER_VALUE.ROW_COLUMN_ACCESS_POLICIES_NOT_SUPPORTED_ON_ASSIGNED_CLUSTERS] Query on table with row filter or column mask not supported on assigned clusters.
01-21-2026 05:51 AM
Check the DBR version of the cluster/compute you are using to query this table.
Similar issue from recent past - https://kb.databricks.com/unity-catalog/analysisexception-error-when-trying-to-execute-sparkcatalogt...
01-21-2026 07:02 AM
Hi @Harish_Kumar_M ,
The error you are encountering occurs when you query a table with column masking or row filtering enabled from single-user compute on DBR 15.3 or below. Support for querying such tables from single-user compute is limited and depends on the runtime version:
DBR 15.4-16.2: Read-only (SELECT). DML writes aren't supported.
DBR 16.3+: MERGE and append writes are supported; other writes (e.g., plain INSERT OVERWRITE) are not. If you're overwriting, switch to MERGE/append or use Shared/Serverless compute.
Ref Doc - https://docs.databricks.com/aws/en/compute/single-user-fgac#limitations
If you want a quick fix, run the commands on Shared access mode compute (a standard cluster) or a SQL Warehouse (Pro/Serverless) instead of an assigned/single-user cluster. These environments support FGAC operations without the assigned-cluster limitations.
02-09-2026 03:18 AM
Two options: either use Shared access mode,
or use Single user access mode on DBR 15.4+ (a dedicated cluster will then use serverless compute for reads and writes).
Br
03-08-2026 06:01 PM
Hi @Harish_Kumar_M,
The error you are seeing, "Query on table with row filter or column mask not supported on assigned clusters" (INVALID_PARAMETER_VALUE.ROW_COLUMN_ACCESS_POLICIES_NOT_SUPPORTED_ON_ASSIGNED_CLUSTERS), occurs because your cluster's access mode and configuration do not meet the requirements for fine-grained access control (row filters and column masks).
Here is what you need to check and resolve:
UNDERSTANDING THE ROOT CAUSE
Row filters and column masks in Unity Catalog require specific compute configurations. The "assigned clusters" term in the error refers to clusters running in Dedicated access mode (previously called "Single User" or "Assigned" mode). On Dedicated access mode clusters, Databricks uses serverless compute behind the scenes to enforce fine-grained access controls. If that enforcement path is not available, you get this error.
OPTION 1: SWITCH TO SHARED (STANDARD) ACCESS MODE
The simplest fix is to use a cluster with Shared access mode (now called "Standard" access mode). Standard access mode has built-in support for Unity Catalog governance features including row filters and column masks, with no additional requirements.
To change your cluster access mode:
1. Go to Compute in the sidebar
2. Select your cluster and click Edit
3. Under Access mode, choose Standard
4. Save and restart the cluster
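If you manage clusters programmatically, the same change maps to the cluster definition. A hedged sketch of the relevant fields follows; the field names and enum values are based on the Databricks Clusters API as I understand it (`USER_ISOLATION` for Standard/Shared, `SINGLE_USER` for Dedicated), so verify them against your workspace's API version before relying on this:

```json
{
  "cluster_name": "fgac-standard-cluster",
  "spark_version": "15.4.x-scala2.12",
  "data_security_mode": "USER_ISOLATION"
}
```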
If you are using a SQL warehouse, SQL warehouses natively support row filters and column masks with no additional configuration needed.
OPTION 2: KEEP DEDICATED ACCESS MODE (WITH REQUIREMENTS)
If you need Dedicated access mode (for example, for R language support or ML libraries), you must meet all of the following requirements:
1. Your workspace must have serverless compute enabled. Contact your workspace admin to verify this. An admin can check under Admin Settings > Workspace Settings and confirm serverless compute is enabled for the workspace.
2. Your cluster must be running Databricks Runtime 15.4 LTS or higher for read operations on tables with row filters/column masks.
3. For write operations (INSERT, UPDATE, DELETE, MERGE) on tables with row filters/column masks, you need Databricks Runtime 16.3 or higher.
4. If your workspace has firewall or outbound network restrictions, ports 8443-8451 must be open for the serverless compute enforcement to work.
If your cluster is on Databricks Runtime 15.3 or below with Dedicated access mode, row filters and column masks are completely blocked regardless of other settings.
VERIFYING YOUR CURRENT SETUP
Run this in a notebook to check your runtime version and access mode:
SELECT current_version()
Also check your cluster's access mode in the Compute UI. It will show either "Standard" or "Dedicated" under the cluster configuration.
QUICK SUMMARY
Compute Type          | Row Filter/Column Mask Support
----------------------|----------------------------------------
SQL Warehouse         | Fully supported, no extra config
Standard access mode  | Fully supported
Dedicated + DBR 15.4+ | Supported (requires serverless enabled)
Dedicated + DBR 15.3  | Not supported
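The summary table can also be expressed as a small decision helper. This is only an illustrative encoding of the matrix above; the compute-type labels and the `serverless_enabled` parameter are my own names, not an official API:

```python
# Sketch: does a given compute setup support row filters / column masks?
def supports_fgac(compute: str, dbr=(0, 0), serverless_enabled=False) -> bool:
    """compute: 'sql_warehouse', 'standard', or 'dedicated'; dbr: (major, minor)."""
    if compute in ("sql_warehouse", "standard"):
        return True                                  # fully supported
    if compute == "dedicated":
        return dbr >= (15, 4) and serverless_enabled # DBR 15.4+ and serverless
    raise ValueError("unknown compute type: " + compute)

print(supports_fgac("sql_warehouse"))                # True
print(supports_fgac("dedicated", (15, 3), True))     # False
```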
RECOMMENDED NEXT STEPS
1. If possible, switch to a SQL warehouse or Standard access mode cluster for workloads that query masked tables.
2. If you must use Dedicated access mode, verify serverless is enabled on the workspace and upgrade to DBR 15.4 LTS or later.
3. If you recently applied row filters or column masks to tables and need to query them immediately, a SQL warehouse is the fastest path to resolution.
Documentation references:
- Row filters and column masks: https://docs.databricks.com/en/tables/row-and-column-filters.html
- Dedicated compute limitations: https://docs.databricks.com/en/compute/dedicated-limitations.html
- Compute access modes: https://docs.databricks.com/en/compute/configure.html
* This reply was drafted with an agent system I built, which researches responses from the documentation I have available and previous memory. I personally review each draft for obvious issues, monitor the system's reliability, and update it when I detect drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.
If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.