Data Governance

Grant permissions to existing catalogs/schemas using Databricks Asset Bundles

vjussiiih
New Contributor

Hi,

I’m trying to use Databricks Asset Bundles (DAB) to assign Unity Catalog grants to catalogs and schemas that already exist in my workspace.

These catalogs and schemas were not originally created through DAB, but I would now like to manage their grants using a bundle.

For example, I am adding an existing schema like this:

resources:
  schemas:
    schema_name:
      name: schema_name
      catalog_name: catalog_name
      grants:
        - principal: some_principal_name
          privileges:
            - USE_SCHEMA
            - SELECT
 

However, regarding schemas (or catalogs) originally created by DAB, I get the following warning:

This action will result in the deletion or recreation of the following UC schemas.
Any underlying data may be lost:

delete resources.schemas.dab_test_catalog
delete resources.schemas.dab_test_schema

Error: the deployment requires destructive actions, but current console does not support prompting.
Please specify --auto-approve if you would like to skip prompts and proceed

And when the schema already exists but was NOT created by DAB, the deployment fails with:

Error: cannot create resources.schemas.<name>: Schema already exists

My question:

Is there any supported way to use Databricks Asset Bundles to grant permissions on an existing catalog or schema without:

  • DAB trying to create the catalog/schema
  • DAB trying to delete and recreate them
  • DAB attempting destructive changes
  • Having to explicitly define every table/volume inside the schema

In short:
👉 I only want to apply grants, not manage the lifecycle of the schema/catalog itself.

What I’ve tried / observed:

  • If a schema is defined under resources.schemas, DAB treats it as a fully managed resource, meaning it wants to create or drop it.
  • This triggers destructive plans, because DAB does not support a "grant-only" mode for Unity Catalog objects.
  • I also tried with catalogs, but Bundles always attempts to create the catalog via POST /catalogs, which requires CREATE CATALOG on the metastore — which I do not have.

Is there any non-destructive pattern or best practice for this scenario?

Something like managing only permissions for existing UC objects, without having DAB take ownership of the entire schema/catalog?

Thanks in advance to anyone who can clarify whether this is supported. From what I can tell, SQL tasks might be the only safe option, and that is also what AI assistants suggest.

1 REPLY 1

SteveOstrowski
Databricks Employee

Hi @vjussiiih,

Let me walk you through this. You are correct that DABs treat schemas and catalogs defined under "resources" as fully managed resources, which means they attempt to create them on first deploy and manage their full lifecycle. This is what causes both the "Schema already exists" error and the destructive delete/recreate warnings you are seeing.

The good news is there is a supported way to handle this. Here are the approaches depending on your situation:


APPROACH 1: USE "bundle deployment bind" FOR EXISTING SCHEMAS (RECOMMENDED)

The Databricks CLI supports a "bind" command that links a bundle-defined resource to an existing resource in your workspace. This tells DAB "this resource already exists, manage it going forward instead of trying to create a new one." Critically, bind does not recreate data or the resource itself.

Step 1 - Define the schema in your bundle YAML with the grants you want:

resources:
  schemas:
    schema_name:
      name: schema_name
      catalog_name: catalog_name
      grants:
        - principal: some_principal_name
          privileges:
            - USE_SCHEMA
            - SELECT

Step 2 - Bind the bundle resource to the existing schema:

databricks bundle deployment bind schema_name catalog_name.schema_name -t your_target

The first argument ("schema_name") is the resource key you used in your YAML. The second argument is the full name of the existing schema in your workspace.

Step 3 - Deploy:

databricks bundle deploy -t your_target

After binding, the deploy will update the existing schema with your grant definitions instead of trying to create a new one or destroying the existing one.

The bind command supports the following resource types: app, cluster, dashboard, job, model_serving_endpoint, pipeline, quality_monitor, registered_model, schema, and volume.
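Putting the three steps together, the bind workflow looks roughly like the sketch below. The target name `dev` and the schema `catalog_name.schema_name` are placeholders; substitute your own. The `unbind` subcommand is the counterpart of `bind` in the CLI, in case you later want the bundle to stop managing the schema without dropping it:

```shell
# Link the bundle resource key (from your YAML) to the existing schema.
# This only updates bundle state; no data or UC object is touched.
databricks bundle deployment bind schema_name catalog_name.schema_name -t dev

# Deploy: applies the grants from the YAML to the bound schema
# instead of trying to create a new one.
databricks bundle deploy -t dev

# Optional: detach the schema from the bundle again without deleting it.
databricks bundle deployment unbind schema_name -t dev
```

Note that `bind` prompts for confirmation by default; in CI you would add `--auto-approve` to the bind command.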


APPROACH 2: SQL TASK IN A DAB JOB (FOR CATALOGS OR GRANT-ONLY MANAGEMENT)

Since the bind command does not currently support catalogs, and since you mentioned you do not have CREATE CATALOG privileges on the metastore, a SQL task within a DAB-managed job is a practical alternative. This approach works for both catalogs and schemas, and it keeps your grants versioned in source control without DAB managing the lifecycle of the UC objects.

resources:
  jobs:
    apply_uc_grants:
      name: "apply-uc-grants"
      tasks:
        - task_key: "grant_permissions"
          sql_task:
            warehouse_id: ${var.warehouse_id}
            file:
              path: ./sql/apply_grants.sql

variables:
  warehouse_id:
    description: "SQL warehouse ID"
    lookup:
      warehouse: "your-warehouse-name"

Then create a file at sql/apply_grants.sql in your bundle:

-- Catalog-level grants (SQL GRANT syntax writes privileges with spaces,
-- e.g. USE CATALOG rather than the API's USE_CATALOG)
GRANT USE CATALOG ON CATALOG catalog_name TO `some_principal_name`;
GRANT CREATE SCHEMA ON CATALOG catalog_name TO `some_principal_name`;

-- Schema-level grants
GRANT USE SCHEMA ON SCHEMA catalog_name.schema_name TO `some_principal_name`;
GRANT SELECT ON SCHEMA catalog_name.schema_name TO `some_principal_name`;

You can then run this job after deployment with "databricks bundle run apply_uc_grants" or schedule it to run periodically to enforce your grants. The identity running the SQL must have MANAGE or ownership on the target objects.
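For completeness, the end-to-end flow for this approach might look like the following (the target name `dev` is a placeholder):

```shell
# Deploy the bundle: creates/updates the job definition,
# but does not touch the UC catalog or schema themselves.
databricks bundle deploy -t dev

# Run the grant job once to apply the GRANT statements.
databricks bundle run apply_uc_grants -t dev
```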


APPROACH 3: USE THE DATABRICKS TERRAFORM PROVIDER (IF YOU PREFER TERRAFORM)

Since DABs use Terraform under the hood, it is also worth noting that the Databricks Terraform provider has a "databricks_grants" resource designed specifically for this use case. It manages only the grants on an existing object, without managing the object lifecycle:

resource "databricks_grants" "schema_grants" {
  schema = "catalog_name.schema_name"

  grant {
    principal  = "some_principal_name"
    privileges = ["USE_SCHEMA", "SELECT"]
  }

  grant {
    principal  = "another_principal"
    privileges = ["USE_SCHEMA", "SELECT", "MODIFY"]
  }
}

resource "databricks_grants" "catalog_grants" {
  catalog = "catalog_name"

  grant {
    principal  = "some_principal_name"
    privileges = ["USE_CATALOG"]
  }
}

The Terraform "databricks_grants" resource does not attempt to create or destroy the catalog or schema. It only manages the permissions. This is documented here:
https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/grants


KEY THINGS TO NOTE

1. CLI version: Make sure you are using a recent version of the Databricks CLI. Schema binding has been available since v0.243.0. Run "databricks --version" to check.

2. Grants are declarative: When DAB applies grants, it sets them to exactly what you specify. Test in a non-production environment first to understand how this interacts with existing grants on the object.

3. Permissions required: The identity running the deploy or SQL must have sufficient privileges (typically ownership or MANAGE) on the target catalog/schema to grant permissions.

4. Feature request for "data sources": There is an open feature request on GitHub (https://github.com/databricks/cli/issues/3460) for a "sources" or "bind: false" concept in DABs that would let you reference existing resources without DAB owning them. This would make your exact use case even simpler in the future.
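Whichever approach you use, it is worth verifying what was actually applied. If I remember the unified CLI correctly, it exposes the UC Grants API as `databricks grants get`; the securable type and full name below are placeholders:

```shell
# Inspect the current grants on the schema via the Grants API
databricks grants get schema catalog_name.schema_name

# Equivalent check from SQL (notebook or SQL editor):
#   SHOW GRANTS ON SCHEMA catalog_name.schema_name;
```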


DOCUMENTATION REFERENCES

- Bundle deployment bind command:
https://docs.databricks.com/en/dev-tools/cli/bundle-commands.html

- Databricks Asset Bundles resources:
https://docs.databricks.com/en/dev-tools/bundles/resources.html

- Unity Catalog privileges reference:
https://docs.databricks.com/en/data-governance/unity-catalog/manage-privileges/privileges.html

- Terraform databricks_grants resource:
https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/grants

- GitHub feature request for data sources in DABs:
https://github.com/databricks/cli/issues/3460


Hope this helps! Let me know if you have any questions about the bind workflow or the SQL task approach.

* This reply was drafted with an agent system I built, which researches responses using the documentation and prior context available to me. I personally review each draft for obvious issues and to monitor system reliability, and I update it when I detect drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.