Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks bundle repository permission

HoussemBL
New Contributor III

Hi everyone,
How can I use Databricks Asset Bundle configuration to set permissions on the workspace folder (root_path) where my code is deployed, in order to protect it from manual changes by users?

My current bundle config for production looks like this:

 
 
prod:
  mode: production
  workspace:
    root_path: /Shared/.bundle/${bundle.name}/${bundle.target}

I would like to restrict write or manage permissions on this folder to prevent manual edits in the Databricks workspace path /Shared/.bundle/${bundle.name}/${bundle.target}.

What would the syntax look like for a workspace directory permission?
I searched the Databricks documentation and found only the following article, which explains how to set permissions on Jobs/Pipelines/Dashboards/Models/Experiments:
https://docs.databricks.com/aws/en/dev-tools/bundles/permissions

However, this article does not mention how to set permissions on a specific path in the workspace.

5 REPLIES

ManojkMohan
Contributor III

Databricks Asset Bundle configuration does not currently support setting permissions on arbitrary workspace folders (such as the root_path folder where code is deployed) via the bundle YAML file. The documented permission management in bundles applies only to specific resource types (Jobs, Pipelines, Dashboards, Models, and Experiments), not to general workspace directories or folders: https://docs.databricks.com/aws/en/dev-tools/bundles/permissions

Workaround and Best Practice
After deployment, set the workspace folder permissions manually via the Databricks UI, or script the change with the Permissions REST API (set workspace object permissions, e.g. PUT /api/2.0/permissions/directories/{directory_id}): https://docs.databricks.com/api/workspace/workspace/setpermissions , https://docs.databricks.com/aws/en/security/auth/access-control/
Since the bundle YAML does not cover workspace directory permissions, consider automating this with external tools or scripts that enforce the permissions post-deployment.
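For example, a post-deployment script could look roughly like the sketch below. It is only a sketch: the folder path, the group name, and the CAN_READ level are placeholders, and it assumes the identity running it has manage rights on the folder. It first resolves the folder's numeric object ID, then replaces the directory ACL.

import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Placeholder: the deployed bundle root_path you want to lock down
folder = "/Shared/.bundle/my_bundle/prod"

# 1) Resolve the folder's numeric object ID
status = requests.get(f"{host}/api/2.0/workspace/get-status",
                      headers=headers, params={"path": folder})
status.raise_for_status()
directory_id = status.json()["object_id"]

# 2) Replace the directory ACL: workspace users become read-only
acl = {"access_control_list": [
    {"group_name": "users", "permission_level": "CAN_READ"}
]}
resp = requests.put(f"{host}/api/2.0/permissions/directories/{directory_id}",
                    headers=headers, json=acl)
resp.raise_for_status()
print(resp.json())

Note that PUT replaces the full ACL on the object; use PATCH on the same endpoint if you only want to add or update entries.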

Khaja_Zaffer
Contributor

Hello @HoussemBL 

As far as I know, there is no direct syntax or configuration option within the bundle YAML (e.g., under permissions, workspace, or any other top-level key) to set permissions on workspace folders, directories, or the root_path itself. The permissions mapping in bundles is limited to supported resource types like jobs, pipelines, dashboards, models, and experiments. It applies access levels such as CAN_VIEW, CAN_MANAGE, or CAN_RUN (or resource-specific variants like CAN_EDIT or IS_OWNER) to those assets only.
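For reference, that documented mapping looks roughly like this (a sketch only; the group, user, and service principal names are placeholders). It grants access to the bundle's resources such as jobs and pipelines, not to the workspace folder they are deployed into:

permissions:
  - level: CAN_MANAGE
    group_name: data_engineers                    # placeholder group
  - level: CAN_VIEW
    user_name: analyst@example.com                # placeholder user
  - level: CAN_RUN
    service_principal_name: my-sp-application-id  # placeholder service principal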
 
Open to other solutions.

szymon_dybczak
Esteemed Contributor III

Hi @HoussemBL ,

In Databricks there is a users group to which, by default, all workspace users belong (shown in the UI as "All workspace users"). That group has a default permission on the top-level Shared folder that cannot be revoked.
So any new folder created under the Shared folder will inherit the CAN MANAGE permission for that users group.

In your case, you need to create the folder outside the Shared folder. That way you can apply a folder ACL for whatever group or user you want, like this:

[Screenshot: setting folder permissions from the workspace UI]
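For example, the production target could point at a dedicated top-level folder instead of /Shared (a sketch only; /deployments is an arbitrary folder name that an admin would have to create first):

prod:
  mode: production
  workspace:
    root_path: /deployments/.bundle/${bundle.name}/${bundle.target}

You can then grant manage rights on that folder only to the deploying principal and read-only (or no) access to everyone else.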

 

 @szymon_dybczak If I change the `root_path` in my bundle configuration from /Shared to a different path, will all my existing pipelines and jobs be deleted and new instances created, or is it possible to retain the existing pipelines and jobs?
I am particularly concerned about DLT pipelines: if they are deleted, their associated databases and tables might also be lost, which I want to avoid.

szymon_dybczak
Esteemed Contributor III

Hi @HoussemBL ,

Good point. I suspect that when you change root_path, new resources will be created, because the Terraform state that DAB uses to track deployed resources lives in the root path.
The best way to check is to create a dummy DLT pipeline in a dev environment and deploy it. Then change the root path and deploy again. Then you will know for sure 🙂
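Something like this (the target name is just an example):

databricks bundle deploy --target dev    # deploy with the current root_path
# edit databricks.yml and change workspace.root_path for the dev target
databricks bundle deploy --target dev    # redeploy and check whether the pipeline got recreated
databricks bundle summary --target dev   # compare the deployed resource IDs before and after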
