Hey! I was deploying a new Databricks Workflow into my workspace via Databricks Asset Bundles. Currently, I have a very simple workflow, defined in a YAML file like this:
```yaml
resources:
  jobs:
    example_job:
      name: example_job
      schedule:
        quartz_cron_expression: 4 10 0 * * ?
        timezone_id: UTC
        pause_status: PAUSED
      tasks:
        - task_key: example_task
          notebook_task:
            notebook_path: ../src/notebook.ipynb
          job_cluster_key: job_cluster
          max_retries: 1
          min_retry_interval_millis: 1800000
      job_clusters:
        - job_cluster_key: job_cluster
          new_cluster:
            policy_id: ${var.job_cluster_policy_id}
            spark_version: ${var.job_cluster_spark_version}
            node_type_id: ${var.job_cluster_default_node_type}
            autoscale:
              min_workers: 2
              max_workers: 5
```
When I run `databricks bundle validate` and `databricks bundle deploy`, everything works, and the workflow is successfully deployed into my workspace. I can see the deployed workflow because I'm an admin in this workspace. However, my colleagues (who are not admins) cannot see the workflow, due to lack of permissions.
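For reference, the `${var.…}` placeholders in the YAML above are declared as bundle variables in my `databricks.yml`; a minimal sketch of that declaration (the default values here are placeholders, not my real ones):

```yaml
# databricks.yml (excerpt) - variable declarations assumed by the job YAML above;
# the defaults below are placeholders, not my real values.
variables:
  job_cluster_policy_id:
    description: Cluster policy applied to the job cluster
  job_cluster_spark_version:
    description: Databricks Runtime version for the job cluster
    default: 15.4.x-scala2.12
  job_cluster_default_node_type:
    description: Worker node type for the job cluster
    default: Standard_DS3_v2
```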
Therefore, I was looking into how I can modify this YAML file in my Asset Bundle to add the necessary permissions so that they can see the workflow in the workspace. Looking into the official documentation for the "Create new job" API endpoint (https://docs.databricks.com/api/workspace/jobs/create#access-control-list), I thought that the `access_control_list` key was the one I needed to add such permissions. So I've modified my YAML file like this:
```yaml
resources:
  jobs:
    example_job:
      name: example_job
      schedule:
        quartz_cron_expression: 4 10 0 * * ?
        timezone_id: UTC
        pause_status: PAUSED
      tasks:
        - task_key: example_task
          notebook_task:
            notebook_path: ../src/notebook.ipynb
          job_cluster_key: job_cluster
          max_retries: 1
          min_retry_interval_millis: 1800000
      job_clusters:
        - job_cluster_key: job_cluster
          new_cluster:
            policy_id: ${var.job_cluster_policy_id}
            spark_version: ${var.job_cluster_spark_version}
            node_type_id: ${var.job_cluster_default_node_type}
            autoscale:
              min_workers: 2
              max_workers: 5
      access_control_list:
        - group_name: datafarm_scitell_dt_data_engineers
          permission_level: CAN_VIEW
```

However, when I run `databricks bundle deploy` on this YAML file, the following warning appears:

```
Warning: unknown field: access_control_list
  at resources.jobs.example_job
  in jobs/example_job.yaml:30:7
```
And when I look at the workflow deployed in the workspace, its permissions are unchanged, so my colleagues still can't see the job.
How can I fix this?