Administration & Architecture

Logs from dlt-execution computes

lubiarzm1
Contributor

Hi guys!

I've run into an issue with the permission model in Databricks.
The data engineers on my team use a pipeline that runs on serverless compute. The permissions for the pipeline are configured correctly, for example as follows:

resource "databricks_permissions" "pipelines_int_usage" {
  provider   = databricks
  depends_on = [data.databricks_pipelines.pipeline, module.groups]

  for_each = { for id in local.all_pipeline_ids : id => id }

  pipeline_id = each.value

  access_control {
    group_name       = module.groups.admins_name
    permission_level = "CAN_MANAGE"
  }

  access_control {
    group_name       = module.groups.data_engineers_name
    permission_level = "CAN_RUN"
  }

  access_control {
    group_name       = module.groups.data_analysts_name
    permission_level = "CAN_VIEW"
  }

  access_control {
    group_name       = module.groups.deployers_name
    permission_level = "CAN_MANAGE"
  }

  access_control {
    group_name       = module.groups.support_name
    permission_level = "CAN_MANAGE"
  }

  access_control {
    service_principal_name = data.databricks_service_principal.spn.application_id
    permission_level       = "IS_OWNER"
  }
}


The whole configuration works correctly, but when it comes to reading the driver logs, the Data_Engineers group runs into permission issues, and even the support group is unable to access them. Where should the appropriate permissions be applied to allow these users to read the logs?

[Screenshot attachment: lubiarzm1_0-1776332612242.png]

1 ACCEPTED SOLUTION


Ashwin_DSA
Databricks Employee

Hi @lubiarzm1,

For Lakeflow Spark Declarative Pipelines on serverless compute, driver-log access isn't controlled by the databricks_permissions block on the pipeline.

By default, only the pipeline owner and workspace admins can view the driver logs, even if other users have CAN_MANAGE or CAN_RUN on the pipeline. To let your Data_Engineers and support groups read the driver logs, you must relax the log ACL via a Spark configuration setting. Check this page
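As a sketch of what that could look like in Terraform: the databricks_pipeline resource accepts a configuration map of Spark settings. The key below (spark.databricks.acl.needAdminPermissionToViewLogs) is the setting I'd expect based on the cluster-side log ACL behavior; treat it as an assumption and verify it against the linked page before relying on it.

```hcl
# Sketch only: relax the driver-log ACL on the pipeline itself.
# The config key is an assumption -- confirm it against the docs linked above.
resource "databricks_pipeline" "example" {
  name       = "my_pipeline" # hypothetical name
  serverless = true

  configuration = {
    # Allow users with access to the pipeline (not only the owner and
    # workspace admins) to view the driver logs.
    "spark.databricks.acl.needAdminPermissionToViewLogs" = "false"
  }

  # ... libraries, catalog, target, etc. as in your existing definition
}
```

Because the setting lives in the pipeline's Spark configuration rather than in its access-control list, it sits in the databricks_pipeline resource, not in the databricks_permissions block you already have.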

If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.

Regards,
Ashwin | Delivery Solution Architect @ Databricks
Helping you build and scale the Data Intelligence Platform.
***Opinions are my own***


lubiarzm1
Contributor

Thanks a lot for the help!