Databricks Asset Bundles - job-specific "run_as" user/service_principal

lilo_z
New Contributor III

Was wondering if this was possible, since a use case came up in my team. Would it be possible to use a different service principal for a single job than the one specified for that target environment? For example:

bundle:
  name: hello-bundle

resources:
  jobs:
    hello-job:
      name: hello-job
      tasks:
        - task_key: hello-task
          existing_cluster_id: 1234-567890-abcde123
          notebook_task:
            notebook_path: ./hello.py
      run_as:
        service_principal_name: principle_1

targets:
  dev:
    default: true
    run_as:
      service_principal_name: principle_2

 


3 REPLIES

Kaniz
Community Manager

Hi @lilo_z, in the scenario you've described, you want to use a different service principal for a single job than the one specified for the target environment.

Let’s break it down:

  1. Bundle configuration:
    • You have a bundle named hello-bundle.
    • Within this bundle, there's a job called hello-job.
    • The job has a single task (hello-task) associated with it.
    • The task is a notebook task, and its notebook path is ./hello.py.
  2. Service principals:
    • You've defined two service principals:
      • principle_1 (associated with the job itself).
      • principle_2 (associated with the dev target environment).
  3. Use case:
    • You want to use principle_1 for the hello-job task, even though the default service principal for the dev target environment is principle_2.
  4. Feasibility:
    • Yes, it's possible to achieve this by specifying the desired service principal at the task level, overriding the default one set for the target environment.

Here’s how you can modify your configuration to achieve this:

bundle:
  name: hello-bundle

resources:
  jobs:
    hello-job:
      name: hello-job
      tasks:
        - task_key: hello-task
          existing_cluster_id: 1234-567890-abcde123
          notebook_task:
            notebook_path: ./hello.py
          run_as:
            service_principal_name: principle_1  # Specify the desired service principal here

targets:
  dev:
    default: true
    run_as:
      service_principal_name: principle_2  # Default service principal for the dev target environment

By explicitly setting the run_as field within hello-task, you can ensure it uses principle_1 regardless of the default service principal for the dev environment. 🚀

lilo_z
New Contributor III

The above configuration isn't working as expected, @Kaniz. Is there a specific syntax or placement within the YAML file expected within the asset bundle to make the service principal/user explicit? We haven't been able to find documentation that would indicate this to be the case.

lilo_z
New Contributor III
(Accepted Solution)

Found a working solution; posting it here for anyone else hitting the same issue. The trick was to redefine "resources" under the target you want to make an exception for:

bundle:
  name: hello_bundle

include:
  - resources/*.yml

targets:

  dev:
    workspace:
      host: host_name
      root_path: root_path
    mode: dev
    run_as:
      service_principal_name: DEFAULT_SP
    resources:
      jobs:
        Hello_Job:
          name: Hello_Job
          run_as:
            user_name: SECONDARY_SP/USER
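
For context, a minimal sketch of what the included resources/*.yml file might look like (the file name, task, and notebook path below are hypothetical): the key point is that the job key Hello_Job must match, so that when the bundle is deployed (e.g. with databricks bundle deploy -t dev) the target-level block, including its run_as, is merged over this base definition.

# resources/hello_job.yml (hypothetical file name and task layout)
resources:
  jobs:
    Hello_Job:             # key must match the override under targets.dev
      name: Hello_Job
      tasks:
        - task_key: hello-task
          notebook_task:
            notebook_path: ../src/hello.py   # hypothetical path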


