Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Databricks Asset Bundle wrongfully deleting job

carolregatt
New Contributor II
Hey, so I've just started using DAB (Databricks Asset Bundles) to automatically manage job configs via CI/CD.
 
I had a previously existing job (let's say ID 123), created manually, with this config:
 
resources:
  jobs:
    My_Job_A:
      name: My Job A
 
And I wanted to automatically manage it via DAB. So I created a local YAML file with this config, which changes the job name from My Job A to A Different Name for Job A:
 
targets:
  dev:
    resources:
      jobs:
        My_Job_A:
          name: A Different Name for Job A
 
First I had to BIND this resource to my existing job, so I ran:
databricks bundle deployment bind My_Job_A 123 --target dev --auto-approve
 
And then I was able to deploy the bundle:
databricks bundle deploy --target dev
 
And it WORKED! My remote job 123 was now called A Different Name for Job A, and all of its previous runs were maintained.
However, its remote config looked like
resources:
  jobs:
    A_Different_Name_for_Job_A:
      name: A Different Name for Job A
 
and my local config still looked like this:
 
targets:
  dev:
    resources:
      jobs:
        My_Job_A:
          name: A Different Name for Job A
 
So I wrongly thought it would be best to have the keys match between local and remote, and I changed my local config to look exactly like the remote one:
targets:
  dev:
    resources:
      jobs:
        A_Different_Name_for_Job_A:
          name: A Different Name for Job A
 
HOWEVER, after deploying this change, my old job (ID 123) was DELETED and a new job (ID 456) was created in its place.
 
Is this the expected behaviour?
 
This behaviour seems very dangerous, since it's not clear how job keys, job names, and job IDs interact (or maybe I've just not seen the documentation about it?), especially since it's not possible to recover deleted jobs on my own.
1 ACCEPTED SOLUTION

Accepted Solutions

Advika
Databricks Employee

Hello @carolregatt!

In general, jobs are deleted when you either change the job key or explicitly run bundle destroy.
The job key in the YAML file uniquely identifies the job. When you change the key, it breaks the existing binding, and DAB treats it as a completely new resource. This leads to the original job being deleted and a new one created.
To avoid this, it's best to keep the YAML key the same when managing an existing job, even if you're updating the job name.
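If the key really does need to change, a safer sequence (a sketch only, reusing the job ID 123 and target names from this thread, and relying on the CLI's `bundle deployment bind`/`unbind` commands) is to unbind the old key first so the job survives in the workspace, then rebind the renamed key to the same job ID:

```shell
# Unbind the old key; the job itself stays in the workspace
# (only the bundle's link to it is removed).
databricks bundle deployment unbind My_Job_A --target dev

# After renaming the key in the local YAML, bind the new key
# to the same existing job ID.
databricks bundle deployment bind A_Different_Name_for_Job_A 123 \
  --target dev --auto-approve

# Deploy: the existing job (and its run history) is updated in place
# rather than being deleted and recreated.
databricks bundle deploy --target dev
```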


2 REPLIES


carolregatt
New Contributor II

Thanks so much for the response @Advika !
That makes sense!

Can you explain why the remote config had a different key compared to the local one? I guess that's what threw me off and made me want to change the local key to match the remote one.
