Does "databricks bundle deploy" clean up old files?

xhead
New Contributor II

I'm looking at this page (Databricks Asset Bundles development work tasks) in the Databricks documentation.

When repo assets are deployed to a Databricks workspace, it is not clear whether "databricks bundle deploy" will remove files from the target workspace that aren't in the source repo. For example, suppose a repo containing a notebook named "test1.py" had been deployed, and then "test1.py" was removed from the repo and a new notebook "test2.py" was created. What does the target workspace contain after the next deploy? I believe it will contain both "test1.py" and "test2.py".

Secondly, the description of "databricks bundle destroy" does not indicate that it will remove all files from the workspace - only that it will remove the artifacts referenced by the bundle. So if "test1.py" has been removed from the repo and "databricks bundle destroy" is run, will it only remove "test2.py" (which has not yet been deployed)?

I am trying to determine how to ensure that the shared workspace contains only the files that are in the repo - so that whatever I do in a release pipeline, the workspace ends up with only the latest assets from the repo, and none of the old files that were previously in it.

The semantics of "databricks bundle deploy" (in particular the term "deploy") would suggest to me that it should clean up assets in the target workspace as part of the deployment.

But if that is not the case, then if I did a "databricks bundle destroy" prior to the "databricks bundle deploy", would that adequately clean up the target workspace? Or do I need to do something with "databricks fs rm" to delete all the files in the target workspace folder prior to the bundle deploy?


4 REPLIES

Kaniz
Community Manager
Accepted Solution

Hi @xhead , 

When deploying repo assets to a Databricks workspace using the "databricks bundle deploy" command, it's essential to understand how it interacts with existing files in the target workspace.

Let's address your concerns:

  1. The behaviour of "databricks bundle deploy":

    • Deploying a bundle does not automatically remove files from the target workspace that are no longer present in the source repo.
    • For example, if you initially deployed a notebook named "test1.py", and later removed it from the repo while adding a new notebook "test2.py", both "test1.py" and "test2.py" will coexist in the target workspace.
    • The deployment process is additive, not subtractive.
  2. "databricks bundle destroy":

    • The purpose of "databricks bundle destroy" is to remove all previously-deployed jobs, pipelines, and artifacts that are defined in the bundle configuration files.
    • However, it does not remove other files in the workspace that were not part of the bundle.
    • So, if you run "databricks bundle destroy" after removing "test1.py" from the repo, it will only remove artifacts referenced by the bundle (if any), not other files like "test2.py".
  3. Ensuring Workspace Consistency:

    • To ensure that your shared workspace contains only the latest assets from the repo, consider the following steps (see the sketch after this list):
      • Manual Cleanup: before deploying a new bundle, manually delete any old files in the workspace that are no longer part of the repo.
      • Pre-Bundle Cleanup: run "databricks bundle destroy" before deploying a new bundle to remove previously-deployed artifacts. Optionally, use "databricks fs rm" to delete specific files or folders in the target workspace.
      • Automated Pipeline: script the cleanup process in your release pipeline - compare the files in the repo with those in the workspace and remove any discrepancies.
  4. Semantic Implications:

    • While the term "deploy" might imply cleanup, it focuses on adding or updating resources rather than removing existing ones.
    • The responsibility for workspace consistency lies with the user.
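
To make the Pre-Bundle Cleanup option concrete, here is a minimal shell sketch. It is not from the documentation: it assumes an authenticated Databricks CLI, a bundle target named "prod", and a hypothetical workspace folder /Workspace/Shared/my_bundle that matches the target's configured root_path - adjust all three to your setup.

  # Remove the jobs, pipelines, and artifacts from the previous deployment.
  databricks bundle destroy -t prod --auto-approve

  # Optionally wipe the bundle's workspace folder so stale notebooks
  # (e.g. test1.py from the example above) cannot survive.
  # The path is hypothetical and must match your target's root_path.
  databricks workspace delete /Workspace/Shared/my_bundle --recursive

  # Redeploy the current contents of the repo.
  databricks bundle deploy -t prod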

Remember to tailor your approach based on your specific requirements and workflow. Happy bundling!

fbaxter
New Contributor II

With the newer Databricks CLI (v0.215.0) this seems to be broken. Now I can't destroy a bundle if it doesn't exist - it used to be idempotent. Now I get this error (I've shortened my deploy area to <ws> below):

Starting plan computation
Planning complete and persisted at <ws>/dab-stage/pytest/.databricks/bundle/new-cluster/terraform/plan

No resources to destroy in plan. Skipping destroy!
Error: open <ws>/dab-stage/pytest/.databricks/bundle/new-cluster/terraform/terraform.tfstate: no such file or directory
make: *** [test-on-cluster] Error 1
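
Until that regression is fixed, one hedged workaround is to tolerate the failure in the pipeline so the step behaves idempotently again - a sketch, assuming a POSIX shell and that you are willing to ignore any destroy failure, not just the missing-state one:

  # Ignore the exit code when there is nothing to destroy.
  databricks bundle destroy --auto-approve || echo "destroy failed (possibly nothing to destroy); continuing"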

db_allrails
New Contributor II

Will you add a synchronization option that does not remove existing jobs and pipelines?

We are using DAB for DBT and generally it works well; however, lifecycling models is a bit of an issue at the moment 🙂

xhead
New Contributor II

One further question:

  • The purpose of "databricks bundle destroy" is to remove all previously-deployed jobs, pipelines, and artifacts that are defined in the bundle configuration files.

Which bundle configuration files? The ones in the repo? Or are there bundle configuration files in the target workspace location that are used? If the previous version of the bundle contained a reference to test1.py and it has been deployed to a shared workspace, and the new version of the repo no longer contains test1.py, will the destroy command remove test1.py from the shared workspace? 
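
One clue, though not a documented answer: fbaxter's log above shows the CLI persisting deployment state inside the target workspace, under .databricks/bundle/<target>/terraform/terraform.tfstate. If destroy works from that persisted state rather than from the repo alone, it should still know about test1.py from the previous deployment. A sketch for inspecting that state, assuming the file is exportable as a plain file and reusing the hypothetical paths from the log above:

  # Export the persisted deployment state for a target named "new-cluster".
  databricks workspace export <ws>/dab-stage/pytest/.databricks/bundle/new-cluster/terraform/terraform.tfstate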

 
