01-27-2026 10:11 PM
We need to transfer the jobs/ETL pipelines/workflows/workspace notebooks from one Azure subscription to another. Manually exporting the notebooks and jobs is not feasible, as we have hundreds of notebooks and workflows. Please suggest a suitable approach.
01-27-2026 11:02 PM
We can use Databricks Asset Bundles and Terraform.
01-27-2026 11:28 PM
@parvati_sharma8 Can you share some links that walk through the step-by-step process for this?
01-28-2026 12:40 AM
Asset Bundles are definitely a great approach, but as an alternative, if all your resources are already in Git, you can simply sync them to the workspace in the new subscription.
01-28-2026 01:29 AM
@AJ270990 - It has to be a combination of a Git repo and Asset Bundles.
Using DABs, you could keep your job/cluster definitions in ADO (Azure DevOps), and then deploy your code from ADO into the new subscription.
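As a rough illustration of this setup, a minimal `databricks.yml` can define the job once and use targets to point at the source and destination workspaces; deploying to the new subscription is then `databricks bundle deploy -t prod`. The bundle name, host URLs, notebook path, and node type below are placeholders, not values from this thread:

```yaml
# databricks.yml - illustrative sketch of a Databricks Asset Bundle
bundle:
  name: my_etl_bundle        # placeholder name

targets:
  dev:                       # source-subscription workspace (placeholder host)
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net
  prod:                      # destination-subscription workspace (placeholder host)
    workspace:
      host: https://adb-2222222222222222.2.azuredatabricks.net

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/etl_main    # placeholder path
          new_cluster:
            spark_version: 15.4.x-scala2.12
            node_type_id: Standard_DS3_v2
            num_workers: 2
```

With this in the repo, the same definition deploys to either workspace by switching the `-t` target, so the CI pipeline only needs credentials for the destination.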
01-28-2026 05:36 PM
Thanks @Raman_Unifeye. In our case we don't have ADO, so did you mean GitHub Actions?
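(For reference, GitHub Actions can play the same role as ADO here. A minimal workflow sketch, assuming the `databricks/setup-cli` action and `DATABRICKS_HOST`/`DATABRICKS_TOKEN` repository secrets for the destination workspace; the workflow name and target name are illustrative:)

```yaml
# .github/workflows/deploy-bundle.yml - illustrative sketch
name: deploy-bundle
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: databricks/setup-cli@main   # installs the Databricks CLI
      - name: Deploy bundle to new workspace
        run: databricks bundle deploy -t prod
        env:
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
```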
01-28-2026 06:01 PM
If you can't use DABs for any reason, the Terraform exporter utility would be helpful as well.
More information -
https://www.databricks.com/blog/2022/12/20/reuse-existing-workflows-through-terraform.html
https://medium.com/mphasis-datalytyx/portable-databricks-how-to-migrate-databricks-from-one-cloud-to...
01-28-2026 06:59 PM
If you don't have these resources in DABs already, writing and testing the configuration might be a good amount of work. With the Terraform exporter utility, you can export all the resources from one workspace as Terraform code and deploy them to the new workspace with considerably less effort.
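That flow looks roughly like the commands below. This is a sketch: the provider binary name, flag spellings, and supported service names can vary between provider versions (check the exporter docs for your version), and the hosts/tokens are placeholders.

```shell
# 1. Point the exporter at the SOURCE workspace (placeholder credentials).
export DATABRICKS_HOST="https://adb-<source-workspace>.azuredatabricks.net"
export DATABRICKS_TOKEN="<source-pat>"

# 2. Run the exporter built into the Databricks Terraform provider binary
#    to generate .tf files for jobs, notebooks, and clusters.
./terraform-provider-databricks exporter \
  -skip-interactive \
  -services=jobs,notebooks,compute \
  -directory=./export

# 3. Re-point the credentials at the TARGET workspace and apply the
#    generated configuration there.
export DATABRICKS_HOST="https://adb-<target-workspace>.azuredatabricks.net"
export DATABRICKS_TOKEN="<target-pat>"
cd export
terraform init
terraform plan
terraform apply
```

Review the generated `.tf` files before applying; the exporter captures the source workspace's state, so anything subscription-specific (instance pools, storage mounts, etc.) may need manual adjustment for the new subscription.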