Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Transfer of Jobs/ETL Pipelines/Workflows/Workspace Notebooks from One Subscription to Another Subscription

AJ270990
Contributor II

We need to transfer Jobs/ETL Pipelines/Workflows/Workspace Notebooks from one Azure subscription to another. Manually exporting the notebooks and jobs is not feasible, as we have hundreds of notebooks and workflows. Please suggest a suitable approach.

5 REPLIES

parvati_sharma8
New Contributor II

We can use Databricks Asset Bundles and Terraform.
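For context, a minimal sketch of what an Asset Bundle definition could look like — the bundle name, job name, notebook path, and workspace host below are placeholders for illustration, not values from this thread:

```yaml
# databricks.yml -- minimal Asset Bundle sketch (all names/paths are placeholders)
bundle:
  name: my-etl-bundle

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: ./notebooks/etl.py

targets:
  new_subscription:
    workspace:
      host: https://adb-1234567890.12.azuredatabricks.net
```

With a bundle like this checked into Git, `databricks bundle validate` checks the definition and `databricks bundle deploy -t new_subscription` deploys the jobs and notebooks into the target workspace.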

@parvati_sharma8 Can you share some links that provide a step-by-step process for doing this?

Kirankumarbs
Visitor

Asset Bundles are definitely a great approach, but as an alternative, if all your resources are already in Git, you can simply sync them to a new subscription.
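If the notebooks are not yet in Git, one way to get them out in bulk could be the Databricks Workspace REST API (`/api/2.0/workspace/list` and `/api/2.0/workspace/export`). A rough sketch, assuming `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variables point at the source workspace:

```python
import json
import os
import urllib.parse
import urllib.request

API_LIST = "/api/2.0/workspace/list"
API_EXPORT = "/api/2.0/workspace/export"

def build_url(host, endpoint, params):
    """Join the workspace host, a REST endpoint, and query parameters."""
    return host.rstrip("/") + endpoint + "?" + urllib.parse.urlencode(params)

def api_get(host, token, endpoint, params):
    """Issue an authenticated GET against the workspace API, return raw bytes."""
    req = urllib.request.Request(
        build_url(host, endpoint, params),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def export_tree(host, token, path="/", out_dir="export"):
    """Recursively download every notebook under `path` in SOURCE format."""
    listing = json.loads(api_get(host, token, API_LIST, {"path": path}))
    for obj in listing.get("objects", []):
        if obj["object_type"] == "DIRECTORY":
            export_tree(host, token, obj["path"], out_dir)
        elif obj["object_type"] == "NOTEBOOK":
            data = api_get(host, token, API_EXPORT,
                           {"path": obj["path"], "format": "SOURCE",
                            "direct_download": "true"})
            local = os.path.join(out_dir, obj["path"].lstrip("/"))
            os.makedirs(os.path.dirname(local), exist_ok=True)
            with open(local, "wb") as f:
                f.write(data)

if __name__ == "__main__":
    # Source-workspace credentials are assumed to be in these env vars.
    export_tree(os.environ["DATABRICKS_HOST"], os.environ["DATABRICKS_TOKEN"])
```

The exported tree can then be committed to a repo and synced or bundle-deployed into the workspace in the new subscription.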

Raman_Unifeye
Contributor III

@AJ270990 - It has to be the combination of a Git repo and Asset Bundles.

Using DAB, you can keep your job and cluster definitions in Azure DevOps (ADO), and then deploy the code from ADO into the workspace in the new subscription.


RG #Driving Business Outcomes with Data Intelligence