Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Transfer of Jobs/ETL Pipelines/Workflows/Workspace Notebooks from One Subscription to Another

AJ270990
Contributor III

We need to transfer jobs, ETL pipelines, workflows, and workspace notebooks from one Azure subscription to another. Manually exporting the notebooks and jobs is not feasible, as we have hundreds of notebooks and workflows. Please suggest a suitable approach.

1 ACCEPTED SOLUTION

Accepted Solutions

Try Databricks today: https://dbricks.co/3EAWLK6. This video introduces Databricks Asset Bundles (DABs) as a solution to simplify and standardize CI/CD pipelines on Databricks, and highlights the challenges of existing tools, which were often complex, incomplete, or lacked native support.
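The DAB approach in the accepted solution can be sketched with a minimal `databricks.yml`. This is a hedged sketch, not an official template: the bundle name, workspace host URLs, target names, and notebook path below are all placeholders you would replace with your own.

```yaml
# Minimal databricks.yml sketch (all names and hosts are placeholders)
bundle:
  name: my_etl_project

targets:
  old_subscription:
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
  new_subscription:
    workspace:
      host: https://adb-2222222222222222.22.azuredatabricks.net

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: ./notebooks/etl_main.py
```

Because the job and pipeline definitions live in the bundle rather than in one workspace, deploying to the other subscription is a matter of selecting a different target.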
8 REPLIES

parvati_sharma8
Databricks Partner

We can use Databricks Asset Bundles and Terraform.

@parvati_sharma8 Can you share some links that provide a step-by-step process for this?


Kirankumarbs
Contributor

Asset Bundles are definitely a great approach, but as an alternative, if all your resources are already in Git, you can simply sync them to a new subscription.
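If the resources are not yet in Git, the Databricks CLI can pull a workspace folder down so it can be committed and then pushed to the new workspace. A hedged sketch, assuming the new Databricks CLI with two configured authentication profiles (the profile names `old` and `new` and the workspace path are placeholders):

```shell
# Export notebooks from the old subscription's workspace to a local folder
databricks workspace export-dir /Workspace/Users/team ./export -p old

# (Optionally commit ./export to Git here as the source of truth)

# Import the same folder into the new subscription's workspace
databricks workspace import-dir ./export /Workspace/Users/team -p new
```

This covers notebooks and files; job and pipeline definitions would still need DABs or Terraform, since they are workspace objects rather than files.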

Raman_Unifeye
Honored Contributor III

@AJ270990 - It has to be a combination of a Git repo and Asset Bundles.

Using DABs, you could keep your job and cluster definitions in ADO (Azure DevOps), and then deploy your code from ADO into the new subscription.
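The deploy step described above can be sketched with two CLI commands. This assumes a `databricks.yml` that defines a target for the new subscription's workspace (the target name `new_subscription` is a placeholder) and that the CLI is already authenticated against that workspace; the same commands would run locally or from a CI pipeline such as ADO or GitHub Actions:

```shell
# Check that the bundle configuration resolves for the chosen target
databricks bundle validate -t new_subscription

# Create the jobs, pipelines, and notebooks in the new subscription's workspace
databricks bundle deploy -t new_subscription
```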


RG #Driving Business Outcomes with Data Intelligence

Thanks @Raman_Unifeye. In our case we don't have ADO, so did you mean GitHub Actions?

pradeep_singh
Contributor III

If you can't use DABs for any reason, the Terraform exporter utility would be helpful as well.

More information - 
https://www.databricks.com/blog/2022/12/20/reuse-existing-workflows-through-terraform.html
https://medium.com/mphasis-datalytyx/portable-databricks-how-to-migrate-databricks-from-one-cloud-to...

 

Thank You
Pradeep Singh - https://www.linkedin.com/in/dbxdev

pradeep_singh
Contributor III

If you don't have these resources in DABs already, writing and testing the configuration might be a good amount of work. With the Terraform exporter utility, you can export all the resources from one workspace as Terraform code and deploy them to the new workspace with considerably less work.
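The exporter flow above can be sketched as follows. This is a hedged sketch based on the `databricks/terraform-provider-databricks` exporter documentation: the host URL is a placeholder, the token is elided, and the exact flag names and service list may differ between provider versions, so check the docs for your version.

```shell
# 1. Run the provider binary in exporter mode against the old workspace
export DATABRICKS_HOST=https://adb-1111111111111111.11.azuredatabricks.net
export DATABRICKS_TOKEN=...   # placeholder credential

./terraform-provider-databricks exporter \
  -skip-interactive \
  -services=jobs,notebooks \
  -directory=./exported

# 2. Point the generated provider config at the new workspace, then apply
cd exported
terraform init
terraform plan
terraform apply
```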

Thank You
Pradeep Singh - https://www.linkedin.com/in/dbxdev