10-03-2023 07:03 AM
Hello, I have to export all my notebooks from DEV to PROD.
My problem is that I can't find a way to export my jobs (not the outputs, but the actual notebook schedules). Is this even possible? I have hundreds of jobs to export and need to keep the same parameters, so exporting and importing them would be much faster than recreating each one manually.
Thank you 🤠
1 ACCEPTED SOLUTION
10-03-2023 07:11 AM
You can export the complete job configuration in different ways:
- You can use the REST API: https://docs.databricks.com/api/azure/workspace/jobs/get
- You can use the Python SDK: https://github.com/databricks/databricks-sdk-py
- You can use Terraform (the Databricks Terraform provider)
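As a rough sketch of the REST API route (assuming the standard `/api/2.1/jobs/get` and `/api/2.1/jobs/create` endpoints; the helper name and the sample job below are illustrative, not from the thread): the `settings` object that `jobs/get` returns is essentially the body that `jobs/create` expects, so migrating a job is mostly a matter of dropping the read-only fields before re-posting it to the PROD workspace.

```python
import json

def job_settings_to_create_payload(get_response: dict) -> dict:
    """Turn a jobs/get response into a jobs/create request body.

    jobs/get wraps the editable configuration in "settings" alongside
    read-only fields (job_id, created_time, creator_user_name) that
    the create endpoint does not take, so we keep only the settings.
    """
    return dict(get_response["settings"])

# Illustrative (abridged) response shape from jobs/get on the DEV workspace:
dev_job = {
    "job_id": 1234,
    "creator_user_name": "me@example.com",
    "created_time": 1696320000000,
    "settings": {
        "name": "nightly-etl",
        "schedule": {
            "quartz_cron_expression": "0 0 2 * * ?",
            "timezone_id": "UTC",
        },
        "tasks": [
            {"task_key": "run", "notebook_task": {"notebook_path": "/etl"}}
        ],
    },
}

payload = job_settings_to_create_payload(dev_job)
print(json.dumps(payload, indent=2))

# To migrate for real, loop over jobs/list on DEV and POST each payload to
# PROD, e.g. with requests (hosts/tokens are placeholders):
#   requests.post(f"{prod_host}/api/2.1/jobs/create",
#                 headers={"Authorization": f"Bearer {prod_token}"},
#                 json=payload)
```

The same loop is a few lines with the Python SDK's `WorkspaceClient` (`jobs.list()` / `jobs.get()`), which handles authentication and pagination for you.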

