09-06-2024 02:09 AM
Hi Databricks support, I am looking for a standardized Databricks framework to update job definitions using DevOps, from non-production until they are productionized. Our current process for updating a Databricks job definition is as follows:
This is a cumbersome and error-prone process: there are many manual steps involved, and if we miss any step while updating the workflow, we have to start over and raise a new PR. Is there a way to offer this as a standardized, self-service framework? We sometimes have to make frequent changes on a daily basis, and the process above is not appropriate for that.
Please suggest.
09-06-2024 02:15 AM
Hi, I think this is what DABs (Databricks Asset Bundles) are for, and more recently PyDABs, which is a Pythonic way of defining DABs.
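For illustration, a minimal `databricks.yml` bundle might look like the sketch below. The bundle name, job name, notebook path, and workspace hosts are all hypothetical placeholders; the general shape (a `bundle` block, `resources.jobs`, and per-environment `targets`) is what lets one job definition be promoted from dev to prod without manual edits.

```yaml
# databricks.yml — minimal sketch; names, paths, and hosts are placeholders
bundle:
  name: my_project

resources:
  jobs:
    daily_etl:
      name: daily_etl
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/etl.py

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://dev-workspace.cloud.databricks.com
  prod:
    mode: production
    workspace:
      host: https://prod-workspace.cloud.databricks.com
```

You can then run `databricks bundle validate` and `databricks bundle deploy -t dev` locally, and have your CI pipeline run `databricks bundle deploy -t prod` after the PR is merged, so the same definition flows through every environment.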
09-19-2024 06:45 PM
Hi from the Git folders/Repos PM:
DAB is the way to go, and we are working on an integration to author DABs directly in the workspace.
Here's a DAIS talk where the DAB PM and I demo'ed some recommendations for source controlling jobs: https://www.databricks.com/dataaisummit/session/path-production-databricks-project-cicd-seamless-inn...