Community Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Delta Live Tables and Git

Henrik
New Contributor III

The notebooks that run in a Delta Live Tables pipeline are Git-enabled, but what about the pipeline itself?

I'm looking for a good way to deploy from DEV to TEST and from TEST to PROD that deploys not just the notebooks but also the pipeline.

What possibilities do I have?

1 ACCEPTED SOLUTION

Kaniz_Fatma
Community Manager

Hi @Henrik,

- Delta Live Tables (DLT) pipelines can be managed and deployed with the Databricks REST API (the Pipelines API).
- The API lets you create, edit, delete, start, and view the details of a pipeline.
- Develop and test your DLT pipelines in the DEV environment, then use the API to create and start the pipeline in TEST.
- The full pipeline configuration, including the cluster configuration, pipeline mode, storage location, and notifications, can be set through the API, so the same pipeline spec can be promoted from environment to environment.
- After testing, use the same API calls to create and start the pipeline in PROD; editing and deleting pipelines works the same way.
- Git handles version control of the notebooks themselves, and Databricks also supports running jobs from notebooks located in a remote Git repository.
- Sketches of creating and starting a pipeline, and of running a job from a Git repository, are shown below.
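A minimal sketch of the create-and-start flow in Python, using the Pipelines REST API (POST /api/2.0/pipelines to create, POST /api/2.0/pipelines/{pipeline_id}/updates to start). The workspace host, token, notebook path, pipeline name, and cluster size are placeholder assumptions, not values from this thread:

```python
import os
import requests

# Placeholder assumptions: set these per environment (DEV/TEST/PROD).
HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token for that workspace
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def create_pipeline(spec: dict) -> str:
    """Create a DLT pipeline from a JSON spec and return its pipeline_id."""
    resp = requests.post(f"{HOST}/api/2.0/pipelines", headers=HEADERS, json=spec)
    resp.raise_for_status()
    return resp.json()["pipeline_id"]

def start_update(pipeline_id: str) -> str:
    """Trigger a run (an 'update') of the pipeline and return the update_id."""
    resp = requests.post(f"{HOST}/api/2.0/pipelines/{pipeline_id}/updates", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["update_id"]

# The spec is plain JSON, so it can live in Git next to the notebooks,
# with only the environment-specific fields swapped per deployment target.
spec = {
    "name": "sales_pipeline_test",                    # hypothetical name
    "storage": "dbfs:/pipelines/sales_test",          # hypothetical storage location
    "target": "sales_test",                           # target schema
    "continuous": False,                              # triggered (non-continuous) mode
    "libraries": [{"notebook": {"path": "/Repos/test/etl/sales_dlt"}}],
    "clusters": [{"label": "default", "num_workers": 2}],
}

pipeline_id = create_pipeline(spec)
print(f"Created pipeline {pipeline_id}, started update {start_update(pipeline_id)}")
```

Promoting from TEST to PROD is then a matter of pointing the same script at the PROD workspace with the PROD version of the spec; PUT /api/2.0/pipelines/{pipeline_id} edits an existing pipeline and DELETE /api/2.0/pipelines/{pipeline_id} removes one.

For the Git side, a job can run its notebooks directly from a remote repository via the Jobs API's git_source field. Again a sketch under assumptions: the repo URL, branch, notebook path, and cluster settings are hypothetical, and HOST/HEADERS are reused from the snippet above:

```python
job_spec = {
    "name": "sales_etl_from_git",                  # hypothetical job name
    "git_source": {
        "git_url": "https://github.com/acme/etl",  # hypothetical repository
        "git_provider": "gitHub",
        "git_branch": "main",
    },
    "tasks": [{
        "task_key": "run_notebook",
        "notebook_task": {
            "path": "notebooks/sales_dlt",         # path relative to the repo root
            "source": "GIT",
        },
        "new_cluster": {                           # hypothetical cluster settings
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 1,
        },
    }],
}

# Reuses requests, HOST, and HEADERS from the previous sketch.
resp = requests.post(f"{HOST}/api/2.1/jobs/create", headers=HEADERS, json=job_spec)
resp.raise_for_status()
print("Created job", resp.json()["job_id"])
```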

