Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Connect Azure DevOps pipeline to Private Link Databricks Workspace

JJ11
New Contributor

Hello, I have been trying to get a pipeline in Azure DevOps to communicate with a Databricks workspace that uses a Private Link connection. I tried setting up a service connection using a service principal that is also attached to the workspace; however, that did not work. Any suggestions or guidance?

1 REPLY

Kaniz_Fatma
Community Manager

Hi @JJ11, integrating Azure DevOps with an Azure Databricks workspace that uses a Private Link connection can be a bit tricky, but I’ll guide you through the process.

Here are some steps and suggestions to help you achieve this:

  1. Azure Private Link Support for Azure Databricks:

    • A workspace with front-end Private Link is reachable only from networks that can route to its private endpoint. Microsoft-hosted Azure DevOps agents run outside your virtual network, so they typically cannot reach the workspace at all.
    • Run the pipeline on a self-hosted agent (a VM or scale set) deployed in the workspace’s VNet, or in a VNet peered to it, so the agent can resolve and reach the private endpoint.

  2. Service Connection in Azure DevOps:

    • You mentioned that you’ve set up a service connection using a service principal. This is the right approach.
    • Double-check the following:
      • Ensure that the service principal has the necessary permissions to access the Databricks workspace.
      • Verify that the service principal is correctly attached to the workspace.
      • Confirm that the service connection details (such as client ID, secret, and tenant ID) are accurate.
      • Test the service connection to ensure it’s working as expected.
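The service-connection checks above can also be exercised outside the pipeline by confirming that the service principal can obtain a Microsoft Entra ID token for Azure Databricks via the client-credentials flow. A minimal sketch (the tenant ID, client ID, and secret below are placeholders; `2ff814a6-3304-4ab8-85cb-cd0e6f879c1d` is the well-known application ID of the Azure Databricks resource):

```python
# Sketch: build the client-credentials token request for Azure Databricks.
# TENANT_ID / CLIENT_ID / CLIENT_SECRET are illustrative placeholders.
import urllib.parse

# Well-known application ID of the Azure Databricks resource.
DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Return the (url, form_body) pair for the token request.

    POSTing form_body to url (e.g. with urllib.request or requests)
    should return a JSON document containing an access_token.
    """
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{DATABRICKS_RESOURCE_ID}/.default",
    })
    return url, body

url, body = build_token_request("TENANT_ID", "CLIENT_ID", "CLIENT_SECRET")
```

If this request fails with an authentication error, the problem is the credentials themselves; if it succeeds but the workspace is still unreachable, the problem is more likely network connectivity (see the Private Link notes above).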
  3. CI/CD Workflow:

    • Follow the recommended CI/CD workflow for Databricks development with Azure DevOps:
      1. Create or use an existing repository with your third-party Git provider (e.g., GitHub, GitLab).
      2. Connect your local development machine to the same repository.
      3. Pull any existing artifacts (notebooks, code files, build scripts) to your local machine.
      4. Develop, update, and test artifacts locally.
      5. Push changes back to the repository.
      6. Set up an Azure DevOps pipeline to automatically build, test, and deploy to your Databricks workspace.
    • Configure your pipeline to trigger on specific events (e.g., repository pull requests) to keep your development workflow seamless.
  4. Pipeline Definition:

    • In your Azure DevOps pipeline, define the necessary stages:
      • Checkout: Pull the code from your repository.
      • Build: Compile and package your artifacts.
      • Test: Run any required tests.
      • Deploy to Databricks: Use the Databricks REST API to deploy your artifacts to the workspace.
    • Make sure to handle secrets (such as the service principal credentials) securely using Azure DevOps secrets or variable groups.
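The deploy stage above can be sketched as a short script. This is an illustrative sketch, not the only approach: it assumes the workspace URL and token arrive via pipeline secrets or a variable group (the names `DATABRICKS_HOST` and `DATABRICKS_TOKEN` are assumptions), and uses the documented `/api/2.0/workspace/import` endpoint, which takes base64-encoded notebook content:

```python
# Sketch: deploy a notebook to a workspace via the Databricks REST API.
# Credentials are read from environment variables populated by the
# pipeline (never hard-coded).
import base64
import json

def import_notebook_request(host: str, token: str,
                            local_path: str, workspace_path: str):
    """Build the (url, headers, json_body) for a workspace/import call."""
    with open(local_path, "rb") as f:
        content = base64.b64encode(f.read()).decode("ascii")
    url = f"{host}/api/2.0/workspace/import"
    headers = {"Authorization": f"Bearer {token}"}
    body = json.dumps({
        "path": workspace_path,
        "format": "SOURCE",
        "language": "PYTHON",
        "content": content,
        "overwrite": True,
    })
    return url, headers, body

# In the deploy stage, POSTing body to url with these headers performs
# the import, e.g.:
#   requests.post(url, headers=headers, data=body).raise_for_status()
```

For larger projects, the Databricks CLI or Databricks Asset Bundles wrap these same API calls and are usually easier to maintain than raw requests.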
  5. Databricks Workspace Configuration:

    • Confirm that the workspace’s Private Link settings and any IP access lists permit traffic from the network where your pipeline agent runs, and that the service principal has been added to the workspace with the entitlements it needs.

  6. Debugging:

    • If you encounter issues, check the Azure DevOps pipeline logs for any errors.
    • Verify that the Databricks workspace logs provide relevant information.
    • Test the pipeline step by step to identify the point of failure.
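A quick way to localise the failure is to check, from the machine running the pipeline agent, whether the workspace hostname resolves to a private address. With Private Link and private DNS configured correctly it should resolve into a private range; a public IP usually means the agent is not using the private endpoint. A small sketch (the workspace hostname is a placeholder):

```python
# Sketch: check whether the workspace URL resolves to a private IP
# from the machine running the pipeline agent.
import ipaddress
import socket

def is_private(ip: str) -> bool:
    """True if the address falls in a private (RFC 1918 / ULA) range."""
    return ipaddress.ip_address(ip).is_private

def resolve(hostname: str) -> list[str]:
    """Resolve a hostname to the set of IP addresses it maps to."""
    infos = socket.getaddrinfo(hostname, 443, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

# Example (placeholder hostname, run from the agent):
# for ip in resolve("adb-1234567890123456.7.azuredatabricks.net"):
#     print(ip, "private" if is_private(ip) else "PUBLIC - check private DNS")
```

If the name resolves publicly from the agent, fix the private DNS zone link or agent VNet placement before debugging anything at the pipeline level.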

Remember that CI/CD is a design pattern, and the steps outlined here can be adapted to other CI/CD tools. Good luck with your integration, and feel free to ask if you need further assistance! 🚀

 