Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Cannot import-dir from Azure DevOps, but works fine locally.

tirato
New Contributor II

Hello,

As I'm trying to set up CI/CD for the project, I'm finding myself stuck.

I tried to upload the notebooks from my Azure DevOps release and got a 403 Forbidden response.

I ran 'cat ~/.databrickscfg' and compared it with the local config I have; they matched.

The problem is the 'databricks workspace import-dir' command: locally it runs fine, but in Azure DevOps it fails.

Any clue why this could be happening? The odd part is that it raises the auth error on Azure DevOps but not locally.
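For reference, my ~/.databrickscfg follows the usual profile layout; the values below are placeholders, not my real host or token:

  [DEFAULT]
  host  = https://adb-1234567890123456.7.azuredatabricks.net
  token = dapiXXXXXXXXXXXXXXXXXXXXXXXX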

ACCEPTED SOLUTION

Anonymous
Not applicable

@André Filipe Rosário Lima:

It's possible that the authentication credentials you're using to connect to Databricks are not properly set up in your Azure DevOps environment. Here are a few things to check:

  1. Make sure that you have created a Databricks access token with the appropriate permissions to perform the import operation in your Azure DevOps release pipeline.
  2. Check that you're passing the access token to Databricks correctly in your import command. You can use the --token flag followed by the token value, or set the DATABRICKS_TOKEN environment variable to the token value (see the sketch after this list).
  3. Verify that the Databricks workspace URL you're using in your import command matches the URL you're using locally. If you're using different Databricks environments, they may have different URLs.
  4. Check that your network settings allow traffic from Azure DevOps to the Databricks workspace. Depending on your network setup, there may be firewall rules that are blocking traffic from your Azure DevOps environment.
  5. If none of these steps resolve the issue, try enabling debug logging in your import command to see if it provides more information about the error. You can do this by adding the --debug flag to your command.
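As a concrete example, a minimal Bash step for an Azure DevOps release might look like the sketch below. It assumes the token is stored in a secret pipeline variable (here called DATABRICKS_PAT, a name made up for illustration) that has been mapped into the step's environment; the workspace URL and notebook paths are placeholders:

  # Sketch of a pipeline step; DATABRICKS_PAT is a hypothetical secret
  # variable, and the host URL and paths are placeholders.
  export DATABRICKS_HOST="https://adb-1234567890123456.7.azuredatabricks.net"
  export DATABRICKS_TOKEN="$DATABRICKS_PAT"

  # Quick smoke test: listing the workspace root fails fast with the same
  # 403 if the token or host is wrong, before the longer import runs.
  databricks workspace list /

  # The actual import, with debug logging enabled (step 5).
  databricks workspace import-dir ./notebooks /Shared/project --debug

Note that Azure DevOps does not expose secret variables as environment variables automatically; you have to map them explicitly in the task definition, which is a common reason the token is empty in the pipeline even though it works locally.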

Hopefully, one of these steps will help you resolve the authentication issue and successfully import your notebooks in your Azure DevOps release pipeline.


3 REPLIES


Anonymous
Not applicable

Hi @André Filipe Rosário Lima,

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance! 

valeryuaba
New Contributor III

Hey everyone! 😍

I can totally relate to the frustration of encountering authentication issues when setting up a CI/CD pipeline. It's great that you're able to import the notebooks locally, but facing difficulties on Azure DevOps can be quite puzzling.

From what you've described, it seems like there might be an issue with the authentication setup in your Azure DevOps environment. Here are a few suggestions to help you troubleshoot:

First, double-check that you have created a Databricks access token in Azure DevOps and that it has the necessary permissions for the import operation. Permissions can sometimes be a tricky thing, so it's worth confirming.

Next, ensure that you're passing the access token correctly to Databricks in your import command. The --token flag or the DATABRICKS_TOKEN environment variable should be set properly.

It's also important to verify that the Databricks workspace URL you're using in your import command matches the one you're using locally. Inconsistent URLs can lead to authentication failures.

Additionally, review your network settings to make sure there are no firewall rules blocking traffic between Azure DevOps and the Databricks workspace. This step is often overlooked but can cause issues.


If you're still encountering problems, enabling debug logging in your import command might provide more insights. The --debug flag can be handy for this purpose.
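To make those last two checks concrete, here's a rough sketch of what I'd add to the failing pipeline step; the paths and the log file name are just placeholders I picked:

  # Confirm which host the CLI will talk to (echo the host, never the token).
  echo "Using host: $DATABRICKS_HOST"

  # Re-run the import with debug logging and keep the log for inspection;
  # the request/response details usually show why the 403 happens.
  databricks workspace import-dir ./notebooks /Shared/project --debug 2> import-debug.log || true
  tail -n 50 import-debug.log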

If you're a DevOps enthusiast like me, it's worth keeping an eye on new technologies! I found this article on the role of AI in DevOps informative: AI in DevOps: Unleashing the Power for Exceptional Software Results.
