05-09-2023 06:45 AM
Hello,
As I'm trying to set up CI/CD for the project, I'm finding myself stuck.
I tried to upload the notebooks from my Azure DevOps release pipeline and I'm getting a 403 Forbidden error.
I ran 'cat ~/.databrickscfg' and compared it with the local config that I have. They matched.
The problem is the 'databricks workspace import-dir' command: locally it runs fine, but in Azure DevOps it fails.
Any clue why this could be happening? The odd part is that it only raises the auth error in Azure DevOps, not locally.
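For reference, this is roughly what the config and command look like on my side (host, token, and paths below are placeholders):

    # ~/.databrickscfg (token redacted)
    [DEFAULT]
    host = https://adb-1234567890123456.7.azuredatabricks.net
    token = dapiXXXXXXXXXXXXXXXX

    # upload the local notebooks folder into the workspace
    databricks workspace import-dir ./notebooks /Shared/notebooks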
05-13-2023 08:40 AM
@André Filipe Rosário Lima :
It's possible that the authentication credentials you're using to connect to Databricks are not properly set up in your Azure DevOps environment. Here are a few things to check:
1. The Databricks personal access token is stored as a secret pipeline variable and is actually exposed to the task that runs the CLI (a sketch follows below).
2. The token has not expired and belongs to a user or service principal with permission to write to the target workspace folder.
3. The workspace host URL configured on the agent is exactly the same one you use locally.
Hopefully, one of these steps will help you resolve the authentication issue and successfully import your notebooks in your Azure DevOps release pipeline.
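For point 1, a minimal sketch of the script step, assuming the token is kept in a secret pipeline variable named DATABRICKS_PAT (the variable name and host here are assumptions):

    # Secret variables are not exported to the environment automatically;
    # Azure DevOps expands $(DATABRICKS_PAT) textually before the script runs
    export DATABRICKS_HOST='https://adb-1234567890123456.7.azuredatabricks.net'
    export DATABRICKS_TOKEN='$(DATABRICKS_PAT)'
    databricks workspace import-dir ./notebooks /Shared/notebooks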
05-20-2023 10:03 PM
Hi @André Filipe Rosário Lima
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance!
07-17-2023 04:30 AM
Hey everyone! 😍
I can totally relate to the frustration of encountering authentication issues when setting up a CI/CD pipeline. It's great that you're able to import the notebooks locally, but facing difficulties on Azure DevOps can be quite puzzling.
From what you've described, it seems like there might be an issue with the authentication setup in your Azure DevOps environment. Here are a few suggestions to help you troubleshoot:
First, double-check that you have created a Databricks access token in Azure DevOps and that it has the necessary permissions for the import operation. Permissions can sometimes be a tricky thing, so it's worth confirming.
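If you want to confirm the token is valid independently of the CLI, you can hit the workspace REST API directly; a sketch, assuming DATABRICKS_HOST and DATABRICKS_TOKEN are already exported:

    # a JSON listing back means the token authenticates;
    # a 403 here reproduces the same failure outside the CLI
    curl -s -H "Authorization: Bearer $DATABRICKS_TOKEN" \
      "$DATABRICKS_HOST/api/2.0/workspace/list?path=/"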
Next, ensure that you're passing the access token correctly to Databricks in your import command. The --token flag or the DATABRICKS_TOKEN environment variable should be set properly.
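A quick sanity check that the variables actually reach the pipeline job (secret values are masked in the logs, so this is safe to print):

    # confirm the host is set and the token is non-empty before importing
    echo "host: ${DATABRICKS_HOST:-<unset>}"
    [ -n "$DATABRICKS_TOKEN" ] && echo "token: set" || echo "token: missing"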
It's also important to verify that the Databricks workspace URL you're using in your import command matches the one you're using locally. Inconsistent URLs can lead to authentication failures.
Additionally, review your network settings to make sure there are no firewall rules blocking traffic between Azure DevOps and the Databricks workspace. This step is often overlooked but can cause issues.
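To rule out a network problem from the agent itself, a simple reachability probe (any HTTP status back means the route is open; a timeout points at firewall or VNet rules):

    curl -s -o /dev/null -w "%{http_code}\n" --max-time 10 "$DATABRICKS_HOST"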
If you're still encountering problems, enabling debug logging in your import command might provide more insights. The --debug flag can be handy for this purpose.
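For example (the exact flag spelling depends on your CLI version, so treat this as a sketch):

    # --debug logs the HTTP requests the CLI sends, including the resolved
    # host and auth method, which usually pinpoints a 403
    databricks --debug workspace import-dir ./notebooks /Shared/notebooks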