Data Engineering

Unable to upload a wheel file in Azure DevOps pipeline

vvk
New Contributor II

Hi, I am trying to upload a wheel file to a Databricks workspace using an Azure DevOps release pipeline, so that it can be used on an interactive cluster. I tried the "databricks workspace import" command, but it does not appear to support .whl files. I therefore tried uploading the wheel file to a Unity Catalog volume with the "databricks fs cp" command instead.
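For reference, here is roughly the upload step I am running; the file name and volume path below are placeholders, not the exact ones from my pipeline:

# Copy the built wheel into a Unity Catalog volume (placeholder paths)
databricks fs cp dist/my_package-0.1.0-py3-none-any.whl dbfs:/Volumes/<catalog_name>/<schema_name>/<volume_name>/my_package-0.1.0-py3-none-any.whl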

This works in my local CLI setup, but fails in the DevOps pipeline with the authorization error "Authorization failed. Your token may be expired or lack the valid scope". I am using the access token of a service principal (SP) that has full access to the catalog, in both the DevOps pipeline and the local CLI setup.
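For context, the pipeline authenticates the CLI through environment variables, roughly as sketched below; the host is a placeholder and the token comes from a pipeline secret variable:

# Hypothetical auth setup inside the release pipeline script step
export DATABRICKS_HOST=https://adb-0000000000000000.0.azuredatabricks.net
export DATABRICKS_TOKEN=$(SP_ACCESS_TOKEN)  # Azure DevOps expands $(...) before the script runs
databricks fs cp dist/my_package-0.1.0-py3-none-any.whl dbfs:/Volumes/<catalog_name>/<schema_name>/<volume_name>/

Any ideas would be greatly appreciated.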

1 REPLY

vvk
New Contributor II

Hi,

I don't think there is any other issue with the pipeline setup, as I am able to perform other actions successfully (e.g. importing notebooks with "databricks workspace import_dir"). Only "fs cp" to the volume throws the authentication error. I double-checked and can confirm that the service principal has full access to the catalog where the volume resides. Debug output shows that the following API call returns an HTTP 403 error:
GET /api/2.0/dbfs/get-status?path=dbfs:/Volumes/<catalog_name>/bronze/libraries
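For what it's worth, that get-status endpoint belongs to the DBFS API, whereas Unity Catalog volumes are served by the Files API, so my guess (unconfirmed) is that the CLI version in the pipeline is older and routes /Volumes paths through the DBFS endpoints. As a rough sketch, the same upload could be attempted directly against the Files API; the workspace URL and file names below are placeholders:

# Hypothetical direct upload via the Files API, bypassing the CLI's routing
curl -X PUT "https://<workspace-url>/api/2.0/fs/files/Volumes/<catalog_name>/bronze/libraries/my_package-0.1.0-py3-none-any.whl" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  --data-binary @dist/my_package-0.1.0-py3-none-any.whl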
