Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Unable to upload a wheel file in Azure DevOps pipeline

vvk
New Contributor II

Hi, I am trying to upload a wheel file to a Databricks workspace using an Azure DevOps release pipeline so that it can be used on an interactive cluster. I tried the "databricks workspace import" command, but it does not appear to support .whl files, so I tried uploading the wheel file to a Unity Catalog volume with the "databricks fs cp" command instead.

This works in my local CLI setup, but it fails in the DevOps pipeline with the authorization error "Authorization failed. Your token may be expired or lack the valid scope". In both the DevOps pipeline and the local CLI setup I am using the access token of a service principal that has full access to the catalog, yet the command only fails in the pipeline. Any ideas would be greatly appreciated.
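For context, a minimal sketch of such a pipeline step, assuming the Databricks CLI is installed on the agent and the service principal's token is mapped into the step's environment from a pipeline secret (the host URL, variable name, wheel name, and exact volume path are illustrative placeholders):

# Sketch of the script step body (placeholders throughout).
# DATABRICKS_HOST and DATABRICKS_TOKEN are the standard environment variables the
# Databricks CLI reads for authentication; the token would come from a pipeline secret.
export DATABRICKS_HOST="https://adb-1234567890123456.7.azuredatabricks.net"
export DATABRICKS_TOKEN="$SP_ACCESS_TOKEN"

# Copy the built wheel into a Unity Catalog volume so it can be installed on the cluster.
databricks fs cp \
  dist/my_package-0.1.0-py3-none-any.whl \
  dbfs:/Volumes/<catalog_name>/bronze/libraries/my_package-0.1.0-py3-none-any.whl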

2 REPLIES

vvk
New Contributor II

Hi,

I don't think there is any other issue with the pipeline setup, as I am able to perform other actions successfully (e.g. importing notebooks with databricks workspace import_dir). Only fs cp to the volume throws the authorization error. I double-checked and can confirm that the service principal has full access to the catalog where the volume resides. Debug output shows that the following API call returns an HTTP 403 error:

GET /api/2.0/dbfs/get-status?path=dbfs:/Volumes/<catalog_name>/bronze/libraries
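For reference, that failing call can be reproduced outside the CLI with plain curl against the same endpoint, using the same host and token the pipeline uses, which helps separate a token or identity problem from a CLI problem (host and token below are placeholders):

# Reproduce the CLI's get-status call directly; a 403 here with the pipeline's
# token points at the token or the principal's grants rather than at the CLI.
curl -sS \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  "$DATABRICKS_HOST/api/2.0/dbfs/get-status?path=dbfs:/Volumes/<catalog_name>/bronze/libraries"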

Satyadeepak
Databricks Employee

Hi @vvk - The HTTP 403 error typically indicates a permissions issue. Ensure that the service principal has the privileges needed to perform the fs cp operation on the specified path, and verify that the path specified in the fs cp command is correct and that the volume exists.
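As a concrete sketch of those checks, run with the same credentials the pipeline uses and assuming a recent Databricks CLI (the path is taken from the debug output above; grants on the catalog, schema, and volume themselves can be confirmed in Catalog Explorer or with SHOW GRANTS):

# Confirm which principal the token actually authenticates as; a common CI surprise
# is that it differs from the identity used in the local CLI setup.
databricks current-user me

# Confirm the volume path exists and is visible to that principal.
databricks fs ls dbfs:/Volumes/<catalog_name>/bronze/libraries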