<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Unable to upload a wheel file in Azure DevOps pipeline in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/unable-to-upload-a-wheel-file-in-azure-devops-pipeline/m-p/63470#M32246</link>
    <description>&lt;P&gt;Hi, I am trying to upload a wheel file to a Databricks workspace from an Azure DevOps release pipeline so that it can be used on an interactive cluster. I tried the "databricks workspace import" command, but it does not appear to support .whl files. I therefore tried to upload the wheel file to a Unity Catalog volume with the "databricks fs cp" command.&lt;/P&gt;&lt;P&gt;This works in my local CLI setup, but it fails in the DevOps pipeline with the authorization error "Authorization failed. Your token may be expired or lack the valid scope". I am using the access token of a service principal (SP) that has full access to the catalog, in both the DevOps pipeline and the local CLI setup. Any ideas would be greatly appreciated.&lt;/P&gt;</description>
    <pubDate>Wed, 13 Mar 2024 07:24:15 GMT</pubDate>
    <dc:creator>vvk</dc:creator>
    <dc:date>2024-03-13T07:24:15Z</dc:date>
    <item>
      <title>Unable to upload a wheel file in Azure DevOps pipeline</title>
      <link>https://community.databricks.com/t5/data-engineering/unable-to-upload-a-wheel-file-in-azure-devops-pipeline/m-p/63470#M32246</link>
      <description>&lt;P&gt;Hi, I am trying to upload a wheel file to a Databricks workspace from an Azure DevOps release pipeline so that it can be used on an interactive cluster. I tried the "databricks workspace import" command, but it does not appear to support .whl files. I therefore tried to upload the wheel file to a Unity Catalog volume with the "databricks fs cp" command.&lt;/P&gt;&lt;P&gt;This works in my local CLI setup, but it fails in the DevOps pipeline with the authorization error "Authorization failed. Your token may be expired or lack the valid scope". I am using the access token of a service principal (SP) that has full access to the catalog, in both the DevOps pipeline and the local CLI setup. Any ideas would be greatly appreciated.&lt;/P&gt;
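&lt;P&gt;For context, this is roughly the release-pipeline step (a minimal sketch; the wheel name and the pipeline variable names are placeholders, and $CATALOG_NAME stands in for our actual catalog):&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;# Bash task in the Azure DevOps release stage (illustrative sketch).
# DATABRICKS_HOST_URL and SP_ACCESS_TOKEN are hypothetical pipeline
# variables; $(...) is Azure DevOps macro syntax, expanded before bash runs.
export DATABRICKS_HOST='$(DATABRICKS_HOST_URL)'
export DATABRICKS_TOKEN='$(SP_ACCESS_TOKEN)'
CATALOG_NAME='$(CATALOG_NAME)'   # catalog name redacted in this post

# Copy the built wheel into the Unity Catalog volume
databricks fs cp "dist/my_package-1.0.0-py3-none-any.whl" \
  "dbfs:/Volumes/$CATALOG_NAME/bronze/libraries/my_package-1.0.0-py3-none-any.whl"&lt;/CODE&gt;&lt;/PRE&gt;</description>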
      <pubDate>Wed, 13 Mar 2024 07:24:15 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/unable-to-upload-a-wheel-file-in-azure-devops-pipeline/m-p/63470#M32246</guid>
      <dc:creator>vvk</dc:creator>
      <dc:date>2024-03-13T07:24:15Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to upload a wheel file in Azure DevOps pipeline</title>
      <link>https://community.databricks.com/t5/data-engineering/unable-to-upload-a-wheel-file-in-azure-devops-pipeline/m-p/63526#M32264</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I don't think there is any other issue with the pipeline setup, as I am able to perform other actions successfully (e.g. importing notebooks with &lt;CODE&gt;databricks workspace import_dir&lt;/CODE&gt;). Only &lt;CODE&gt;fs cp&lt;/CODE&gt; to the volume throws the authorization error. I double-checked and can confirm that the SP has full access to the catalog where the volume resides. Debug output shows that the following API call returns an HTTP 403 error:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;GET /api/2.0/dbfs/get-status?path=dbfs:/Volumes/&amp;lt;catalog_name&amp;gt;/bronze/libraries&lt;/CODE&gt;&lt;/PRE&gt;
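&lt;P&gt;To separate a token problem from a CLI problem, the same request can be replayed directly with curl using the pipeline's token (a quick sketch; the host and catalog variables are the same placeholders as in the pipeline step above):&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;# Replay the failing request with the exact token the pipeline uses.
# DATABRICKS_HOST / DATABRICKS_TOKEN / CATALOG_NAME as in the sketch above.
curl -sS -o /dev/null -w "HTTP %{http_code}\n" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  "$DATABRICKS_HOST/api/2.0/dbfs/get-status?path=dbfs:/Volumes/$CATALOG_NAME/bronze/libraries"

# The CLI itself can also log the underlying HTTP calls:
databricks fs ls "dbfs:/Volumes/$CATALOG_NAME/bronze/libraries" --debug&lt;/CODE&gt;&lt;/PRE&gt;</description>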
      <pubDate>Wed, 13 Mar 2024 11:38:20 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/unable-to-upload-a-wheel-file-in-azure-devops-pipeline/m-p/63526#M32264</guid>
      <dc:creator>vvk</dc:creator>
      <dc:date>2024-03-13T11:38:20Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to upload a wheel file in Azure DevOps pipeline</title>
      <link>https://community.databricks.com/t5/data-engineering/unable-to-upload-a-wheel-file-in-azure-devops-pipeline/m-p/106540#M42513</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/102226"&gt;@vvk&lt;/a&gt; - The HTTP 403 error typically indicates a permissions issue. Ensure that the SP has the necessary privileges to perform the &lt;CODE&gt;fs cp&lt;/CODE&gt; operation on the specified path; for a Unity Catalog volume this means USE CATALOG and USE SCHEMA on the parent objects plus READ VOLUME and WRITE VOLUME on the volume itself. Also verify that the path specified in the &lt;CODE&gt;fs cp&lt;/CODE&gt; command is correct and that the volume exists.&lt;/P&gt;
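&lt;P&gt;For example, two quick checks (a sketch assuming the new unified Databricks CLI and the placeholder names used earlier in this thread):&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;# 1. Confirm the volume exists and the SP can list it
databricks fs ls "dbfs:/Volumes/$CATALOG_NAME/bronze/libraries"

# 2. Inspect the grants on the volume (run as a principal allowed to read grants;
#    the three-level name follows from the /Volumes path: catalog.schema.volume)
databricks grants get volume "$CATALOG_NAME.bronze.libraries"&lt;/CODE&gt;&lt;/PRE&gt;</description>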
      <pubDate>Tue, 21 Jan 2025 18:53:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/unable-to-upload-a-wheel-file-in-azure-devops-pipeline/m-p/106540#M42513</guid>
      <dc:creator>Satyadeepak</dc:creator>
      <dc:date>2025-01-21T18:53:40Z</dc:date>
    </item>
  </channel>
</rss>

