<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: DLT Pipeline unable to find custom Libraries/Wheel packages in Get Started Discussions</title>
    <link>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/43960#M965</link>
    <description>&lt;P&gt;FWIW "Unrestricted Single User" clusters work fine - shared compute of any description appears to run into this issue.&lt;/P&gt;</description>
    <pubDate>Thu, 07 Sep 2023 11:58:03 GMT</pubDate>
    <dc:creator>ColibriMike</dc:creator>
    <dc:date>2023-09-07T11:58:03Z</dc:date>
    <item>
      <title>DLT Pipeline unable to find custom Libraries/Wheel packages</title>
      <link>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/38987#M646</link>
      <description>&lt;P&gt;We have a DLT pipeline and need to import our custom libraries, packaged as wheel files.&lt;/P&gt;&lt;P&gt;We are on Azure DBX and use Azure DevOps CI/CD to build and deploy the wheel packages to our DBX environment.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;At the top of our DLT notebook we import the wheel package as follows:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;%pip install /dbfs/Libraries/whls/{wheel_file_name}.whl&lt;/LI-CODE&gt;&lt;P&gt;On execution of the pipeline we get the error below:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;CalledProcessError: Command 'pip --disable-pip-version-check install /dbfs/Libraries/whls/{wheel_file_name}.whl' returned non-zero exit status 1.,None,Map(),Map(),List(),List(),Map())&lt;/LI-CODE&gt;&lt;P&gt;And the logs show that the file is not accessible:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Python interpreter will be restarted.
WARNING: Requirement '/dbfs/Libraries/whls/{wheel_file_name}.whl' looks like a filename, but the file does not exist
Processing /dbfs/Libraries/whls/{wheel_file_name}.whl
ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory: '/dbfs/Libraries/whls/{wheel_file_name}.whl'&lt;/LI-CODE&gt;&lt;P&gt;This is even though the file exists when checked from the DBFS Explorer UI.&lt;/P&gt;&lt;P&gt;We tried to list the folders and files accessible to the DLT pipeline node and got the following:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Files in the ROOT Directory: ['mnt', 'tmp', 'local_disk0', 'dbfs', 'Volumes', 'Workspace', . . . . .]

Files in the ROOT/dbfs Directory: []&lt;/LI-CODE&gt;&lt;P&gt;As you can see, dbfs appears empty, even though its folders and files are visible and accessible from the DBFS Explorer UI portal.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Volumes and Workspace files are accessible from the pipeline, but:&lt;/P&gt;&lt;P&gt;- Uploading to Volumes fails with a generic "Error uploading" message and no further details, even when uploading manually from the UI.&lt;/P&gt;&lt;P&gt;- Workspace/shared...: files are accessible, but our CI/CD pipelines cannot push wheel files there automatically, so we would have to upload them manually.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any idea how we can overcome this, so that we can upload the wheel files via Azure DevOps to the DBX environment and import them in our DLT pipelines?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 03 Aug 2023 07:13:11 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/38987#M646</guid>
      <dc:creator>Fz1</dc:creator>
      <dc:date>2023-08-03T07:13:11Z</dc:date>
    </item>
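The failing `%pip install` above can be made easier to diagnose by checking that the wheel path is actually visible from the pipeline's compute before invoking pip; the existence check surfaces the real problem (the path is not mounted) instead of pip's generic OSError. This is a minimal sketch, not from the thread; the helper name and the suggested alternative paths are assumptions.

```python
import os
import subprocess
import sys


def install_wheel(path: str) -> None:
    """Install a wheel only if this compute can actually see the file.

    On shared/DLT compute, /dbfs may not be mounted, so an explicit
    existence check gives a clearer error than pip's OSError.
    """
    if not os.path.exists(path):
        raise FileNotFoundError(
            f"{path} is not visible from this compute; "
            "consider a /Volumes or /Workspace path instead of /dbfs"
        )
    subprocess.check_call([sys.executable, "-m", "pip", "install", path])
```

Run from the notebook instead of the `%pip` magic, this at least distinguishes "file missing" from "pip failed for another reason".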
    <item>
      <title>Re: DLT Pipeline unable to find custom Libraries/Wheel packages</title>
      <link>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/43953#M964</link>
      <description>&lt;P&gt;Exactly the same issue here - please say if you find a solution.&lt;/P&gt;</description>
      <pubDate>Thu, 07 Sep 2023 11:38:45 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/43953#M964</guid>
      <dc:creator>ColibriMike</dc:creator>
      <dc:date>2023-09-07T11:38:45Z</dc:date>
    </item>
    <item>
      <title>Re: DLT Pipeline unable to find custom Libraries/Wheel packages</title>
      <link>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/43960#M965</link>
      <description>&lt;P&gt;FWIW "Unrestricted Single User" clusters work fine - shared compute of any description appears to run into this issue.&lt;/P&gt;</description>
      <pubDate>Thu, 07 Sep 2023 11:58:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/43960#M965</guid>
      <dc:creator>ColibriMike</dc:creator>
      <dc:date>2023-09-07T11:58:03Z</dc:date>
    </item>
    <item>
      <title>Re: DLT Pipeline unable to find custom Libraries/Wheel packages</title>
      <link>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/43964#M966</link>
      <description>&lt;P&gt;Setting&lt;/P&gt;&lt;LI-CODE lang="javascript"&gt;"context_based_upload_for_execute": true&lt;/LI-CODE&gt;&lt;P&gt;in projects.json allowed the code to run, but it ended with:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;RuntimeError: Cannot start a remote Spark session because there is a regular Spark session already running.&lt;/LI-CODE&gt;</description>
      <pubDate>Thu, 07 Sep 2023 13:15:16 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/43964#M966</guid>
      <dc:creator>ColibriMike</dc:creator>
      <dc:date>2023-09-07T13:15:16Z</dc:date>
    </item>
    <item>
      <title>Re: DLT Pipeline unable to find custom Libraries/Wheel packages</title>
      <link>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/120382#M10102</link>
      <description>&lt;P&gt;You might want to verify the file path and permissions within your CI/CD process; sometimes the context in which the pipeline runs lacks proper DBFS mount visibility. Try referencing the file with an explicit path prefix or adjusting pipeline permissions.&lt;/P&gt;</description>
      <pubDate>Wed, 28 May 2025 08:05:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/dlt-pipeline-unable-to-find-custom-libraries-wheel-packages/m-p/120382#M10102</guid>
      <dc:creator>Laurence_Fishbu</dc:creator>
      <dc:date>2025-05-28T08:05:27Z</dc:date>
    </item>
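The path-visibility check suggested in the last reply can be scripted directly from a pipeline notebook. A minimal sketch, using only the Python standard library, that reports what the driver can actually see, for comparison against the DBFS Explorer view (the candidate mount points are assumptions based on the listing in the original post):

```python
import os


def visible_roots(candidates=("/", "/dbfs", "/Volumes", "/Workspace")):
    """Report which candidate mount points exist on this compute,
    with a small sample of their contents; missing or unreadable
    mounts are reported explicitly instead of raising."""
    report = {}
    for root in candidates:
        if not os.path.isdir(root):
            report[root] = "absent"
            continue
        try:
            report[root] = sorted(os.listdir(root))[:10]
        except PermissionError:
            report[root] = "present but not listable"
    return report


for root, entries in visible_roots().items():
    print(root, "->", entries)
```

An empty list for `/dbfs` (as in the original post) would confirm that the mount exists but is not populated for this compute, which is a different failure than the mount being absent entirely.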
  </channel>
</rss>

