<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Installing Custom Packages on Serverless Compute via Databricks Connect in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142914#M52047</link>
    <description>&lt;DIV&gt;&lt;SPAN&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/25346"&gt;@Hubert-Dudek&lt;/a&gt;,&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;Thank you for your comment. I tried this as well, but had no luck.&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;SPAN&gt;I uploaded the wheel package to the volume (read access for all users), let's say at this path:&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;/Volumes/main/my@my.com/test/dice-4.0.0-py3-none-any.whl&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;SPAN&gt;```python&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;volumes_deps = ["dbfs:/Volumes/main/my@my.com/test/dice-4.0.0-py3-none-any.whl"]&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;env = DatabricksEnv().withDependencies(volumes_deps)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;spark = DatabricksSession.builder.serverless().withEnvironment(env).getOrCreate()&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;```&lt;BR /&gt;Also, I noticed something odd in the DatabricksEnv source code: it appears to verify only libraries installed in the local virtual environment. I may be wrong in my interpretation of the source.&lt;BR /&gt;&lt;BR /&gt;By the way, I have been following your blog for some time now and thoroughly enjoy reading the articles.&lt;/SPAN&gt;&lt;/DIV&gt;</description>
    <pubDate>Sun, 04 Jan 2026 00:48:24 GMT</pubDate>
    <dc:creator>ganesh_raskar</dc:creator>
    <dc:date>2026-01-04T00:48:24Z</dc:date>
    <item>
      <title>Installing Custom Packages on Serverless Compute via Databricks Connect</title>
      <link>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142880#M52042</link>
      <description>&lt;P&gt;I have a custom Python package that provides a PySpark DataSource implementation. I'm using Databricks Connect (16.4.10) and need to understand the package installation options for serverless compute.&lt;/P&gt;&lt;P&gt;Works: Traditional Compute Cluster&lt;/P&gt;&lt;P&gt;# Custom package pre-installed on the cluster&lt;BR /&gt;spark = DatabricksSession.builder.clusterId("my-cluster-id").getOrCreate()&lt;BR /&gt;spark.dataSource.register(MyCustomDataSource)&lt;BR /&gt;df = spark.read.format("my_format").load()&lt;BR /&gt;# Works perfectly&lt;/P&gt;&lt;P&gt;Doesn't Work: Serverless Compute&lt;/P&gt;&lt;P&gt;# Custom package not available&lt;BR /&gt;spark = DatabricksSession.builder.serverless().getOrCreate()&lt;BR /&gt;spark.dataSource.register(MyCustomDataSource)&lt;BR /&gt;df = spark.read.format("my_format").load()&lt;BR /&gt;# Error&lt;/P&gt;&lt;P&gt;What I've Tried&lt;/P&gt;&lt;P&gt;I attempted to use DatabricksEnv().withDependencies():&lt;/P&gt;&lt;P&gt;env = DatabricksEnv().withDependencies(["my-custom-package==0.4.0"])&lt;BR /&gt;spark = DatabricksSession.builder.serverless().withEnvironment(env).getOrCreate()&lt;/P&gt;&lt;P&gt;However, based on the documentation, withDependencies() appears to work only for Python UDFs, not for packages that need to be available at the driver or session level for custom DataSource registration.&lt;/P&gt;&lt;P&gt;Questions&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;&lt;P&gt;Is there a way to install custom packages on serverless compute when using Databricks Connect?&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Is support for custom package installation on serverless compute (similar to cluster libraries) on the roadmap?&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Are there any workarounds to make custom DataSources work with serverless compute?&lt;/P&gt;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Environment&lt;/P&gt;&lt;P&gt;Databricks Connect: 16.4.10&lt;BR /&gt;Python: 3.12&lt;BR /&gt;Custom package: installed locally via pip; provides a PySpark DataSource V2 API implementation&lt;/P&gt;&lt;P&gt;Additional Context&lt;BR /&gt;The custom package works perfectly with a serverless environment in a notebook.&lt;BR /&gt;&lt;BR /&gt;Links&lt;BR /&gt;&lt;A href="https://docs.databricks.com/aws/en/dev-tools/databricks-connect/cluster-config#remote-meth" target="_blank"&gt;https://docs.databricks.com/aws/en/dev-tools/databricks-connect/cluster-config#remote-meth&lt;/A&gt;&lt;BR /&gt;&lt;A href="https://docs.databricks.com/aws/en/dev-tools/databricks-connect/python/udf#base-env" target="_blank"&gt;https://docs.databricks.com/aws/en/dev-tools/databricks-connect/python/udf#base-env&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 03 Jan 2026 06:24:17 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142880#M52042</guid>
      <dc:creator>ganesh_raskar</dc:creator>
      <dc:date>2026-01-03T06:24:17Z</dc:date>
    </item>
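For readers following along, the attempt described in the post above can be sketched as one small helper plus the session call. This is a hedged sketch, not a confirmed fix: `DatabricksSession`, `DatabricksEnv`, `withDependencies`, and `withEnvironment` are the names the poster uses from databricks-connect 16.4, `my-custom-package` is the poster's placeholder, and the remote import is guarded so the sketch degrades gracefully where databricks-connect is not installed.

```python
# Sketch of the serverless + environment pattern from the post above.
# Assumes databricks-connect >= 16.4 and configured workspace auth;
# "my-custom-package" is the poster's placeholder name.

def build_dependency_list(*deps: str) -> list:
    """Collect dependency strings (wheel paths or pinned PyPI specifiers)
    in the list form DatabricksEnv().withDependencies() accepts."""
    for dep in deps:
        is_wheel = dep.endswith(".whl")
        is_pinned = any(op in dep for op in ("==", ">=", "<="))
        if not (is_wheel or is_pinned):
            raise ValueError(f"not a wheel path or pinned requirement: {dep!r}")
    return list(deps)

def make_serverless_session(deps):
    """Create a serverless Databricks Connect session with the given deps.
    Only callable where databricks-connect is installed and configured."""
    from databricks.connect import DatabricksSession, DatabricksEnv
    env = DatabricksEnv().withDependencies(deps)
    return DatabricksSession.builder.serverless().withEnvironment(env).getOrCreate()

# Local, runnable part of the sketch: assemble the dependency list.
deps = build_dependency_list("my-custom-package==0.4.0")
```

Note that, as the poster points out, the documentation describes `withDependencies()` in the context of Python UDF environments, so whether this makes the package importable at session level for `spark.dataSource.register()` is exactly the open question of the thread.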
    <item>
      <title>Re: Installing Custom Packages on Serverless Compute via Databricks Connect</title>
      <link>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142903#M52046</link>
      <description>&lt;P&gt;Just put the wheel on a volume and add it to the environment?&lt;/P&gt;</description>
      <pubDate>Sat, 03 Jan 2026 20:29:41 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142903#M52046</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2026-01-03T20:29:41Z</dc:date>
    </item>
    <item>
      <title>Re: Installing Custom Packages on Serverless Compute via Databricks Connect</title>
      <link>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142914#M52047</link>
      <description>&lt;DIV&gt;&lt;SPAN&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/25346"&gt;@Hubert-Dudek&lt;/a&gt;,&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;Thank you for your comment. I tried this as well, but had no luck.&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;SPAN&gt;I uploaded the wheel package to the volume (read access for all users), let's say at this path:&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;/Volumes/main/my@my.com/test/dice-4.0.0-py3-none-any.whl&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;SPAN&gt;```python&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;volumes_deps = ["dbfs:/Volumes/main/my@my.com/test/dice-4.0.0-py3-none-any.whl"]&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;env = DatabricksEnv().withDependencies(volumes_deps)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;spark = DatabricksSession.builder.serverless().withEnvironment(env).getOrCreate()&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;```&lt;BR /&gt;Also, I noticed something odd in the DatabricksEnv source code: it appears to verify only libraries installed in the local virtual environment. I may be wrong in my interpretation of the source.&lt;BR /&gt;&lt;BR /&gt;By the way, I have been following your blog for some time now and thoroughly enjoy reading the articles.&lt;/SPAN&gt;&lt;/DIV&gt;</description>
      <pubDate>Sun, 04 Jan 2026 00:48:24 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142914#M52047</guid>
      <dc:creator>ganesh_raskar</dc:creator>
      <dc:date>2026-01-04T00:48:24Z</dc:date>
    </item>
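One detail worth isolating from the attempt in the reply above: the dependency string mixes a `dbfs:` scheme with a Unity Catalog `/Volumes/...` path, while volume files are usually addressed as bare `/Volumes/...` paths. Whether that prefix is the culprit is a hypothesis, not a confirmed fix, but a tiny normalizer makes the two forms explicit for a retry:

```python
# Hypothesis from the thread above: Unity Catalog volume files are usually
# addressed as bare "/Volumes/..." paths, so the "dbfs:" scheme in the
# attempted dependency string may be worth dropping before retrying.

def normalize_volume_path(dep: str) -> str:
    """Strip a leading 'dbfs:' scheme from a /Volumes/... dependency path;
    leave all other dependency strings untouched."""
    if dep.startswith("dbfs:/Volumes/"):
        return dep[len("dbfs:"):]
    return dep

print(normalize_volume_path(
    "dbfs:/Volumes/main/my@my.com/test/dice-4.0.0-py3-none-any.whl"
))
# → /Volumes/main/my@my.com/test/dice-4.0.0-py3-none-any.whl
```

A PyPI specifier such as `my-custom-package==0.4.0` passes through unchanged, so the normalizer can be applied to a whole dependency list safely.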
    <item>
      <title>Re: Installing Custom Packages on Serverless Compute via Databricks Connect</title>
      <link>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142915#M52048</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/87032"&gt;@ganesh_raskar&lt;/a&gt; - Can you try running pip install in a notebook shell first and then using the library? Also, which package do you want to install? Please provide that detail and I will give it a try.&lt;/P&gt;&lt;P&gt;Regards - San&lt;/P&gt;</description>
      <pubDate>Sun, 04 Jan 2026 06:00:36 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142915#M52048</guid>
      <dc:creator>Sanjeeb2024</dc:creator>
      <dc:date>2026-01-04T06:00:36Z</dc:date>
    </item>
    <item>
      <title>Re: Installing Custom Packages on Serverless Compute via Databricks Connect</title>
      <link>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142923#M52053</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/129689"&gt;@Sanjeeb2024&lt;/a&gt; It works perfectly fine in a notebook, either by installing it with !pip install or by pre-installing it on the serverless environment the notebook is attached to.&lt;BR /&gt;&lt;BR /&gt;It's just that with Spark Connect on serverless compute, I don't see an option to install it.&lt;BR /&gt;&lt;BR /&gt;I also tried configuring the default workspace serverless environment, but that applies only to notebooks and jobs; it does not apply to Spark Connect sessions.&lt;/P&gt;</description>
      <pubDate>Sun, 04 Jan 2026 15:03:13 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142923#M52053</guid>
      <dc:creator>ganesh_raskar</dc:creator>
      <dc:date>2026-01-04T15:03:13Z</dc:date>
    </item>
    <item>
      <title>Re: Installing Custom Packages on Serverless Compute via Databricks Connect</title>
      <link>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142927#M52054</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/87032"&gt;@ganesh_raskar&lt;/a&gt; - If you can share which custom package you're using, along with the exact code and error, I can try to replicate it at my end and explore a suitable option.&lt;/P&gt;</description>
      <pubDate>Sun, 04 Jan 2026 16:58:31 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/installing-custom-packages-on-serverless-compute-via-databricks/m-p/142927#M52054</guid>
      <dc:creator>Sanjeeb2024</dc:creator>
      <dc:date>2026-01-04T16:58:31Z</dc:date>
    </item>
  </channel>
</rss>

