<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: wheel package to install in a serveless workflow in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105457#M42134</link>
    <description>&lt;P&gt;Oh, I see the above error:&amp;nbsp;&lt;SPAN&gt;Python: 3.10.12 not in '&amp;lt;4.0,&amp;gt;=3.11'. I just tested it, and it is indeed using 3.10; let me check.&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Mon, 13 Jan 2025 17:08:21 GMT</pubDate>
    <dc:creator>Alberto_Umana</dc:creator>
    <dc:date>2025-01-13T17:08:21Z</dc:date>
    <item>
      <title>wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105404#M42110</link>
      <description>&lt;P&gt;Hi guys,&amp;nbsp;&lt;BR /&gt;What is the way, through Databricks Asset Bundles, to declare a new job definition with serverless compute on each task of the workflow, such that each notebook task can pick up the dependent custom libraries I uploaded to the workspace?&lt;BR /&gt;&lt;BR /&gt;I did something like this:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;      environments:
      - environment_key: envir
        spec:
          client: "1"
          dependencies:
            - "${workspace.root_path}/artifacts/.internal/data_pipelines-0.0.1-py3-none-any.whl"

      tasks:

        - task_key: schedule_next_run_for_this_job
          description: due to business requirements is needed to reschedule the workflow in the near next run
          environment_key: envir
          notebook_task:
            notebook_path: ../notebook/jobs/export.py
            base_parameters:
              function: schedule_next_run_for_this_job
              env: ${bundle.target}
              job_id: "{{job.id}}"
              workspace_url: "{{workspace.url}}"&lt;/LI-CODE&gt;&lt;P&gt;but it returns:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Error: cannot create job: A task environment can not be provided for notebook task get_email_infos. Please use the %pip magic command to install notebook-scoped Python libraries and Python wheel packages&lt;/LI-CODE&gt;&lt;P&gt;&lt;BR /&gt;Is the only way to import a personal wheel package on serverless compute to install that library inside the notebook?&lt;BR /&gt;&lt;BR /&gt;Because I want to do something like:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;libraries:
   - whl: ...&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 13:40:26 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105404#M42110</guid>
      <dc:creator>jeremy98</dc:creator>
      <dc:date>2025-01-13T13:40:26Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105412#M42111</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/133094"&gt;@jeremy98&lt;/a&gt;,&lt;/P&gt;
&lt;P class="p1"&gt;It appears that you are trying to use an environment block to specify dependencies for a notebook task, but this approach is not supported for notebook tasks on serverless compute. Instead, you should use the %pip magic command within the notebook to install the required libraries.&lt;/P&gt;
&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL class="ul1"&gt;
&lt;LI class="li1"&gt;Create the job definition with the necessary tasks. Each task should specify the notebook path and any required parameters.&lt;/LI&gt;
&lt;LI class="li1"&gt;Use the %pip magic command inside each notebook to install the custom libraries. This ensures that the libraries are available in the notebook's environment when the task runs.&lt;/LI&gt;
&lt;/UL&gt;
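As a minimal sketch of the second bullet (the wheel path and package name below are hypothetical, not from this thread's bundle), the first notebook cell runs the %pip magic, and a small check then confirms the wheel's top-level package is importable before the task logic uses it:

```python
import importlib.util

# First notebook cell would be the magic command (shown as a comment here):
# %pip install /Workspace/Shared/libs/data_pipelines-0.0.1-py3-none-any.whl

def wheel_is_installed(package_name):
    """Return True if the wheel's top-level package can be found by the interpreter."""
    return importlib.util.find_spec(package_name) is not None

# In the notebook you would guard the task logic, e.g.:
# assert wheel_is_installed("data_pipelines"), "run the %pip cell first"
print(wheel_is_installed("json"))  # stdlib module, importable everywhere
```

This fails fast with a clear message instead of an ImportError deep inside the task.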
&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="p1"&gt;Here’s an example:&lt;/P&gt;
&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="p1"&gt;bundle:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;name: my-bundle&lt;/P&gt;
&lt;P class="p1"&gt;resources:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;jobs:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;my-job:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;name: my-job&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;tasks:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;- task_key: schedule_next_run_for_this_job&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;description: due to business requirements is needed to reschedule the workflow in the near next run&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;notebook_task:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;notebook_path: /Workspace/Users/your_username/notebook/jobs/export.py&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;base_parameters:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;function: schedule_next_run_for_this_job&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;env: ${bundle.target}&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;job_id: "{{job.id}}"&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;workspace_url: "{{workspace.url}}"&lt;/P&gt;
&lt;P class="p1"&gt;targets:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;dev:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;default: true&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;resources:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;jobs:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;my-job:&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;name: my-job&lt;/P&gt;
&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="p1"&gt;The example content of export.py:&lt;/P&gt;
&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="p1"&gt;# Install custom libraries using %pip magic command&lt;/P&gt;
&lt;P class="p1"&gt;%pip install /Workspace/Shared/Path/To/your_custom_library.whl&lt;/P&gt;
&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="p1"&gt;# Your notebook code here&lt;/P&gt;
&lt;P class="p1"&gt;def schedule_next_run_for_this_job():&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;# Function implementation&lt;/P&gt;
&lt;P class="p1"&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;pass&lt;/P&gt;
&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="p1"&gt;# Call the function with parameters&lt;/P&gt;
&lt;P class="p1"&gt;schedule_next_run_for_this_job()&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 13:44:47 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105412#M42111</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-01-13T13:44:47Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105417#M42112</link>
      <description>&lt;P&gt;Hi,&lt;BR /&gt;Thanks for this answer! But should any imports from the wheel package be written like this, for example?&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;from data_pipelines.core.utils.filters import (
    filter_by_time_granularity
)&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 13:51:47 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105417#M42112</guid>
      <dc:creator>jeremy98</dc:creator>
      <dc:date>2025-01-13T13:51:47Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105420#M42113</link>
      <description>&lt;P class="p1"&gt;Yes, you can import code from a wheel package in your notebook just like you would with any other Python module. Once you have installed the wheel package using %pip, you can import the functions or classes from the package.&lt;/P&gt;
&lt;P class="p1"&gt;For example, if your wheel package contains a module data_pipelines.core.utils.filters and you want to import the filter_by_time_granularity function, you can do it as follows&lt;/P&gt;
&lt;P class="p1"&gt;%pip install /Workspace/Shared/Path/To/your_custom_library.whl&lt;/P&gt;
&lt;P class="p1"&gt;from data_pipelines.core.utils.filters import filter_by_time_granularity&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 13:58:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105420#M42113</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-01-13T13:58:40Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105421#M42114</link>
      <description>&lt;P&gt;Hi, OK, but how do I upload the wheel package on every DAB deployment? Because I did it this way:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;artifacts:
  lib:
    type: whl
    build: poetry build
    path: .

sync:
  include:
    - ./dist/*.whl&lt;/LI-CODE&gt;&lt;P&gt;But this will deploy the wheel package to my personal root_path:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;  stg:
    default: true
    workspace: 
      host: &amp;lt;host-id&amp;gt;
      root_path: /Workspace/Users/${workspace.current_user.userName}/.bundle/${bundle.name}/${bundle.target}&lt;/LI-CODE&gt;&lt;P&gt;How do I specify that the wheel package should be uploaded to a shared location every time?&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 14:02:11 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105421#M42114</guid>
      <dc:creator>jeremy98</dc:creator>
      <dc:date>2025-01-13T14:02:11Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105430#M42119</link>
      <description>&lt;P&gt;And another question: when installing the Python wheel on serverless compute, is it also possible to specify the Python version of the serverless compute? Because I tried, but it says:&amp;nbsp;&lt;SPAN&gt;ERROR: Package 'data-pipelines' requires a different Python: 3.10.12 not in '&amp;lt;4.0,&amp;gt;=3.11'&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 14:27:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105430#M42119</guid>
      <dc:creator>jeremy98</dc:creator>
      <dc:date>2025-01-13T14:27:27Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105432#M42120</link>
      <description>&lt;P&gt;You can define the artifact_path in the workspace mapping:&lt;/P&gt;
&lt;P&gt;This path should be a shared location accessible by all users who need to use the wheel package.&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;bundle:
  name: my-bundle

artifacts:
  lib:
    type: whl
    build: poetry build
    path: .

sync:
  include:
    - ./dist/*.whl

workspace:
  artifact_path: /Workspace/Shared/Path/To/Shared/Location/.bundle/${bundle.name}/${bundle.target}

targets:
  stg:
    default: true
    workspace:
      host: &amp;lt;host-id&amp;gt;
      root_path: /Workspace/Shared/Path/To/Shared/Location/.bundle/${bundle.name}/${bundle.target}&lt;/LI-CODE&gt;
&lt;P&gt;&lt;STRONG&gt;artifact_path&lt;/STRONG&gt;: This specifies the path where the artifacts (wheel packages) will be stored in the workspace. By setting it to a shared location, you ensure that the wheel package is accessible to all users.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 14:37:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105432#M42120</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-01-13T14:37:02Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105433#M42121</link>
      <description>&lt;P&gt;About your second question, it is not possible to specify the Python version directly when installing a Python wheel. The serverless runtime comes with a built-in Python version, and upgrading or downgrading it could break the system due to dependencies.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 14:38:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105433#M42121</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-01-13T14:38:40Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105436#M42124</link>
      <description>&lt;P&gt;Hi, thanks for your answers, really helpful. But does this mean I should find a way to downgrade the Python version specified in my pyproject.toml (and match all of my dependencies to it) in order to run the package on any serverless cluster?&lt;/P&gt;&lt;P&gt;Because I don't know which Python version I will get each time, right?&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 14:42:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105436#M42124</guid>
      <dc:creator>jeremy98</dc:creator>
      <dc:date>2025-01-13T14:42:30Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105438#M42126</link>
      <description>&lt;P&gt;Hi, no problem! Serverless will use the latest DBR version mentioned here:&amp;nbsp;&lt;A href="https://docs.databricks.com/en/release-notes/serverless/index.html#version-154" target="_blank"&gt;https://docs.databricks.com/en/release-notes/serverless/index.html#version-154&lt;/A&gt;, and the Python version follows from that. In this case it is DBR 15.4 LTS, which uses Python 3.11.0. So we need to refactor any dependencies to be compatible with that Python version, and keep checking whether a release update comes with a different DBR/Python version:&lt;/P&gt;
&lt;UL class="simple"&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Python&lt;/STRONG&gt;: 3.11.0&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Mon, 13 Jan 2025 15:08:18 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105438#M42126</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-01-13T15:08:18Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105448#M42128</link>
      <description>&lt;P&gt;Hi Alberto, thanks again for the answer, but I don't understand your point. You said the current cluster also works with Python 3.11, but it seems that when I get a new serverless cluster it doesn't have Python 3.11 but a lower version. What do I need to do?&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 16:42:31 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105448#M42128</guid>
      <dc:creator>jeremy98</dc:creator>
      <dc:date>2025-01-13T16:42:31Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105456#M42133</link>
      <description>&lt;P&gt;Hey Jeremy, serverless should be using 3.11 too; do you see a different version? Serverless should pick DBR version 15.4, which uses 3.11, based on&amp;nbsp;&lt;A href="https://docs.databricks.com/en/release-notes/serverless/index.html#version-154" target="_blank" rel="nofollow noopener noreferrer"&gt;https://docs.databricks.com/en/release-notes/serverless/index.html#version-154&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 17:06:44 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105456#M42133</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-01-13T17:06:44Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105457#M42134</link>
      <description>&lt;P&gt;Oh, I see the above error:&amp;nbsp;&lt;SPAN&gt;Python: 3.10.12 not in '&amp;lt;4.0,&amp;gt;=3.11'. I just tested it, and it is indeed using 3.10; let me check.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 17:08:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105457#M42134</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-01-13T17:08:21Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105460#M42136</link>
      <description>&lt;P&gt;I see the reason now: there are 2 versions of serverless. Environment version 1 uses Python 3.10.12 and version 2 uses 3.11. Please see:&amp;nbsp;&lt;A href="https://docs.databricks.com/en/release-notes/serverless/client-two.html" target="_blank"&gt;https://docs.databricks.com/en/release-notes/serverless/client-two.html&lt;/A&gt;&lt;/P&gt;
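To confirm which case a notebook landed in, a small guard can compare the running interpreter against the wheel's requires-python constraint before attempting the install. This is a minimal sketch (the helper name is mine; the version pairs mirror the two serverless environment versions described above):

```python
import sys

# Serverless environment version 1 ships Python 3.10.x and version 2 ships
# 3.11, so a check like this surfaces the mismatch before `pip install`
# fails with the "requires a different Python" error.
def python_matches(required=(3, 11), actual=None):
    """Return True if the interpreter satisfies the wheel's minimum Python version."""
    actual = tuple(actual) if actual is not None else sys.version_info[:2]
    return actual[:2] >= tuple(required)

print(python_matches(actual=(3, 10, 12)))  # the failing serverless v1 case -> False
```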
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Alberto_Umana_0-1736788376698.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/14074i29CE704CBA2CC713/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Alberto_Umana_0-1736788376698.png" alt="Alberto_Umana_0-1736788376698.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 17:13:26 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105460#M42136</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-01-13T17:13:26Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105462#M42137</link>
      <description>&lt;P&gt;Hi,&lt;BR /&gt;Thanks again for the answer :). OK, but do I need to add the environment field as I did before? Consider that I'm using DABs.&lt;BR /&gt;&lt;BR /&gt;Like this?&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;      environments: 
        - environment_key: env_for_data_pipelines_whl
          spec: 
            client: "2"&lt;/LI-CODE&gt;&lt;P&gt;edit: I did it before defining the tasks, in this way each task will inherit the environment client specific, but it isn't set.. still have the same problem&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 18:21:58 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105462#M42137</guid>
      <dc:creator>jeremy98</dc:creator>
      <dc:date>2025-01-13T18:21:58Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105468#M42141</link>
      <description>&lt;P&gt;Hello &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/106294"&gt;@Alberto_Umana&lt;/a&gt;, consider that outside the workflow I can install the library; when I run the workflow through DABs I still get the error:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;CalledProcessError: Command 'pip --disable-pip-version-check install '/Workspace/Shared/test-sync-lib/.internal/data_pipelines-0.0.1-py3-none-any.whl'' returned non-zero exit status 1.&lt;/LI-CODE&gt;&lt;P&gt;Looking at the error in detail, I still get:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;ERROR: Package 'data-pipelines' requires a different Python: 3.10.12 not in '&amp;lt;4.0,&amp;gt;=3.11'&lt;/LI-CODE&gt;&lt;P&gt;It sounds strange, since I set that environment field, which in theory should be inherited by each task automatically.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 19:03:09 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105468#M42141</guid>
      <dc:creator>jeremy98</dc:creator>
      <dc:date>2025-01-13T19:03:09Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105482#M42151</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/133094"&gt;@jeremy98&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;I think it has to do with the serverless version being used outside the workflow versus in DABs, since the Python version changes. Please see:&amp;nbsp;&lt;A href="https://docs.databricks.com/en/release-notes/serverless/index.html" target="_blank"&gt;https://docs.databricks.com/en/release-notes/serverless/index.html&lt;/A&gt;. The two versions have different Python versions, which might cause dependency issues. I am not sure how to specify the serverless version in DABs; I will check internally.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2025 19:44:10 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105482#M42151</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-01-13T19:44:10Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105533#M42172</link>
      <description>&lt;P&gt;Good morning,&lt;BR /&gt;Thanks for the answer. Yes, please let me know, because I found how to declare it at a higher level, but it seems it still doesn't pick up the environment inside each task. If I look at the task structure there, the environment is set, but it doesn't work.&lt;/P&gt;</description>
      <pubDate>Tue, 14 Jan 2025 08:01:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105533#M42172</guid>
      <dc:creator>jeremy98</dc:creator>
      <dc:date>2025-01-14T08:01:29Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105555#M42182</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/106294"&gt;@Alberto_Umana&lt;/a&gt;, one of my colleagues did it using a spark_python_task... maybe this is something only for certain types of files?&lt;/P&gt;</description>
      <pubDate>Tue, 14 Jan 2025 10:48:48 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105555#M42182</guid>
      <dc:creator>jeremy98</dc:creator>
      <dc:date>2025-01-14T10:48:48Z</dc:date>
    </item>
    <item>
      <title>Re: wheel package to install in a serveless workflow</title>
      <link>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105571#M42187</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/133094"&gt;@jeremy98&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;When you mentioned using a spark_python_task, did it work on serverless too?&lt;/P&gt;</description>
      <pubDate>Tue, 14 Jan 2025 12:44:31 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/wheel-package-to-install-in-a-serveless-workflow/m-p/105571#M42187</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-01-14T12:44:31Z</dc:date>
    </item>
  </channel>
</rss>

