
How do you define PyPI libraries at the job level in Asset Bundles?

jacovangelder
Contributor III

Hello,

Reading the documentation, I can't find anything that says it is possible to define libraries at the job level instead of at the task level. It feels really counter-intuitive to put libraries at the task level in Databricks workflows provisioned by Asset Bundles. Is there some other way to define libraries at the job level?

I tried the following:

tasks:
  - task_key: task1
    job_cluster_key: job_cluster
    notebook_task:
      notebook_path: ../foo.ipynb
  - task_key: task2
    depends_on:
      - task_key: task1
    job_cluster_key: job_cluster
    notebook_task:
      notebook_path: ../foo.ipynb
libraries:
  - pypi:
      package: pyyaml==6.0.1
  - pypi:
      package: requests==2.31.0
  - pypi:
      package: typing_extensions==4.4.0

Validating the DAB does not fail, but the libraries never actually get installed either. It only works when I put the libraries object at the task level, which feels weird to me. Does that mean we can have different libraries installed for each task? The documentation doesn't really shed any light on this. I could define all the libraries on the first task, and since both tasks share the same job cluster I guess the second task would inherit them, but that feels fragile.
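For reference, here is a minimal sketch of the task-level placement that does work for me. Since bundle configuration is plain YAML, I'm using a standard YAML anchor to avoid repeating the list on every task; I haven't seen anchors mentioned in the DAB docs, so treat that part as an assumption:

tasks:
  - task_key: task1
    job_cluster_key: job_cluster
    notebook_task:
      notebook_path: ../foo.ipynb
    # Anchor the list once (&common_libraries) and alias it on later tasks.
    libraries: &common_libraries
      - pypi:
          package: pyyaml==6.0.1
      - pypi:
          package: requests==2.31.0
      - pypi:
          package: typing_extensions==4.4.0
  - task_key: task2
    depends_on:
      - task_key: task1
    job_cluster_key: job_cluster
    notebook_task:
      notebook_path: ../foo.ipynb
    # Reuse the same list via the alias.
    libraries: *common_libraries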

I know that from DBR 15.x onwards we can reference a requirements.txt file in the workspace as a library, but I am on DBR 14.3 LTS.
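For completeness, my understanding is that on DBR 15.x that would look roughly like this (the workspace path is a made-up example):

tasks:
  - task_key: task1
    job_cluster_key: job_cluster
    notebook_task:
      notebook_path: ../foo.ipynb
    libraries:
      # requirements.txt as a library type needs DBR 15.0+;
      # the path below is hypothetical
      - requirements: /Workspace/Shared/my_project/requirements.txt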

I hope someone can shed some light on this.
