Hi there,
I use Databricks Asset Bundles (DABs) to deploy workflows. For each job, I create a job cluster and install external libraries by specifying a libraries mapping in each task, for example:
- task_key: my-task
  job_cluster_key: my-cluster
  notebook_task:
    notebook_path: ../notebooks/my_notebook.ipynb
  libraries:
    - whl: /Workspace${workspace.file_path}/libraries/PyYAML-6.0.whl
    - jar: /Workspace${workspace.file_path}/libraries/mongo-spark-connector_2.12-10.1.1-all.jar
The Python wheel file (whl) works: I can see that PyYAML-6.0.whl is installed. However, the JAR file mongo-spark-connector_2.12-10.1.1-all.jar fails to install. I know that I can install the JAR from a Unity Catalog volume instead, but I want to install all libraries from the workspace.
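For reference, a Unity Catalog volumes mapping along these lines is the alternative I know about (the catalog, schema, and volume names below are placeholders, not my real setup):

  libraries:
    # Known-working alternative: install the JAR from a Unity Catalog volume.
    # /Volumes/main/default/libraries is a placeholder path; I would prefer
    # to keep everything under workspace files instead.
    - jar: /Volumes/main/default/libraries/mongo-spark-connector_2.12-10.1.1-all.jar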
The documentation says: "To add a JAR file to a job task, in libraries specify a jar mapping for each library to be installed. You can install a JAR from workspace files, Unity Catalog volumes, cloud object storage, or a local file path."
https://docs.databricks.com/en/dev-tools/bundles/library-dependencies.html
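Based on my reading of that page, I would also expect a local file path variant like the following to work, where the bundle deploy uploads the JAR from the bundle source tree (the relative path here is just my assumption about the repo layout, not something I have verified):

- task_key: my-task
  job_cluster_key: my-cluster
  notebook_task:
    notebook_path: ../notebooks/my_notebook.ipynb
  libraries:
    # My reading of the linked document: a jar mapping can also point at a
    # local file path relative to the bundle configuration, which deploy
    # should upload. The path below is an assumed layout.
    - jar: ../libraries/mongo-spark-connector_2.12-10.1.1-all.jar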
However, installing a JAR from workspace files does not work for me.
Even when I created a cluster and tried to install the JAR file manually with the following steps, it did not work:
1. Create a cluster using Databricks Runtime Version 14.3 LTS
2. Go to the Libraries tab and click the Install new button
3. In the popup, select Workspace and navigate to the workspace libraries folder
The folder contains the two libraries PyYAML-6.0.whl and mongo-spark-connector_2.12-10.1.1-all.jar, but I can only select the whl file; the JAR file is not selectable.
Is there any way to install JAR files from workspace files?