ADF & Databricks Pipeline Fails "Library installation failed for library due to user error"
07-01-2023 10:06 AM
I'm working on a small POC to create a data pipeline that is triggered from ADF, with some parameters passed in from ADF, but my pipeline fails with the attached error:
Operation on target Compute Daily Product Revenue failed: Databricks execution failed with error state: InternalError, error message: Library installation failed for library due to user error. Error messages: Java JARs must be stored in dbfs, s3, adls, gs, or as a workspace file/local file. Make sure the URI begins with 'dbfs:', 'file:', 's3:', 'abfss:', 'gs:', or '/Workspace'
Qasim Hassan
Data Engineer @Royal Cyber
Databricks UG Pakistan Lead
07-04-2023 10:32 PM
The error message you encountered indicates that the library installation in your Databricks cluster failed due to an incorrect URI or location for the Java JARs. The message also provides guidance on the acceptable URI formats.
To resolve this issue, you need to ensure that the URI for the library files meets the requirements specified in the error message. Here are a few steps you can follow:
- Verify the URI format: Double-check that the URI you provided for the Java JARs begins with one of the supported prefixes mentioned in the error message: 'dbfs:', 'file:', 's3:', 'abfss:', 'gs:', or '/Workspace'.
- Check the file location: Make sure that the JAR files are stored in one of the supported locations mentioned in the error message: DBFS (Databricks File System), S3 (Amazon Simple Storage Service), ADLS (Azure Data Lake Storage), GCS (Google Cloud Storage), or as a workspace file/local file.
- Update the URI if necessary: If the current URI is not in the correct format or does not point to one of the supported locations, update it accordingly. For example, if the JAR files are stored in DBFS, ensure that the URI begins with 'dbfs:'.
- Retry library installation: After adjusting the URI, retry installing the library on your Databricks cluster and monitor the installation for any further errors.
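As a quick sanity check before retrying, you can verify the URI prefix yourself. Here's a minimal sketch (not official Databricks tooling) that tests a library path against the prefixes listed in the error message; the JAR path used below is hypothetical:

```python
# Prefixes accepted for Java JAR libraries, per the error message.
ALLOWED_PREFIXES = ("dbfs:", "file:", "s3:", "abfss:", "gs:", "/Workspace")

def is_valid_library_uri(uri: str) -> bool:
    """Return True if the URI starts with a prefix Databricks accepts for JARs."""
    return uri.startswith(ALLOWED_PREFIXES)

# A bare DBFS path is rejected; the same path with the 'dbfs:' scheme passes.
print(is_valid_library_uri("/FileStore/jars/revenue.jar"))       # False
print(is_valid_library_uri("dbfs:/FileStore/jars/revenue.jar"))  # True
```

If the check fails for the path configured in your ADF Databricks activity's library settings, that mismatch is the likely cause of the installation error.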
07-11-2023 09:50 PM
Hi @qasimhassan
Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!

