Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to install JAR libraries from ADLS? I'm getting an error

oussamak
New Contributor II

I mounted the ADLS to my Azure Databricks resource, and I keep getting this error when I try to install a JAR from a container:

Library installation attempted on the driver node of cluster 0331-121709-buk0nvsq and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: java.util.concurrent.ExecutionException: Failure to initialize configurationInvalid configuration value detected for fs.azure.account.key

This is the path I provide to Databricks: abfss://libraries@nespdatalakeadls2.dfs.core.windows.net/libraries/tensorflow-1.15.0.jar
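For context, this error indicates the cluster itself has no credentials for the storage account, so the driver cannot read the abfss:// path during library installation. Below is a minimal sketch of the cluster-level Spark config for service-principal (OAuth) access; the secret scope name, secret key, application ID, and tenant ID placeholders are all hypothetical:

fs.azure.account.auth.type.nespdatalakeadls2.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.nespdatalakeadls2.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.nespdatalakeadls2.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.nespdatalakeadls2.dfs.core.windows.net {{secrets/adls-scope/sp-secret}}
fs.azure.account.oauth2.client.endpoint.nespdatalakeadls2.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token

Session-level spark.conf.set calls in a notebook do not apply to library installation; these keys need to be set in the cluster's Spark config.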

1 REPLY

Hi @Kaniz Fatma,

Thanks for your reply. I followed the steps in the link to mount the ADLS from the notebook, but I'm getting an error:

Operation failed: "This request is not authorized to perform this operation.", 403, HEAD, https://myadls2.dfs.core.windows.net/libraries/?upn=false&action=getAccessControl&timeout=90
ExecutionError                            Traceback (most recent call last)
<command-274923613237744> in <module>
      5           "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/12a3af23-a769-4654-847f-958f3d479f4a/oauth2/token"}
      6 
----> 7 dbutils.fs.mount(
      8   source = "abfss://libraries@myadls2.dfs.core.windows.net/",
      9   mount_point = "/mnt/libraries",
 
/databricks/python_shell/dbruntime/dbutils.py in f_with_exception_handling(*args, **kwargs)
    387                     exc.__context__ = None
    388                     exc.__cause__ = None
--> 389                     raise exc
    390 
    391             return f_with_exception_handling
 
ExecutionError: An error occurred while calling o360.mount.
: Operation failed: "This request is not authorized to perform this operation.", 403, HEAD, https://myadls2.dfs.core.windows.net/libraries/?upn=false&action=getAccessControl&timeout=90
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:246)
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsClient.getAclStatus(AbfsClient.java:955)
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsClient.getAclStatus(AbfsClient.java:937)
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getIsNamespaceEnabled(AzureBlobFileSystemStore.java:318)
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getFileStatus(AzureBlobFileSystemStore.java:883)
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:682)
	at com.databricks.backend.daemon.dbutils.DBUtilsCore.verifyAzureFileSystem(DBUtilsCore.scala:824)
	at com.databricks.backend.daemon.dbutils.DBUtilsCore.createOrUpdateMount(DBUtilsCore.scala:734)
	at com.databricks.backend.daemon.dbutils.DBUtilsCore.mount(DBUtilsCore.scala:776)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
	at py4j.Gateway.invoke(Gateway.java:295)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:251)
	at java.lang.Thread.run(Thread.java:748)

FYI: the service principal has the following roles on the ADLS: Storage Blob Data Reader, Storage Blob Data Contributor, Storage Blob Data Owner, and Owner.
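For reference, a sketch of the complete mount call that the traceback truncates, assuming the same service principal and tenant endpoint shown above; the secret scope and key names are hypothetical:

# Service-principal OAuth configs for mounting ADLS Gen2 via ABFS
configs = {
  "fs.azure.account.auth.type": "OAuth",
  "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
  "fs.azure.account.oauth2.client.id": "<application-id>",
  # hypothetical secret scope/key; keeps the client secret out of notebook source
  "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="adls-scope", key="sp-secret"),
  "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/12a3af23-a769-4654-847f-958f3d479f4a/oauth2/token"
}

# Mount the container at /mnt/libraries
dbutils.fs.mount(
  source = "abfss://libraries@myadls2.dfs.core.windows.net/",
  mount_point = "/mnt/libraries",
  extra_configs = configs
)

Reading the client secret through dbutils.secrets.get rather than pasting it inline is the usual pattern, since notebook source and command output are visible to other workspace users.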
