05-22-2022 01:33 PM
We've set up a premium workspace with a credential passthrough cluster. While the passthrough credentials do work and can access my ADLS Gen2 storage,
I can't get a library to install on the cluster from there, and I keep getting:
"Library installation attempted on the driver node of cluster 0522-200212-mib0srv0 and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: com.google.common.util.concurrent.UncheckedExecutionException: com.databricks.backend.daemon.data.client.adl.AzureCredentialNotFoundException: Could not find ADLS Gen2 Token
"
How can I actually install a cluster-wide library on these credential passthrough clusters?
(The general ADLS mount uses those passthrough credentials to mount the data lake, roughly as sketched below.)
This happens on both standard and high concurrency clusters.
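For context, the mount is set up roughly like the sketch below. This is a minimal sketch only; the container, storage account, and mount point names are placeholders for our actual setup.

# Minimal sketch of an ADLS Gen2 mount that relies on credential passthrough
# (run from a Databricks notebook, where spark and dbutils are predefined).
# Container, storage account, and mount point are placeholders.
configs = {
    "fs.azure.account.auth.type": "CustomAccessToken",
    "fs.azure.account.custom.token.provider.class":
        spark.conf.get("spark.databricks.passthrough.adls.gen2.tokenProviderClassName"),
}

dbutils.fs.mount(
    source="abfss://container@storageaccount.dfs.core.windows.net/",
    mount_point="/mnt/datalake",
    extra_configs=configs,
)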
05-24-2022 01:28 PM
No, @Kaniz Fatma, that's not actually my question. I know how to install a library on a cluster and do it quite a lot. The question is how to install a library stored on the data lake (as a wheel, via the DBFS mount) on a credential passthrough cluster.
05-24-2022 01:29 PM
I currently have a hack: copying the library from the data lake to the root DBFS and installing it from there. But I don't like it.
05-25-2022 06:34 AM
Hi @Kaniz Fatma, I am facing the same issue. I am trying to use a job cluster with credential passthrough enabled to deploy a job, but library installation fails with the same exception:
"Message: com.google.common.util.concurrent.UncheckedExecutionException: com.databricks.backend.daemon.data.client.adl.AzureCredentialNotFoundException: Could not find ADLS Gen2 Token"
Where should the token be added? Or am I missing something?
05-25-2022 01:29 PM
@Nancy Gupta, as far as I can trace this issue, the token isn't set up yet while the cluster is starting; I assume it does work with passthrough credentials once the cluster has started normally?
My hack was to copy the library to the root DBFS (I've created a new folder there) using another group, and installing from that location does work, roughly as sketched below.
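Concretely, the workaround looks roughly like this (a minimal sketch; the mount path, folder, and wheel names are placeholders for my setup):

# Run in a notebook on a cluster that can read the passthrough mount.
# Copy the wheel from the ADLS mount to a folder on root DBFS (paths are placeholders).
dbutils.fs.cp(
    "dbfs:/mnt/datalake/libs/my_library-1.0.0-py3-none-any.whl",
    "dbfs:/FileStore/cluster-libs/my_library-1.0.0-py3-none-any.whl",
)

The cluster library is then installed from the DBFS copy (e.g. dbfs:/FileStore/cluster-libs/my_library-1.0.0-py3-none-any.whl) via the Libraries tab or the Libraries API, so the installation itself no longer needs an ADLS token.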
05-26-2022 06:28 AM
@Kaniz Fatma, yes, but that is just a workaround; it would be great if I could get a proper solution for this!
Also, any read from ADLS inside the job fails again with the same error.
05-31-2022 04:08 AM
@Kaniz Fatma, any solutions please?
06-16-2022 03:36 AM
Hello @Alon Nisser @Nancy Gupta
Installing libraries using passthrough credentials is currently not supported.
You need the configs below on the cluster:
fs.azure.account...
We can file a feature request for this.
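(For reference, the config list above is truncated. Assuming it refers to the standard service-principal OAuth settings for direct ADLS Gen2 access, which is only a guess here, a sketch would look roughly like the following, with the storage account, application ID, secret scope, and tenant ID all as placeholders. Note that these settings grant access via the service principal rather than via passthrough for that storage account.)

# Assumed completion of the truncated "fs.azure.account..." configs above (a guess, not confirmed).
# Standard OAuth / service-principal settings for direct ADLS Gen2 access; all values are placeholders.
# For cluster-wide library installs these would go in the cluster's Spark config rather than a notebook.
storage = "storageaccount.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{storage}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage}", "<application-client-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage}",
               dbutils.secrets.get(scope="<secret-scope>", key="<secret-key>"))
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage}",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")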
06-16-2022 11:35 AM
Sorry, I can't figure this out. The link you've added is irrelevant for passthrough credentials; if we add those configs, the cluster won't be a passthrough cluster anymore. Is there a way to add this just for a specific folder, while keeping passthrough for the rest?