Failing to install a library from DBFS-mounted storage (ADLS Gen2) with a credential-passthrough cluster
05-22-2022 01:33 PM
We've set up a premium workspace with a credential-passthrough cluster. While passthrough does work and can access my ADLS Gen2 storage, I can't install a library on the cluster from there, and keep getting:
"Library installation attempted on the driver node of cluster 0522-200212-mib0srv0 and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: com.google.common.util.concurrent.UncheckedExecutionException: com.databricks.backend.daemon.data.client.adl.AzureCredentialNotFoundException: Could not find ADLS Gen2 Token"
How can I actually install a "cluster-wide" library on those credential-passthrough clusters?
(The general ADLS mount is using those credentials to mount the data lake.)
This happens on both standard and high-concurrency clusters.
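For reference, the mount in question is presumably along these lines; a minimal sketch of an ADLS Gen2 passthrough mount, where the container, storage account, and mount point are placeholders, not values from this thread:

```python
# Minimal sketch of an ADLS Gen2 mount using credential passthrough.
# <container>, <storage-account>, and the mount point are placeholders.
configs = {
    "fs.azure.account.auth.type": "CustomAccessToken",
    "fs.azure.account.custom.token.provider.class": spark.conf.get(
        "spark.databricks.passthrough.adls.gen2.tokenProviderClassName"
    ),
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/datalake",
    extra_configs=configs,
)
```

With such a mount, reads from a notebook resolve the calling user's Azure AD token, which is exactly what the library installer on the driver cannot do at cluster start.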
- Labels: Credential passthrough, DBFS, Library
05-24-2022 01:28 PM
Nope @Kaniz Fatma, that's not actually my question. I know how to install a library on a cluster and do it quite a lot. The question is how to install a library stored on the data lake (as a wheel on a DBFS mount) on a credential-passthrough cluster.
05-24-2022 01:29 PM
I currently have a hack: copying the library from the data lake to root DBFS and installing it from there. But I don't like it.
05-25-2022 06:34 AM
Hi @Kaniz Fatma, I am facing the same issue. I am trying to use a job cluster with credential passthrough enabled to deploy a job, but library installation fails with the same exception:
"Message: com.google.common.util.concurrent.UncheckedExecutionException: com.databricks.backend.daemon.data.client.adl.AzureCredentialNotFoundException: Could not find ADLS Gen2 Token"
Where do I add the token? Or am I missing something?
05-25-2022 01:29 PM
@Nancy Gupta, as far as I can trace this issue, it's that the token isn't set up yet while the cluster is starting; I assume it does work with passthrough credentials after the cluster has started normally?
My hack was to copy the library to root DBFS (I've created a new folder there) using another group, and installing from that location does work; see the sketch below.
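A minimal sketch of that copy step; the wheel name and folder paths are hypothetical:

```python
# Hypothetical paths. Copy the wheel from the ADLS-backed mount to root DBFS;
# run this from a notebook once the cluster is up and the passthrough token
# is available, since root DBFS itself needs no ADLS Gen2 token.
dbutils.fs.cp(
    "dbfs:/mnt/datalake/libs/mylib-0.1-py3-none-any.whl",
    "dbfs:/FileStore/libs/mylib-0.1-py3-none-any.whl",
)
```

The cluster library can then be installed from dbfs:/FileStore/libs/mylib-0.1-py3-none-any.whl, which the driver can read without an ADLS token.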
05-26-2022 06:28 AM
@Kaniz Fatma, yes, but that is just a workaround; it would be great if I could get a proper solution for this!
Also, within the job itself, any read from ADLS fails again with the same error.
05-31-2022 04:08 AM
@Kaniz Fatma, any solutions, please?
06-16-2022 03:36 AM
Hello @Alon Nisser, @Nancy Gupta,
Installing libraries using passthrough credentials is currently not supported.
You need the following configs on the cluster:
fs.azure.account...
We can file a feature request for this.
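Presumably this refers to the standard service-principal (OAuth) settings for ABFS. A hypothetical example of such cluster Spark configs; every value below is a placeholder, not something taken from this thread:

```
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net {{secrets/<scope>/<key>}}
fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token
```

Note that these settings authenticate as the service principal rather than as the individual user's passthrough identity.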
06-16-2022 11:35 AM
Sorry, I can't figure this out. The link you've added is irrelevant for passthrough credentials: if we add those configs, the cluster won't be passthrough anymore. Is there a way to add this just for a specific folder, while keeping passthrough for the rest?

