05-22-2022 01:33 PM
We've set up a premium workspace with a credential-passthrough cluster. While the passthrough credentials do work and can access my ADLS Gen2 storage, I can't install a library on the cluster from there, and I keep getting:
"Library installation attempted on the driver node of cluster 0522-200212-mib0srv0 and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: com.google.common.util.concurrent.UncheckedExecutionException: com.databricks.backend.daemon.data.client.adl.AzureCredentialNotFoundException: Could not find ADLS Gen2 Token"
How can I actually install a "cluster-wide" library on these credential-passthrough clusters?
(The general ADLS mount uses those credentials to mount the data lake.)
This happens on both standard and high-concurrency clusters.
05-24-2022 01:45 AM
Hi @Alon Nisser, here is a similar issue on Stack Overflow. Please let us know if that helps.
05-24-2022 01:28 PM
Nope @Kaniz Fatma, that's not actually my question. I know how to install a library on a cluster and do it quite a lot. The question is how to install a library stored on the data lake (as a wheel via DBFS) on a credential-passthrough cluster.
05-25-2022 01:28 AM
Hi @Alon Nisser, thank you for the clarification. I might have misinterpreted the question.
05-24-2022 01:29 PM
For now I have a hack: copy the library from the data lake to root DBFS, and install it from there. But I don't like it.
05-25-2022 04:14 AM
Hi @Alon Nisser, I'm glad you have a workaround for the time being, and thank you for sharing it on our platform. Can you tell us why you're dissatisfied with it?
05-25-2022 06:34 AM
Hi @Kaniz Fatma, I am facing the same issue. I am trying to use a job cluster with credential passthrough enabled to deploy a job, but library installation fails with the same exception:
"Message: com.google.common.util.concurrent.UncheckedExecutionException: com.databricks.backend.daemon.data.client.adl.AzureCredentialNotFoundException: Could not find ADLS Gen2 Token"
Where should the token be added, or am I missing something?
05-25-2022 01:29 PM
@Nancy Gupta, as far as I can trace this issue, the token isn't set up yet while the cluster is starting; I assume passthrough credentials do work once the cluster has started normally?
My hack was to copy the library to root DBFS (I created a new folder there) using another group, and installing from that location does work.
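A minimal sketch of that workaround, assuming hypothetical mount and folder names (`/mnt/datalake/libs`, `/FileStore/cluster-libs`, and the wheel filename are all placeholders to adapt). `dbutils` only exists inside a Databricks notebook, so the actual copy call is shown as a comment:

```python
def root_dbfs_copy_plan(wheel_name: str,
                        mount_dir: str = "dbfs:/mnt/datalake/libs",
                        dest_dir: str = "dbfs:/FileStore/cluster-libs") -> tuple:
    """Build (source, target) paths for copying a wheel from the
    passthrough-mounted lake onto root DBFS, where cluster library
    installation does not depend on an ADLS Gen2 token."""
    return f"{mount_dir}/{wheel_name}", f"{dest_dir}/{wheel_name}"

src, dst = root_dbfs_copy_plan("mylib-0.1-py3-none-any.whl")
print(src)  # dbfs:/mnt/datalake/libs/mylib-0.1-py3-none-any.whl
print(dst)  # dbfs:/FileStore/cluster-libs/mylib-0.1-py3-none-any.whl

# In a Databricks notebook, run on a cluster/identity that can read the mount:
# dbutils.fs.cp(src, dst)
# Then install dst as a cluster library (Libraries UI, Libraries API,
# or the job spec's libraries field).
```

The point of the indirection is that root DBFS is readable by the library installer without a passthrough token, while the mounted lake path is not.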
05-26-2022 02:26 AM
Hi @Nancy Gupta, were you able to replicate the workaround provided by @Alon Nisser?
05-26-2022 06:28 AM
@Kaniz Fatma, yes, but that is just a workaround; it would be great to get a proper solution for this!
Also, within the job, any read from ADLS again fails with the same error.
05-31-2022 04:08 AM
@Kaniz Fatma, any solutions, please?
05-31-2022 06:16 AM
Hi @Nancy Gupta,
By design, the ADF linked service access token is not passed through to the notebook activity; this is a known limitation. You should instead use the credentials inside the notebook activity or store them in a key vault.
Reference: ADLS using AD credential passthrough - limitations.
Hope this helps. Do let us know if you have any further queries.
06-14-2022 09:41 AM
Hi @Nancy Gupta, we haven't heard back from you on my last response, and I was checking to see whether you have a resolution yet. If you have a solution, please share it with the community, as it can be helpful to others. Otherwise, we will respond with more details and try to help.
06-16-2022 03:36 AM
Hello @Alon Nisser @Nancy Gupta,
Installing libraries using passthrough credentials is currently not supported.
You need the following configs on the cluster:
fs.azure.account...
We can file a feature request for this.
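Presumably the truncated `fs.azure.account...` keys above refer to the standard service-principal (OAuth) Spark configs documented for ADLS Gen2 access on Azure Databricks; a sketch of what those cluster configs typically look like, with `<storage-account>`, `<application-id>`, `<scope>`, `<service-credential-key>`, and `<tenant-id>` as placeholders:

```
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net {{secrets/<scope>/<service-credential-key>}}
fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token
```

Note these configs authenticate as a fixed service principal rather than as the interactive user, which is a different access model from credential passthrough.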
06-16-2022 11:35 AM
Sorry, I can't figure this out. The link you've added is irrelevant for passthrough credentials; if we add those configs, the cluster won't be a passthrough cluster anymore. Is there a way to add this for a specific folder only, while keeping passthrough for the rest?