Are Jobs not supported on clusters with Azure Data Lake Storage credential passthrough enabled?

nancy_g
New Contributor III
1 ACCEPTED SOLUTION

Accepted Solutions

User16764241763
Honored Contributor

Hello @Nancy Gupta

Jobs passthrough is currently in private preview. Could you please reach out to your Databricks account representative to help you with onboarding?

Alternatively, please raise a support case with us so we can assist you.


6 REPLIES

Kaniz
Community Manager

Hi @Nancy Gupta, there is a similar discussion here. Please let us know if this helps.

nancy_g
New Contributor III

I found a documentation which mentions that Jobs are not supported, https://github.com/hurtn/datalake-ADLS-access-patterns-with-Databricks/blob/master/readme.md#Pattern...

I wanted to confirm whether this holds true and whether credential passthrough is supported only for notebooks?


Juan
New Contributor II

It's utterly ridiculous that jobs are not supported with this as standard. Am I left to execute jobs manually?

Kaniz
Community Manager

Hi @Juan-Paul Hynek, please raise a support case with us so that we can assist you.

Rostislaw
New Contributor III

Right now the feature seems to be publicly available. It is possible to schedule jobs with ADLS passthrough enabled without having to provide service principal credentials.

However, I wonder how that works behind the scenes. When working interactively with notebooks, "passthrough" refers to the fact that the user is logged in, so an already-available access token can be used to access ADLS. Job clusters run unattended, and there is no such access token available...

Does anybody know which OAuth2 flow Databricks uses in order to obtain the access token of the job owner?
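For anyone trying this out: on Azure Databricks, credential passthrough is switched on per cluster via the `spark.databricks.passthrough.enabled` Spark conf. A minimal Jobs API job definition with a passthrough-enabled job cluster might look like the sketch below. The job name, notebook path, node type, and Spark version are illustrative placeholders, not values from this thread:

```json
{
  "name": "adls-passthrough-job",
  "new_cluster": {
    "spark_version": "10.4.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2,
    "spark_conf": {
      "spark.databricks.passthrough.enabled": "true"
    }
  },
  "notebook_task": {
    "notebook_path": "/Jobs/read-from-adls"
  }
}
```

With passthrough enabled, the task's notebook can read `abfss://` paths using the job owner's Azure AD identity, with no service principal secrets stored in the cluster configuration.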
