05-25-2022 08:03 AM
Accepted Solutions
06-05-2022 08:57 PM
Hello @Nancy Gupta
Jobs passthrough is currently in private preview. Could you please reach out to your Databricks account representative to help you with onboarding?
Alternatively, please raise a support case with us so we can assist you.
05-26-2022 08:14 AM
I found documentation that mentions jobs are not supported: https://github.com/hurtn/datalake-ADLS-access-patterns-with-Databricks/blob/master/readme.md#Pattern...
I wanted to confirm whether this holds true and whether credential passthrough is supported only for notebooks.
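For context, this is roughly what passthrough access looks like in a notebook: on a cluster with credential passthrough enabled, you read ADLS Gen2 directly without configuring any credentials. The storage account, container, and path below are placeholders:

```python
# Minimal sketch, assuming a Databricks notebook on a cluster with
# ADLS credential passthrough enabled: no account key or service
# principal is configured; the logged-in user's Azure AD identity
# is forwarded to storage. `spark` is provided by the Databricks
# runtime; all names below are hypothetical.
df = spark.read.format("parquet").load(
    "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/path/to/data"
)
df.show()
```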
06-14-2022 05:56 AM
It is utterly ridiculous that jobs are not supported with this as standard. Am I left to execute jobs manually?
09-16-2022 01:40 AM
Right now the feature seems to be publicly available. It is possible to schedule jobs with ADLS passthrough enabled without having to provide service principal credentials.
However, I wonder how that works behind the scenes. When working interactively with notebooks, "passthrough" refers to the fact that the user is logged in, so their already-available access token can be used to access ADLS. Job clusters run unattended, and there is no such access token available...
Does anybody know which OAuth2 flow Databricks uses to obtain the access token of the job owner?
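For comparison, here is a rough sketch of the explicit service principal setup (OAuth2 client credentials flow) that a job cluster would otherwise need, and which passthrough lets you skip. All account names, IDs, and the secret scope below are placeholders:

```python
# Sketch of the explicit OAuth2 client-credentials configuration for
# ABFS access from a job cluster, using the standard Hadoop ABFS
# settings. `spark` and `dbutils` are provided by the Databricks
# runtime; every name, ID, and secret reference here is hypothetical.
storage_account = "mystorageaccount"
suffix = f"{storage_account}.dfs.core.windows.net"

spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{suffix}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.id.{suffix}",
    "<service-principal-application-id>",
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.secret.{suffix}",
    dbutils.secrets.get(scope="my-scope", key="sp-client-secret"),
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{suffix}",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
)
```

With passthrough enabled on the job cluster, none of this configuration is needed, which is exactly why I am curious how the runtime obtains a token for the job owner when no interactive login occurs.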

