08-29-2024 11:01 PM
I have a multi-tenant Azure app, and I am using its credentials to read files from an ADLS container on a Databricks cluster through a PySpark DataFrame.
I need to set the 'additionallyAllowedTenants' flag to '*' (or to a specific tenant_id of the multi-tenant app) in the Databricks cluster config or in the PySpark session.
In plain Python I can achieve this with the following lines of code:
from azure.identity import DefaultAzureCredential
default_credential = DefaultAzureCredential(additionally_allowed_tenants=['*'])
My question is: how can I achieve the same thing in the Databricks cluster config or in the PySpark session? Any leads are appreciated.
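For reference, this is a minimal sketch of the standard single-tenant service-principal (OAuth client-credentials) configuration for ADLS Gen2 in Spark; the storage account, application ID, client secret, and tenant ID are placeholders. The tenant only appears in the token endpoint URL, and none of these keys corresponds to 'additionallyAllowedTenants':

spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net", "<application-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net", "<client-secret>")
# The tenant appears only here, inside the token endpoint URL
spark.conf.set("fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")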
Accepted Solutions
12-30-2024 10:36 PM
Update: Spark does not currently have this feature.
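A possible workaround, not covered in this thread and only a sketch: read the file on the driver with the Azure SDK, where DefaultAzureCredential does accept additionally_allowed_tenants, and load the result into Spark afterwards. This assumes the azure-identity and azure-storage-file-datalake packages are installed on the cluster; the storage account, container, and file path below are placeholders.

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Credential that is allowed to authenticate against any tenant of the multi-tenant app
credential = DefaultAzureCredential(additionally_allowed_tenants=["*"])

service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=credential,
)

# Download one file on the driver; from here it can be loaded into pandas or Spark
file_client = service.get_file_system_client("<container>").get_file_client("path/to/file.csv")
raw_bytes = file_client.download_file().readall()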

