02-09-2023 04:45 AM
Using Databricks Runtime 12.0, when attempting to mount an Azure blob storage container, I'm getting the following exception:
`IllegalArgumentException: Unsupported Azure Scheme: abfss`
```python
dbutils.fs.mount(
    source="abfss://container@my-storage-account.dfs.core.windows.net/",
    mount_point="/mnt/my-mount-point"
)
```
I can successfully list files using `dbutils.fs.ls`; it's just the mounting that does not work.
02-09-2023 05:25 AM
Don't you have to pass some config like `fs.azure.account.auth.type`, etc.?
02-09-2023 05:30 AM
I have done that, following https://docs.databricks.com/dbfs/mounts.html#mount-adls-gen2-or-blob-storage-with-abfs. Without the credential configs I get other errors.
02-09-2023 05:37 AM
You did not activate Unity Catalog by any chance?
02-09-2023 05:39 AM
Not as far as I can tell, unless it's activated by default or something
02-09-2023 05:51 AM
It is not activated by default.
I still believe it has something to do with your config, as I can mount and unmount without any issues on 12.1.
Did you pass all the necessary configuration as explained in the docs?
```python
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": <SP id>,
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope=<SCOPE>, key=<KEY>),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<TENANT>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://container@storage.dfs.core.windows.net",
    mount_point="/mnt/mount",
    extra_configs=configs,
)
```
I use Databricks secrets to fetch the secret from Azure Key Vault.
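As a quick sanity check before calling `dbutils.fs.mount`, you can verify that a configs dict carries every key the OAuth provider needs. This is a hypothetical helper in plain Python, not part of dbutils; the key names match the snippet above.

```python
# Hypothetical sanity check: confirm an extra_configs dict carries every
# setting the ABFS OAuth client-credentials flow expects before mounting.
REQUIRED_OAUTH_KEYS = (
    "fs.azure.account.auth.type",
    "fs.azure.account.oauth.provider.type",
    "fs.azure.account.oauth2.client.id",
    "fs.azure.account.oauth2.client.secret",
    "fs.azure.account.oauth2.client.endpoint",
)

def missing_oauth_keys(configs: dict) -> list:
    """Return the required OAuth config keys absent from `configs`."""
    return [key for key in REQUIRED_OAUTH_KEYS if key not in configs]

# Example with an incomplete dict: the token endpoint is missing.
partial = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "00000000-0000-0000-0000-000000000000",
    "fs.azure.account.oauth2.client.secret": "placeholder",
}
print(missing_oauth_keys(partial))  # → ['fs.azure.account.oauth2.client.endpoint']
```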
02-09-2023 06:26 AM
You are right! I tweaked some configs and it works now. The exception message was confusing.
Thanks!
03-13-2023 09:17 AM
What configs did you tweak? I'm having the same issue.
01-03-2024 06:09 PM
In my case, I had copied the same parameters used to access ADLS with a service principal, so the client id, client secret, and client endpoint keys had ".<storage-account>.dfs.core.windows.net" appended to them. Once I removed that suffix, the code worked properly.
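A minimal sketch of that fix, with an illustrative account name: per-account Spark config keys (the form used with `spark.conf.set`) carry a ".<storage-account>.dfs.core.windows.net" suffix, while `extra_configs` for `dbutils.fs.mount` uses the generic key names, so the suffix has to be stripped.

```python
# Illustrative sketch: strip the per-account suffix from config keys copied
# from spark.conf.set-style settings, so they match what extra_configs expects.
ACCOUNT_SUFFIX = ".mystorageaccount.dfs.core.windows.net"  # hypothetical account

def strip_account_suffix(configs: dict, suffix: str) -> dict:
    """Drop a per-account suffix from every config key that ends with it."""
    return {
        (key[: -len(suffix)] if key.endswith(suffix) else key): value
        for key, value in configs.items()
    }

copied = {
    "fs.azure.account.auth.type" + ACCOUNT_SUFFIX: "OAuth",
    "fs.azure.account.oauth2.client.id" + ACCOUNT_SUFFIX: "00000000-0000-0000-0000-000000000000",
}
fixed = strip_account_suffix(copied, ACCOUNT_SUFFIX)
print(sorted(fixed))  # → ['fs.azure.account.auth.type', 'fs.azure.account.oauth2.client.id']
```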