02-09-2023 04:45 AM
Using Databricks Runtime 12.0, when attempting to mount an Azure blob storage container, I'm getting the following exception:
`IllegalArgumentException: Unsupported Azure Scheme: abfss`
dbutils.fs.mount(
    source="abfss://container@my-storage-account.dfs.core.windows.net/",
    mount_point="/mnt/my-mount-point"
)
I can successfully list files using `dbutils.fs.ls`; it's only the mounting that fails.
02-09-2023 05:25 AM
Don't you have to pass some configs like `fs.azure.account.auth.type`, etc.?
02-09-2023 05:30 AM
I have done that, following https://docs.databricks.com/dbfs/mounts.html#mount-adls-gen2-or-blob-storage-with-abfs. Without credentials configs I get other errors.
02-09-2023 05:37 AM
You didn't activate Unity Catalog by any chance?
02-09-2023 05:39 AM
Not as far as I can tell, unless it's activated by default or something
02-09-2023 05:51 AM
it is not activated by default.
I still believe it has something to do with your config as I can mount and unmount without any issues on 12.1
Do you pass all the necessary configuration as explained in the help?
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": <SP id>,
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope=<SCOPE>, key=<KEY>),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<TENANT>/oauth2/token"
}
dbutils.fs.mount(
    source="abfss://container@storage.dfs.core.windows.net",
    mount_point="/mnt/mount",
    extra_configs=configs
)
I use Databricks secrets to fetch a secret from Azure Key Vault.
02-09-2023 06:26 AM
You are right! I tweaked some configs and it works now. The exception message was confusing.
Thanks!
03-13-2023 09:17 AM
What configs did you tweak? I'm having the same issue.
01-03-2024 06:09 PM
In my case, I copied the same parameters used to access ADLS with a service principal, so the client_id, client_secret, and client_endpoint config keys had ".<storage-account>.dfs.core.windows.net" appended. Once I removed that suffix, the mount worked properly.
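To illustrate the difference described above, here is a minimal sketch (the storage-account name is a placeholder, not taken from the thread): per-account Spark config keys used for direct `abfss://` access carry a storage-account suffix, while the `extra_configs` dict passed to `dbutils.fs.mount()` uses the suffix-free key names.

```python
# Sketch only; the account name is a hypothetical placeholder.
account = "mystorageaccount"
suffix = f".{account}.dfs.core.windows.net"

# Keys set on the Spark conf for direct abfss:// reads carry the suffix:
spark_conf_keys = [
    f"fs.azure.account.auth.type{suffix}",
    f"fs.azure.account.oauth.provider.type{suffix}",
    f"fs.azure.account.oauth2.client.id{suffix}",
    f"fs.azure.account.oauth2.client.secret{suffix}",
    f"fs.azure.account.oauth2.client.endpoint{suffix}",
]

# extra_configs for dbutils.fs.mount() should use the bare key names:
mount_config_keys = [k.removesuffix(suffix) for k in spark_conf_keys]

print(mount_config_keys[0])  # fs.azure.account.auth.type
```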