Data Engineering
Unsupported Azure Scheme: abfss

vroste
New Contributor III

Using Databricks Runtime 12.0, when attempting to mount an Azure blob storage container, I'm getting the following exception:

`IllegalArgumentException: Unsupported Azure Scheme: abfss`

dbutils.fs.mount(
    source="abfss://container@my-storage-account.dfs.core.windows.net/",
    mount_point="/mnt/my-mount-point"
)

I can successfully list files using `dbutils.fs.ls`; it's just the mounting that does not work.
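For context, the error complains about the URI scheme, and the `abfss://` source URI encodes the container and storage account in its authority part. A quick sketch using only Python's standard library (not Databricks-specific; the account name matches the example above) shows how it breaks down:

```python
from urllib.parse import urlparse

# Example URI matching the mount source above.
uri = "abfss://container@my-storage-account.dfs.core.windows.net/"

parsed = urlparse(uri)
# The netloc is "<container>@<account>.dfs.core.windows.net".
container, _, host = parsed.netloc.partition("@")

print(parsed.scheme)  # abfss
print(container)      # container
print(host)           # my-storage-account.dfs.core.windows.net
```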


8 REPLIES

-werners-
Esteemed Contributor III

Don't you have to pass some config like `fs.azure.account.auth.type`, etc.?

vroste
New Contributor III

I have done that, following https://docs.databricks.com/dbfs/mounts.html#mount-adls-gen2-or-blob-storage-with-abfs. Without the credential configs I get other errors.

-werners-
Esteemed Contributor III

You didn't activate Unity Catalog by any chance?

vroste
New Contributor III

Not as far as I can tell, unless it's activated by default or something

-werners-
Esteemed Contributor III

It is not activated by default.

I still believe it has something to do with your config, as I can mount and unmount without any issues on 12.1.

Did you pass all the necessary configuration as explained in the docs?

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<SP id>",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<SCOPE>", key="<KEY>"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<TENANT>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://container@storage.dfs.core.windows.net/",
    mount_point="/mnt/mount",
    extra_configs=configs,
)

I use Databricks secrets to fetch the secret from Azure Key Vault.
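If the mount still fails, it can help to sanity-check the config dict before calling `dbutils.fs.mount`. A minimal sketch in plain Python (the required key names are taken from the snippet above; the helper and the incomplete example dict are hypothetical):

```python
# Required keys for an OAuth mount, per the snippet above.
REQUIRED_KEYS = {
    "fs.azure.account.auth.type",
    "fs.azure.account.oauth.provider.type",
    "fs.azure.account.oauth2.client.id",
    "fs.azure.account.oauth2.client.secret",
    "fs.azure.account.oauth2.client.endpoint",
}

def missing_mount_configs(configs: dict) -> set:
    """Return the set of required OAuth config keys absent from `configs`."""
    return REQUIRED_KEYS - configs.keys()

# Example: a dict that forgot the client secret.
incomplete = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<SP id>",
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<TENANT>/oauth2/token",
}
print(missing_mount_configs(incomplete))  # {'fs.azure.account.oauth2.client.secret'}
```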

vroste
New Contributor III

You are right! I tweaked some configs and it works now. The exception message was confusing.

Thanks!

AdamRink
New Contributor III

What configs did you tweak? I'm having the same issue.

Adriano
New Contributor II

In my case, I had copied the same parameters used to access ADLS directly with a service principal, so the client.id, client.secret, and client.endpoint config keys had ".<storage-account>.dfs.core.windows.net" appended to them. Once I removed that suffix, the code worked properly.
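The direct-access (non-mount) configs are set per storage account, so their key names carry a `.<storage-account>.dfs.core.windows.net` suffix that the mount configs must not have. A small sketch in plain Python that strips such a suffix (the helper and the example account name are made up for illustration):

```python
def strip_account_suffix(key: str) -> str:
    """Drop a trailing '.<account>.dfs.core.windows.net' from a config key, if present."""
    suffix = ".dfs.core.windows.net"
    if key.endswith(suffix):
        # Trim the domain, then drop the storage-account segment before it.
        trimmed = key[: -len(suffix)]
        return trimmed.rsplit(".", 1)[0]
    return key

# Hypothetical per-account key, as used for direct access:
direct_key = "fs.azure.account.oauth2.client.id.mystorageacct.dfs.core.windows.net"
print(strip_account_suffix(direct_key))  # fs.azure.account.oauth2.client.id
```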
