
Unsupported Azure Scheme: abfss

vroste
New Contributor III

Using Databricks Runtime 12.0, when attempting to mount an Azure blob storage container, I'm getting the following exception:

`IllegalArgumentException: Unsupported Azure Scheme: abfss`

dbutils.fs.mount(
    source="abfss://container@my-storage-account.dfs.core.windows.net/",
    mount_point="/mnt/my-mount-point"
)

I can successfully list files using `dbutils.fs.ls`; it's just the mounting that does not work.
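
For reference, the listing that does work looks roughly like this (container and account names are placeholders):

# This call succeeds against the same path, so credentials and networking seem fine; only the mount fails.
display(dbutils.fs.ls("abfss://container@my-storage-account.dfs.core.windows.net/"))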


-werners-
Esteemed Contributor III

Don't you have to pass some config like fs.azure.account.auth.type, etc.?

vroste
New Contributor III

I have done that, following https://docs.databricks.com/dbfs/mounts.html#mount-adls-gen2-or-blob-storage-with-abfs. Without the credential configs I get other errors.

-werners-
Esteemed Contributor III

You didn't activate Unity Catalog by any chance?

vroste
New Contributor III

Not as far as I can tell, unless it's activated by default or something

-werners-
Esteemed Contributor III (Accepted Solution)

It is not activated by default.

I still believe it has something to do with your config, as I can mount and unmount without any issues on 12.1.

Do you pass all the necessary configuration as explained in the help?

configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id": <SP id>,
           "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope = <SCOPE> , key = <KEY>),
           "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<TENANT>/oauth2/token"}
  
dbutils.fs.mount(
  source = "abfss://container@storage.dfs.core.windows.net",
  mount_point = "/mnt/mount",
  extra_configs = configs)

I use Databricks secrets to fetch the client secret from Azure Key Vault.
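
As a quick sanity check (just a sketch, reusing the mount point from the example above), you can list the mount and unmount it again if needed:

# Confirm the mount works by listing it...
display(dbutils.fs.ls("/mnt/mount"))

# ...and remove it again if you need to re-create it with different configs.
dbutils.fs.unmount("/mnt/mount")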

vroste
New Contributor III

You are right! I tweaked some configs and it works now. The exception message was confusing.

Thanks!

AdamRink
New Contributor III

What configs did you tweak? I'm having the same issue.

In my case, I had copied the same parameters used for direct ADLS access with a service principal, so the client_id, client_secret, and client_endpoint settings had ".<storage-account>.dfs.core.windows.net" in them. Once that suffix was removed, the code worked properly.
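
For illustration, a sketch of the difference (all names are placeholders): the account-suffixed keys are for direct access via the Spark conf, while dbutils.fs.mount expects the plain keys in extra_configs.

# Direct access: OAuth settings live on the Spark conf, with each key suffixed
# by the storage account, e.g.:
spark.conf.set(
    "fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net",
    "<application-id>")

# Mounting: dbutils.fs.mount takes the same settings via extra_configs, but with
# the plain, un-suffixed keys, e.g. "fs.azure.account.oauth2.client.id"
# (see the accepted answer above for the full dict).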
