
Databricks Azure Blob Storage access

tariq
New Contributor III

I am trying to access files stored in Azure blob storage and have followed the documentation linked below:

https://docs.databricks.com/external-data/azure-storage.html

I was able to mount the Azure blob storage on DBFS, but that method no longer seems to be recommended. So I tried to set up direct access via an abfss URI using SAS authentication.

spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", "<token>")

Now when I try to access any file using:

spark.read.load("abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/<path-to-data>")

I get the following error:

Operation failed: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.", 403, HEAD,

What needs to be changed for this to work?

5 REPLIES

Debayan
Esteemed Contributor III

Hi @Tarique Anwar, could you please refer to https://learn.microsoft.com/en-us/answers/questions/334786/azure-blob-storage-fails-to-authenticate-...
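
One way to narrow this down is to check whether the SAS token works at all outside Spark, for example with the azure-storage-blob SDK, roughly like this (account, container and token are placeholders):

# Rough sanity check of the SAS token, independent of the Spark/Databricks configuration.
from azure.storage.blob import ContainerClient

container = ContainerClient(
    account_url="https://<storage-account-name>.blob.core.windows.net",
    container_name="<container-name>",
    credential="<token>",  # the same SAS token used in the Spark conf
)

# A 403 here points to the token itself (permissions, expiry, allowed services/resource types)
# rather than to the Databricks side.
for blob in container.list_blobs():
    print(blob.name)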

Please let us know if this helps.

Anonymous
Not applicable

Hi @Tarique Anwar,

Does @Debayan Mukherjee's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly?

We'd love to hear from you.

Thanks!

ravinchi
New Contributor III

@Debayan Mukherjee @Tarique Anwar @Vidula Khanna

We're also facing this issue while creating a table from Databricks SQL.

create table test using delta location 'abfss://[container_name]@[storage_account].dfs.core.windows.net/'

We created an external location and a storage credential with the access connector ID, and granted the Storage Blob Data Contributor role to the access connector. It was working until the day before yesterday, but stopped working yesterday. We are getting the error below:

"AuthenticationFailed, "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature"

I followed the link shared (https://learn.microsoft.com/en-us/answers/questions/334786/azure-blob-storage-fails-to-authenticate-...) along with the Stack Overflow post it points to (https://stackoverflow.com/questions/24492790/azurestorage-blob-server-failed-to-authenticate-the-req...), but still no luck.

If we have to control the API version in the request header, isn't that something under Databricks' control, since we're using Databricks SQL?
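
For reference, the Unity Catalog setup described above is roughly equivalent to the following when run from a notebook (the same statements work in Databricks SQL; the location, credential and table names are placeholders, and the storage credential was created from the access connector beforehand):

# Rough sketch of the external location and table creation described above.
# All object names and the container/account values are placeholders.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS my_ext_location
    URL 'abfss://<container_name>@<storage_account>.dfs.core.windows.net/'
    WITH (STORAGE CREDENTIAL my_storage_credential)
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS test
    USING DELTA
    LOCATION 'abfss://<container_name>@<storage_account>.dfs.core.windows.net/'
""")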

Debayan
Esteemed Contributor III

Hi @Ravindra Ch, could you please check the firewall settings in Azure networking?

ravinchi
New Contributor III

The firewall settings are fine, and access is allowed from all networks. @Debayan Mukherjee