Connect to Blob storage "no credentials found for them in the configuration"
09-13-2018 09:56 AM
I'm working in a Databricks notebook backed by a Spark cluster and having trouble connecting to Azure Blob storage. I followed this link, specifically the section "Access Azure Blob Storage Directly - Set up an account access key". This runs without errors:
spark.conf.set(
    "fs.azure.account.key.<your-storage-account-name>.blob.core.windows.net",
    "<your-storage-account-access-key>"
)
But I get an error when I try to list the directory:
dbutils.fs.ls("wasbs://<your-container-name>@<your-storage-account-name>.blob.core.windows.net/<your-directory-name>")
shaded.databricks.org.apache.hadoop.fs.azure.AzureException: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Unable to access container <container name> in account <storage account name>.blob.core.windows.net using anonymous credentials, and no credentials found for them in the configuration.
Labels: Azure Databricks
11-12-2018 10:10 AM
I get the same error when trying to follow this Databricks example. It doesn't seem to be using the access credentials correctly, but I don't know how to fix it.
03-19-2019 09:15 AM
This happened to me as well. I discovered that the storage account and the Databricks cluster were in different regions, so I moved the storage account. That fixed the issue and it works now!
07-26-2019 03:43 AM
I had the same problem, but I was connecting via the RDD API (using my own Java adapter for Azure Blob storage). The page https://docs.databricks.com/spark/latest/data-sources/azure/azure-storage.html states at the bottom that you should create your cluster with a Hadoop configuration containing the credentials, either an access key or a SAS token; see the sketch below.
I tried it with the access key (I'm just experimenting) and it started working fine.
@annashetty, you might want to try the same.
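For reference, the cluster-level setting from that page looks roughly like this. It goes into the cluster's Spark config at creation time; the spark.hadoop. prefix is what forwards the property into the Hadoop configuration that the RDD API reads. The placeholders are yours to fill in:

spark.hadoop.fs.azure.account.key.<storage-account-name>.blob.core.windows.net <storage-account-access-key>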
02-03-2021 09:09 PM
I had this problem because I copy-pasted the config key and didn't replace one of the two <storage-account-name> placeholders. That's my fault, but it is a very long config key if your storage account name is kind of generic:
fs.azure.account.key.<STORAGE-ACCOUNT-NAME>.blob.core.windows.net
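One way to guard against that mistake is to build the config key and the wasbs:// URL from the same variable so they can't drift apart. A minimal sketch, with hypothetical names; it also assumes a Databricks secret scope holds the key instead of pasting it into the notebook:

# Hypothetical names; substitute your own.
storage_account = "mystorageaccount"
container = "mycontainer"
# Assumes a secret scope "my-scope" with the key stored under "storage-key".
access_key = dbutils.secrets.get(scope="my-scope", key="storage-key")

# One variable feeds both the config key and the URL, so they always match.
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.blob.core.windows.net",
    access_key,
)
dbutils.fs.ls(f"wasbs://{container}@{storage_account}.blob.core.windows.net/")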
04-27-2021 07:25 AM
I have been running into the same problem over and over. I'm now following what's written here (https://docs.databricks.com/data/data-sources/azure/azure-storage.html#access-azure-blob-storage-directly), but I always get:
shaded.databricks.org.apache.hadoop.fs.azure.AzureException: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Container containerfede in account storageaccountfede.blob.core.windows.net not found, and we can't create it using anonymous credentials, and no credentials found for them in the configuration
Both my cluster and blob storage are in the same region, so I'm assuming it must be the cluster's configuration.
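One quick sanity check before blaming the cluster (a sketch using the account name from your error message): read the setting back in the same notebook session and confirm the account name in the config key exactly matches the one in the wasbs:// URL.

# Prints the key if it was set in this session, otherwise "NOT SET".
print(spark.conf.get("fs.azure.account.key.storageaccountfede.blob.core.windows.net", "NOT SET"))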

