Accessing blob storage from Databricks: 403 error, request not authorized
3 weeks ago
Hi,
I am trying to access a blob storage container to retrieve files, and it is throwing this error:
: Operation failed: "This request is not authorized to perform this operation using this resource type.", 403, GET,
I have tried SAS keys at both the container and storage account levels.
These are the parameters I see in my SAS token:
```
sp=rli
&st=2025-01-13T11:28:48Z
&se=2026-05-31T19:28:48Z
&spr=https
&sv=2022-11-02
&sr=c
&sig=
```
If it's not the SAS token settings, maybe there is something else that can be suggested to make this work.
Does anyone have a screenshot of the correct privileges needed to list and read files from the blob?
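For comparison, here is a minimal sketch of generating a container-level SAS with read and list permissions using the azure-storage-blob SDK. The account name, container name, and key below are placeholders for illustration, not values confirmed in this thread.

```python
# Sketch only: generate a container SAS with read + list permissions.
# Account name, container name, and key are placeholders.
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

account_name = "storagedev1"            # assumed storage account name
container_name = "audits"               # assumed container name
account_key = "<storage-account-key>"   # fetch from a secret store, do not hard-code

sas_token = generate_container_sas(
    account_name=account_name,
    container_name=container_name,
    account_key=account_key,
    permission=ContainerSasPermissions(read=True, list=True),
    start=datetime.now(timezone.utc),
    expiry=datetime.now(timezone.utc) + timedelta(days=30),
)
print(sas_token)  # query-string form, e.g. "sp=rl&st=...&se=...&sig=..."
```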
This is my code
```python
from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient
#from azure.keyvault.secrets import SecretClient
#from azure.identity import DefaultAzureCredential
from pyspark.sql import SparkSession
from pyspark.sql import DataFrame


def list_blobs(spark: SparkSession):
    files = dbutils.fs.ls("abfss://audits@storagedev1.blob.core.windows.net/")
    display(files)


if __name__ == "__main__":
    spark = SparkSession \
        .builder \
        .appName("Python Spark SQL data source example") \
        .getOrCreate()

    #Key1 = "Auditkey"
    #Key2 = "auditkey2"
    key_scope = "DataBricksDevSecrets"
    key_secret = "auditkey2"
    storage_account = "storagedev1"

    fs_sas_fixed_string = "fs.azure.sas.fixed.token." + storage_account + ".blob.core.windows.net"
    fs_auth_string = "fs.azure.account.auth.type." + storage_account + ".blob.core.windows.net"
    fs_sas_provider_string = "fs.azure.sas.token.provider.type." + storage_account + ".blob.core.windows.net"

    spark.conf.set(fs_auth_string, "SAS")
    spark.conf.set(fs_sas_provider_string, "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
    spark.conf.set(fs_sas_fixed_string, dbutils.secrets.get(scope=key_scope, key=key_secret))
```
3 weeks ago
@turagittech Please check the article below for possible methods to connect to Azure Data Lake Storage Gen2 and Blob Storage.
To set Spark properties, use the following snippet in a cluster’s Spark configuration or a notebook:
spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", dbutils.secrets.get(scope="<scope>", key="<sas-token-key>"))
3 weeks ago
That delivers the following message:
: org.apache.hadoop.fs.FileAlreadyExistsException: Operation failed: "This endpoint does not support BlobStorageEvents or SoftDelete. Please disable these account features if you would like to use this endpoint.", 409, GET
Now, can someone explain why the abfss examples only use the dfs endpoint? And does ABFS need hierarchical namespace enabled?
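Since the 409 message explicitly asks for soft delete to be disabled, here is a sketch of one way to turn off blob soft delete through the blob service properties. This is an assumption about a possible next step, not a confirmed fix, and the account URL and credential are placeholders.

```python
# Sketch only (assumed next step, not a confirmed fix): disable blob soft delete
# via the blob service properties. Account URL and credential are placeholders.
from azure.storage.blob import BlobServiceClient, RetentionPolicy

service = BlobServiceClient(
    account_url="https://storagedev1.blob.core.windows.net",  # assumed account
    credential="<storage-account-key>",                       # keep in a secret scope
)

# Turn off the delete retention (soft delete) policy for blobs.
service.set_service_properties(delete_retention_policy=RetentionPolicy(enabled=False))
```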
3 weeks ago
@turagittech Are you able to access the Blob storage after disabling soft delete?
3 weeks ago
No. I also regenerated the SAS key, and now I am getting this error:
: Operation failed: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.", 403, GET
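One thing that may be worth checking, as an assumption rather than a confirmed cause: if the stored secret includes a leading "?" or stray whitespace, the fixed SAS token can fail signature validation. Here is a diagnostic sketch that strips both before setting the configuration, reusing the scope and key names from the original post.

```python
# Diagnostic sketch (assumption, not a confirmed cause): strip a leading "?" and
# whitespace from the stored SAS token before setting the Spark configuration.
# Scope, key, and account names are the ones from the original post.
raw_token = dbutils.secrets.get(scope="DataBricksDevSecrets", key="auditkey2")
sas_token = raw_token.strip().lstrip("?")

storage_account = "storagedev1"
spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "SAS")
spark.conf.set(
    f"fs.azure.sas.token.provider.type.{storage_account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider",
)
spark.conf.set(f"fs.azure.sas.fixed.token.{storage_account}.dfs.core.windows.net", sas_token)
```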
3 weeks ago
Can you try this for a specific container?
3 weeks ago
I have tested that, with no improvement. I have also tried a service principal to see whether a different error message would help pinpoint the issue.
I do have a question: must we use the dfs URL to access blob storage with abfss? Is that only enabled when you configure a hierarchical namespace, or is there a way to enable it without one? I am not resolving the dfs host in DNS.
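To narrow down the DNS part of that question, here is a small diagnostic sketch that simply checks whether the blob and dfs hostnames resolve. The account name is the one from the original post; a resolution failure for the dfs host would be consistent with the behaviour described above.

```python
# Diagnostic sketch: check whether the blob and dfs endpoints resolve in DNS.
# "storagedev1" is the account name from the original post.
import socket

account = "storagedev1"
for suffix in ("blob.core.windows.net", "dfs.core.windows.net"):
    host = f"{account}.{suffix}"
    try:
        addrs = {info[4][0] for info in socket.getaddrinfo(host, 443)}
        print(f"{host} resolves to {sorted(addrs)}")
    except socket.gaierror as exc:
        print(f"{host} does not resolve: {exc}")
```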