Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Accessing blob storage from Databricks: 403 error, request not authorized

turagittech
New Contributor II

Hi,

I am trying to access a blob storage container to retrieve files, and it's throwing this error:

: Operation failed: "This request is not authorized to perform this operation using this resource type.", 403, GET,

I have tried a SAS key at both the container and storage account levels.

These are the parameters in my SAS token:

 

sp=rli
&st=2025-01-13T11:28:48Z
&se=2026-05-31T19:28:48Z
&spr=https
&sv=2022-11-02
&sr=c
&sig=

If it isn't the SAS token settings, maybe someone can suggest something else to make this work.

Does anyone have a screenshot of the correct privileges needed to list and read files from the blob to make this work?

This is my code:

from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient
#from azure.keyvault.secrets import SecretClient
#from azure.identity import DefaultAzureCredential
from pyspark.sql import SparkSession
from pyspark.sql import DataFrame

def list_blobs(spark: SparkSession):
    # List the files in the audits container via the ABFS driver
    files = dbutils.fs.ls("abfss://audits@storagedev1.blob.core.windows.net/")
    display(files)

if __name__ == "__main__":
    spark = SparkSession \
        .builder \
        .appName("Python Spark SQL data source example") \
        .getOrCreate()
    #Key1 = "Auditkey"
    #Key2 = "auditkey2"
    key_scope = "DataBricksDevSecrets"
    key_secret = "auditkey2"
    storage_account = "storagedev1"

    # Spark property names for fixed SAS token auth against the blob endpoint
    fs_sas_fixed_string = "fs.azure.sas.fixed.token." + storage_account + ".blob.core.windows.net"
    fs_auth_string = "fs.azure.account.auth.type." + storage_account + ".blob.core.windows.net"
    fs_sas_provider_string = "fs.azure.sas.token.provider.type." + storage_account + ".blob.core.windows.net"

    # Authenticate with the SAS token stored in the Databricks secret scope
    spark.conf.set(fs_auth_string, "SAS")
    spark.conf.set(fs_sas_provider_string, "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
    spark.conf.set(fs_sas_fixed_string, dbutils.secrets.get(scope=key_scope, key=key_secret))

    list_blobs(spark)

Allia
Databricks Employee

@turagittech Please check the article below for possible methods of connecting to Azure Data Lake Storage Gen2 and Blob Storage:

https://docs.databricks.com/en/connect/storage/azure-storage.html#set-spark-properties-to-configure-...

To set Spark properties, use the following snippet in a cluster’s Spark configuration or a notebook:

spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")

spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")

spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", dbutils.secrets.get(scope="<scope>", key="<sas-token-key>"))

Alliak

turagittech
New Contributor II

That delivers the following message: 

: org.apache.hadoop.fs.FileAlreadyExistsException: Operation failed: "This endpoint does not support BlobStorageEvents or SoftDelete. Please disable these account features if you would like to use this endpoint.", 409, GET

 

Now, can someone explain why the abfss examples only use the dfs endpoint, and does ABFS need a hierarchical namespace enabled on the storage account?

Allia
Databricks Employee

@turagittech Are you able to access the Blob storage after disabling soft delete?

Alliak

turagittech
New Contributor II

No. I also regenerated the SAS key, and now I'm getting this error:

: Operation failed: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.", 403, GET

saurabh18cs
Valued Contributor III

Can you try this for the specific container?

# Configure the Spark session to use the SAS key for Data Lake Storage Gen2
spark.conf.set(
    f"fs.azure.sas.{container_name}.{storage_account_name}.dfs.core.windows.net",
    sas_key
)
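For context, a sketch of how those variables might be filled in, using the names from the original post (substitute your own):

container_name = "audits"
storage_account_name = "storagedev1"
sas_key = dbutils.secrets.get(scope="DataBricksDevSecrets", key="auditkey2")

# Per-container SAS property as suggested above
spark.conf.set(
    f"fs.azure.sas.{container_name}.{storage_account_name}.dfs.core.windows.net",
    sas_key
)

# Then try listing the container through the dfs endpoint
files = dbutils.fs.ls(f"abfss://{container_name}@{storage_account_name}.dfs.core.windows.net/")
display(files)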

turagittech
New Contributor II

I have tested that with no improvement. I have also tried a service principal to see if a different error message would help pinpoint the issue.

I do have a question: must we use the dfs URL to access blob storage with abfss? Is that only enabled when you configure a hierarchical namespace? Or is there a way to enable it without one? I am not resolving the dfs host in DNS.
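One possible alternative, if the storage account does not have a hierarchical namespace enabled (so the dfs endpoint never resolves), is the legacy wasbs:// driver against the blob endpoint. A minimal sketch, again using the hypothetical names from earlier in the thread:

container = "audits"
account = "storagedev1"

# Per-container SAS token for the legacy wasbs:// (WASB) driver on the blob endpoint
spark.conf.set(
    f"fs.azure.sas.{container}.{account}.blob.core.windows.net",
    dbutils.secrets.get(scope="DataBricksDevSecrets", key="auditkey2")
)

files = dbutils.fs.ls(f"wasbs://{container}@{account}.blob.core.windows.net/")
display(files)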
