Hi guys,
I'm running a streaming read/write with Auto Loader from a StorageV2 (general purpose v2) account over abfss instead of wasbs. My checkpoint location is valid, the reader infers the file schema correctly, and Auto Loader samples 105 files to do so.
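For context, the stream is set up roughly like this (container, paths, file format, and target table are placeholders, with json used only as an example):

# Rough sketch of the Auto Loader stream; everything in angle brackets is a placeholder
df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("cloudFiles.schemaLocation", "abfss://<container>@<storage_account>.dfs.core.windows.net/<schema_path>")
      .load("abfss://<container>@<storage_account>.dfs.core.windows.net/<input_path>"))

(df.writeStream
   .option("checkpointLocation", "abfss://<container>@<storage_account>.dfs.core.windows.net/<checkpoint_path>")
   .toTable("<target_table>"))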
I have a valid SAS token with all permissions set, and the storage account is not behind a firewall (access is enabled from all networks). However, whenever I try to access the storage location over abfss I get the following error:
(shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.AbfsRestOperationException) Operation failed: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.", 403
The SAS token is set like this:
spark.conf.set("fs.azure.account.auth.type.<storage_account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage_account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage_account>.dfs.core.windows.net", <sas_token>)
I am currently running 15.4 LTS (includes Apache Spark 3.5.0, Scala 2.12) with Azure Data Lake Storage credential passthrough enabled. Soft delete for blobs is disabled on the storage account, and the SAS token has all possible permissions.
The same operation and setup work with wasbs (config for reference below), which leaves me wondering what the possible causes are and how to fix them. If anyone has encountered this issue or knows how to solve it without using an Azure service principal, I would appreciate the help. I've spent way too much time on this with no real solution.
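For reference, the working wasbs setup is roughly the following (the container-scoped SAS config for the Blob endpoint; placeholders as above):

# Working wasbs equivalent, roughly; the SAS token is set per container on the Blob endpoint
spark.conf.set("fs.azure.sas.<container>.<storage_account>.blob.core.windows.net", <sas_token>)
dbutils.fs.ls("wasbs://<container>@<storage_account>.blob.core.windows.net/<input_path>")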