08-07-2022 10:40 AM
I have tried many of the answers from the internet and Stack Overflow.
I had already created the config section before this step; that part passed, but the step below is not executing.
08-07-2022 06:26 PM
Isn't the SAS token meant to be passed as part of the URL? I can't see it in your example:
https://docs.microsoft.com/en-us/azure/storage/common/media/storage-sas-overview/sas-storage-uri.png
https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview
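As the linked overview shows, the SAS token is carried as the query string of the storage resource URL. A minimal sketch of that structure (the account, container, path, and token values below are placeholders, not from the original post):

```python
# Sketch: a SAS URI is the resource URL plus the SAS token as its query string.
account = "mystorageaccount"   # hypothetical storage account name
container = "mycontainer"      # hypothetical container
blob_path = "data/file.csv"    # hypothetical blob path
# A SAS token is a set of query parameters generated in the Azure portal
# (the sig=... value here is a placeholder, not a real signature):
sas_token = "sv=2021-06-08&ss=b&srt=co&sp=rl&sig=PLACEHOLDER"

url = f"https://{account}.blob.core.windows.net/{container}/{blob_path}?{sas_token}"
print(url)
```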
08-08-2022 06:02 AM
@santh s, could you please try the steps in the notebook below (the data location, type, and account access key can be set before reading the data):
https://docs.databricks.com/_static/notebooks/data-import/azure-blob-store.html
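The pattern in that notebook is to register the storage account access key with the Spark session and then read via a `wasbs://` URL. A sketch of the two pieces involved (account and container names are placeholders; the actual `spark.conf.set` and read calls are shown as comments because they require a live Databricks Spark session and a real key):

```python
# Sketch of the account-key pattern from the linked Databricks notebook.
storage_account = "mystorageaccount"   # hypothetical account name
container = "mycontainer"              # hypothetical container

# Spark config key under which the account access key is registered:
conf_key = f"fs.azure.account.key.{storage_account}.blob.core.windows.net"

# wasbs:// URL for the data to read:
source_url = f"wasbs://{container}@{storage_account}.blob.core.windows.net/path/to/data.csv"

# In a notebook (requires a Spark session and a real key, ideally from a secret scope):
# spark.conf.set(conf_key, dbutils.secrets.get(scope="my-scope", key="storage-key"))
# df = spark.read.csv(source_url, header=True)
```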
09-07-2022 05:29 AM
Hi @santh s,
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!
01-10-2024 06:52 AM
We were getting this problem when using directory-scoped SAS tokens. While a number of issues can cause this error, one explanation is an undocumented Spark setting that must be enabled on the cluster for directory-scoped SAS tokens to work. In your cluster's Spark config, add the following:
spark.hadoop.fs.azure.account.hns.enabled true
That solved it for us.
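For context, that line would typically sit alongside the ABFS SAS auth settings in the cluster's Spark config (Advanced options > Spark). A hedged sketch, assuming the fixed-SAS-token pattern from the Databricks ADLS docs applies here (the account name and token are placeholders; only the last line is the fix from this post):

```
spark.hadoop.fs.azure.account.auth.type.mystorageaccount.dfs.core.windows.net SAS
spark.hadoop.fs.azure.sas.token.provider.type.mystorageaccount.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider
spark.hadoop.fs.azure.sas.fixed.token.mystorageaccount.dfs.core.windows.net <your-sas-token>
spark.hadoop.fs.azure.account.hns.enabled true
```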