Hi @bohemiaRDX, greetings!
Generally, this error occurs when the path has not been added as an external location with a storage credential. The cluster is likely trying to access storage that has neither a UC storage credential nor any non-UC access mechanism (such as access keys or a service principal) configured, and it eventually fails with the given error.
What’s Next:
- Verify whether the given storage location should be accessed via Unity Catalog or not.
- If the usage is Unity Catalog based, create an external location for the path backed by a storage credential that has access to the storage, and grant the required privileges on it.
- If the usage is non-UC based, ensure the relevant access configurations are in place. Access can be set up using OAuth 2.0 with an Azure service principal, a SAS token, or account keys. These are usually set at the notebook level, in the cluster Spark configuration via the UI, or in an init script. For a SQL warehouse, they are set in the SQL warehouse settings under data access configuration.
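For the non-UC path, a small sketch of the standard `fs.azure.account.*` Spark configuration entries for OAuth 2.0 with an Azure service principal may help. The helper below only builds the key/value pairs; the storage account, client, tenant, and secret values are placeholders you would normally pull from a secret scope rather than hard-code.

```python
def adls_oauth_conf(storage_account: str, client_id: str,
                    client_secret: str, tenant_id: str) -> dict:
    """Build the Spark conf entries for OAuth access to one ADLS Gen2 account.

    Uses the standard ABFS OAuth properties; all argument values here are
    placeholders for illustration only.
    """
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# In a notebook these would typically be applied with spark.conf.set, e.g.:
# for key, value in adls_oauth_conf("mystorageacct", app_id,
#                                   secret, tenant).items():
#     spark.conf.set(key, value)
```

The same key/value pairs can instead be pasted into the cluster's Spark config or a SQL warehouse's data access configuration, which avoids setting them per notebook.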
Refer to https://learn.microsoft.com/en-us/azure/databricks/storage/azure-storage#connect-to-azure-data-lake-... for more details.
Leave a like if this helps; follow-ups are appreciated.
Kudos,
Ayushi