filipniziol
Esteemed Contributor

Based on the documentation below, you will not be able to do so:
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-external-locations

A storage credential has a one-to-many relationship with external locations: one credential can back many locations, but every external location must reference a storage credential.
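This dependency is visible directly in the SQL syntax from the linked reference: `CREATE EXTERNAL LOCATION` takes a `WITH (STORAGE CREDENTIAL ...)` clause naming an already-existing credential (credentials themselves are typically created via Catalog Explorer, the API, or Terraform). A minimal sketch, where the location name, bucket path, and credential name are hypothetical placeholders:

```sql
-- The storage credential (here, my_s3_credential) must already exist
-- and wrap an IAM role with access to the bucket.
CREATE EXTERNAL LOCATION IF NOT EXISTS my_external_location
  URL 's3://my-bucket/some/path'
  WITH (STORAGE CREDENTIAL my_s3_credential)
  COMMENT 'External location governed by Unity Catalog';
```

There is no variant of this statement without the credential clause, which is why a public, credential-less S3 bucket cannot be registered as an external location.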


Also, this article on creating storage credentials lists extra requirements: for example, the S3 bucket must be in the same region as the workspaces you want to access the data from, and the bucket name cannot contain dots:

https://docs.databricks.com/en/connect/unity-catalog/storage-credentials.html

Also, it makes sense not to allow public S3 buckets: you effectively need to own the cloud storage location so that you can grant privileges on it as part of Unity Catalog permission management. If the bucket is public, you have no such control over it.