I originally set the Storage location in my DLT pipeline to abfss://{container}@{storageaccount}.dfs.core.windows.net/...

But when running the DLT pipeline I got the following error:

So I decided to leave the Storage location above blank and define the path parameter on @dlt.table instead:
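Something along these lines (table name and paths below are placeholders, simplified from my actual code):

```python
import dlt

# Minimal sketch -- the path parameter on @dlt.table points the table's data at ABFSS
@dlt.table(
    name="my_bronze_table",
    path="abfss://<container>@<storageaccount>.dfs.core.windows.net/dlt/my_bronze_table",
)
def my_bronze_table():
    # Source location is also a placeholder
    return spark.read.format("json").load(
        "abfss://<container>@<storageaccount>.dfs.core.windows.net/raw/"
    )
```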

With that change the pipeline runs fine, and I can see the files at the path above, which I can also read from a notebook:
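For example (with the fs.azure.account conf set in that notebook session; the path is a placeholder):

```python
# Reading the DLT output path directly from a notebook (placeholder path)
df = spark.read.format("delta").load(
    "abfss://<container>@<storageaccount>.dfs.core.windows.net/dlt/my_bronze_table"
)
display(df)
```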

But when I go over to the SQL Editor and use the Serverless Starter Warehouse, I can't access the tables:

I know it's probably because I'm not running spark.conf.set("fs.azure.account..."), but how do I get around that? It would also be nice not to have to run those lines in all my notebooks; I'm guessing there's a way to add them to the cluster configuration or something?
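For reference, these are the sort of lines I mean (the storage account name and secret scope/key below are placeholders), which I currently run at the top of every notebook:

```python
# Storage-account-key auth for ABFSS -- account name and secret scope/key are placeholders
storage_account = "<storageaccount>"
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="<my-scope>", key="<storage-account-key>"),
)
```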
Before anyone suggests upgrading to Unity Catalog: that is indeed my plan, but I want to prove this works with the Hive metastore first.