Get Started Discussions

Unable to access external table created by DLT

tommyhmt
New Contributor II

I originally set the Storage location in my DLT pipeline to abfss://{container}@{storageaccount}.dfs.core.windows.net/...

[screenshot]

But when running the DLT pipeline I got the following error:

[screenshot]

So I decided to leave the Storage location above blank and define the path parameter in @dlt.table instead:

[screenshot]
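For reference, roughly what that table definition looks like (the table name, container, and storage account below are placeholders, not my real values):

    import dlt

    # Rough sketch of my table definition -- name, container and storage
    # account are placeholders; the path points at the same abfss location
    # I previously tried to use as the pipeline Storage location.
    @dlt.table(
        name="my_table",
        path="abfss://mycontainer@mystorageaccount.dfs.core.windows.net/dlt/my_table"
    )
    def my_table():
        # Source path is also a placeholder for wherever the raw data lives.
        return spark.read.format("json").load(
            "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/landing/"
        )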

With that change, the DLT pipeline runs fine and I can even see the files at the path above, which I can also read from a notebook:

[screenshot]
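For example, something like this works from a notebook (the path is a placeholder for the location DLT wrote to):

    # Reading the Delta files DLT wrote, from a notebook on a cluster that
    # has the fs.azure.account.* config set -- the path is a placeholder.
    df = spark.read.format("delta").load(
        "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/dlt/my_table"
    )
    display(df)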

But when I go over to the SQL Editor and use the Serverless Starter Warehouse, I can't access the tables:

[screenshot]

I know it's probably something to do with not running spark.conf.set("fs.azure.account..."), but how do I get around that? It would also be nice not to have to run those lines in every notebook; I'm guessing there's a way to add them to the cluster configuration or something?
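For context, these are the kind of lines I mean (the storage account, secret scope, and key names are placeholders):

    # What I currently run at the top of each notebook before touching abfss
    # paths -- storage account, secret scope and key names are placeholders.
    spark.conf.set(
        "fs.azure.account.key.mystorageaccount.dfs.core.windows.net",
        dbutils.secrets.get(scope="my-scope", key="storage-account-key")
    )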

Before anyone suggests upgrading to Unity Catalog: that is indeed my plan, but I want to at least prove this works with the Hive metastore first.

1 REPLY

brockb
Databricks Employee

Hi @Tommy ,

Thanks for your question.

I would encourage you to verify by temporarily using a Pro SQL Warehouse instead of a Serverless SQL Warehouse, given the compute differences between the two: Pro compute resides in your data plane, while Serverless compute is Databricks-managed. If it works as expected on a Pro Warehouse, that is a good indication that the issue lies in the network path to the Databricks-managed Serverless compute. If that turns out to be the case, docs such as this can guide you further on the Serverless setup: https://learn.microsoft.com/en-us/azure/databricks/admin/sql/serverless.

Additionally, Workspace-level SQL Warehouse configurations can be managed by a Workspace Admin via:

  • Settings / Workspace admin / Compute, then click "Manage" next to "SQL warehouses and serverless compute".
  • This is where you'd manage configs such as the `spark.conf.set("fs.azure.account...")` settings, as applicable, for all Warehouses in the workspace (a rough example follows below).
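As a rough sketch, an account-key entry in that data access configuration box could look like the following (placeholder storage account and secret names; filesystem configs in that box generally need the spark.hadoop. prefix):

    spark.hadoop.fs.azure.account.key.mystorageaccount.dfs.core.windows.net {{secrets/my-scope/storage-account-key}}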

Hope this helps
