Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to consume a Fabric Data Warehouse inside a Databricks notebook

javiomotero
New Contributor III

Hello,

I'm having a hard time figuring out (and finding the right documentation for) how to connect my Databricks notebook to consume tables from a Fabric Data Warehouse.

I've checked this, but it seems to work only with OneLake, and this, but I'm not seeing how to use the JDBC/ODBC driver with the Entra authenticator.
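
Roughly, what I'm trying to achieve would look something like the sketch below (untested; the endpoint host, warehouse name, service principal values, and token scope are all placeholders, and it assumes the Fabric SQL endpoint accepts an Entra ID access token the same way Azure SQL does):

# Untested sketch: read a Fabric warehouse table over JDBC with an Entra ID access token.
# Assumes the azure-identity package is installed and a service principal has access to the warehouse.
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id="<tenant_id>",
    client_id="<client_id>",
    client_secret="<client_secret>",
)
# Token scope is an assumption: same audience as Azure SQL
token = credential.get_token("https://database.windows.net/.default").token

jdbc_url = "jdbc:sqlserver://<sql_endpoint_host>:1433;database=<warehouse_name>;encrypt=true"

df = (spark.read.format("jdbc")
      .option("url", jdbc_url)
      .option("dbtable", "dbo.<table_name>")
      .option("accessToken", token)  # passed through to the SQL Server JDBC driver
      .load())
display(df)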

Any hints on what I should be looking at?

Thank you

4 REPLIES

Walter_C
Databricks Employee

I will look into this. I was checking the Lakehouse Federation options, but it seems that Fabric is not included as a supported database system.

Walter_C
Databricks Employee

Our team has confirmed that making this direct connection is not recommended. As of now, the recommended approach is to use ADF or FDF to copy the data out into ADLS and then govern that ADLS storage with Unity Catalog (UC).
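
For illustration, once the copied Delta data lands in ADLS, registering it in Unity Catalog could look roughly like this (sketch only; the catalog, schema, table, and storage path are placeholders, and it assumes a UC external location already covers that container):

# Sketch: register Delta data copied to ADLS as a Unity Catalog external table.
# Placeholder names throughout; an external location / storage credential for the container must already exist.
spark.sql("""
  CREATE TABLE IF NOT EXISTS my_catalog.my_schema.fabric_copy
  USING DELTA
  LOCATION 'abfss://<container>@<storage_account>.dfs.core.windows.net/<path_to_table>'
""")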

javiomotero
New Contributor III

Hi, thank you. There were some issues with my permissions (apparently on the Dataverse resource, not the Datalake), which is why I was not able to query successfully using abfss. Just for the future, I was able to query the Datalake directly (not the endpoint) using

spark.read.format('delta').load('abfss://<workspace_id>@onelake.dfs.fabric.microsoft.com/<lakehouse_id>/Tables/dbo/<table_name>')

You can also write there, and it should show up successfully in the endpoint unless this happens.
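
For example, a write back to the same Tables path would look roughly like this (same placeholder path as the read above):

# Sketch: write a DataFrame back to the lakehouse Tables path (placeholders as above).
(df.write.format('delta')
   .mode('append')
   .save('abfss://<workspace_id>@onelake.dfs.fabric.microsoft.com/<lakehouse_id>/Tables/dbo/<table_name>'))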
This was also quite useful documentation.

 

javiomotero
New Contributor III

Hello, I would like a few more options regarding reading Views. Using abfss is fine for reading tables, but I don't know how to load Views, which are visible in the SQL Endpoint. Is there any alternative for connecting to Fabric that makes it possible to read views from a datalake?
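
What I have been considering is reusing the JDBC route from my original post and pointing it at a view instead of a table, roughly like this (untested; the view name is a placeholder, and jdbc_url / token come from the sketch in my first post):

# Untested sketch: views are only visible through the SQL endpoint, so read them over JDBC
# using the same jdbc_url and token as in the earlier sketch; the view name is a placeholder.
views_df = (spark.read.format("jdbc")
            .option("url", jdbc_url)
            .option("query", "SELECT * FROM dbo.<view_name>")
            .option("accessToken", token)
            .load())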

Thank you