How to consume a Fabric Data Warehouse inside a Databricks notebook
01-14-2025 11:09 AM
Hello,
I'm having a hard time figuring out (and finding the right documentation on) how to connect my Databricks notebook to consume tables from a Fabric Data Warehouse.
I've checked this, but it seems to work only with OneLake, and this, but I'm not seeing how to use the JDBC/ODBC driver with the Entra ID authenticator.
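What I had in mind is roughly the following (just a sketch on my side; the SQL connection string placeholder, the token scope, and the accessToken option are assumptions I have not been able to verify):

# Rough sketch of the JDBC route I had in mind (not verified):
# acquire an Entra ID token for the SQL resource, then hand it to the
# SQL Server JDBC driver via the accessToken connection property.
from azure.identity import ClientSecretCredential  # may need to be installed on the cluster

credential = ClientSecretCredential(
    tenant_id='<tenant_id>',
    client_id='<client_id>',
    client_secret='<client_secret>',
)
token = credential.get_token('https://database.windows.net/.default').token

# <sql_connection_string> would be the SQL connection string shown in the warehouse settings in Fabric
jdbc_url = 'jdbc:sqlserver://<sql_connection_string>;database=<warehouse_name>;encrypt=true'

df = (
    spark.read.format('jdbc')
    .option('url', jdbc_url)
    .option('driver', 'com.microsoft.sqlserver.jdbc.SQLServerDriver')
    .option('query', 'SELECT * FROM dbo.<table_name>')
    .option('accessToken', token)  # passed through as a connection property to the driver
    .load()
)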
Any hints on what I should be looking at?
Thank you
01-14-2025 11:26 AM - edited 01-14-2025 11:26 AM
I will look into this. I was checking the Lakehouse Federation options, but it seems that Fabric is not included as a supported database system.
01-14-2025 11:47 AM
Our team has confirmed that this direct connection is not recommended. As of now, the recommended solution is to use ADF or FDF to copy that data out into ADLS and then govern that ADLS storage with Unity Catalog (UC).
01-26-2025 02:42 AM
Hi, thank you. There were some issues with my permissions (apparently on the Dataverse resource, not the data lake), which is why I was not able to query successfully using abfss. For future reference, I was able to query the data lake directly (not the SQL endpoint) using
spark.read.format('delta').load('abfss://<workspace_id>@onelake.dfs.fabric.microsoft.com/<lakehouse_id>/Tables/dbo/<table_name>')
You can also write there, and the table should show up successfully in the endpoint unless this happens.
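For example, a write back to the same location would be along these lines (just a sketch; <new_table_name> is only a placeholder):

# Sketch: writing a DataFrame back into the lakehouse Tables area over abfss
# (same workspace/lakehouse placeholders as above; <new_table_name> is just an example name)
df.write.format('delta').mode('overwrite').save(
    'abfss://<workspace_id>@onelake.dfs.fabric.microsoft.com/<lakehouse_id>/Tables/dbo/<new_table_name>'
)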
This documentation was also quite useful.
01-29-2025 02:51 AM
Hello, I would like a few more options for reading views. Using the abfss path is fine for reading tables, but I don't know how to load views, which are visible in the SQL endpoint. Is there an alternative way to connect to Fabric and read views from the data lake?
Thank you

