
In Azure Databricks Delta Lake, not able to see Unity Catalog databases or tables in the drop-down.

RamaTeja
New Contributor II

I have created an Azure Data Factory pipeline with a Copy Data activity to copy data from an ADLS path to a Delta table. In the Delta table drop-downs I can see only the hive_metastore databases and tables; the Unity Catalog tables are not listed. Can anyone please help me with this?

14 REPLIES

Anonymous
Not applicable

@Rama Teja Reddy Damireddy:

To see the Unity Catalog databases and tables in the drop-down in Azure Databricks Delta Lake, you need to configure the integration between Unity Catalog and Databricks. Here are the steps:

  1. Navigate to your Databricks workspace and click on the "Admin Console" button on the left-hand side of the screen.
  2. Click on the "Data" tab and then click on "Connect" next to Unity Catalog.
  3. Enter your Unity Catalog credentials and click "Connect".
  4. You should now see your Unity Catalog databases and tables in the drop-downs in Delta Lake.

If you have already followed these steps and still do not see the Unity Catalog databases and tables, check whether the user account you are using has the necessary permissions to access Unity Catalog (a quick check is sketched below).
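
If it helps, here is a minimal sketch of that check, assuming you run it in a Databricks notebook (where spark is predefined); the catalog name main is a placeholder, not something from your workspace:

    # Quick Unity Catalog visibility check from a Databricks notebook.
    # "main" is a placeholder catalog name; substitute your own.
    spark.sql("SHOW CATALOGS").show()                # catalogs visible to this user
    spark.sql("SHOW SCHEMAS IN main").show()         # schemas in that catalog
    spark.sql("SHOW GRANTS ON CATALOG main").show()  # privileges granted on it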

Anonymous
Not applicable

I also have the same question. I don't think you understand what is being asked: within Azure Data Factory, when creating an Azure Databricks Delta Lake dataset, there does not appear to be a way to select a database (schema) that is not in the hive_metastore catalog. See the attached screenshot. Is UC not currently supported? I've sent the same question to Microsoft.

NK_123
New Contributor II

Hi, were you able to solve the problem? I am also trying to do the same.

Anonymous
Not applicable

Hi @Rama Teja Reddy Damireddy,

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance! 

cmunteanu
Contributor

Hi, 

Exact same problem here. Any updates, please?

Thanks

latteuro
New Contributor II

Hi, 

I have the same issue.

Additional information: the linked service created in Azure Data Factory using the Azure Databricks Delta Lake connector uses a system-assigned managed identity rather than an access token.

Could we have an update? 

Thank you in advance. 

Brammer88
New Contributor III

Dear community, 

I am facing the same issue here.

My workaround is to just create an Azure Databricks activity in my ADF pipeline and use code inside a notebook to copy data from tables inside Unity Catalog to ADLS Gen2 (see the sketch below), but this is, in my opinion, less ideal than just using the Azure Databricks Delta Lake connector. Another workaround could be a view in the hive metastore that references tables in Unity Catalog, which is also not a very pretty solution.
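
For illustration, a minimal sketch of what that notebook could look like; the catalog, schema, table, and storage names below are placeholders, not my actual setup:

    # Copy a Unity Catalog table out to ADLS Gen2 as Delta files.
    # "main.sales.orders" and the abfss:// path are hypothetical placeholders.
    df = spark.read.table("main.sales.orders")
    (df.write
       .format("delta")
       .mode("overwrite")
       .save("abfss://landing@mystorageaccount.dfs.core.windows.net/exports/orders"))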

Is there some other solution that you would prescribe as best practice for this use case?

Thanks,

Bram

cmunteanu
Contributor

Hi,

I need to ingest data into Databricks Unity Catalog using ADF, so my workaround is to use object storage (ADLS Gen2) as a landing zone, mounted into Databricks under /mnt. I use ADF to copy data (incrementally) to this landing zone first, then use Auto Loader in Databricks to take this data into the Bronze zone, which is under Unity Catalog. Of course, one would expect to be able to perform this ingestion directly into Unity Catalog schemas using ADF, but that seems impossible so far.
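
For reference, the Auto Loader step looks roughly like the sketch below; the paths and the catalog, schema, and table names are placeholders for my setup:

    # Incrementally pick up new files from the mounted landing zone and load
    # them into a Unity Catalog Bronze table; all names here are placeholders.
    (spark.readStream
          .format("cloudFiles")
          .option("cloudFiles.format", "parquet")
          .option("cloudFiles.schemaLocation", "/mnt/landingzone/_schemas/orders")
          .load("/mnt/landingzone/orders")
          .writeStream
          .option("checkpointLocation", "/mnt/landingzone/_checkpoints/orders")
          .trigger(availableNow=True)
          .toTable("main.bronze.orders"))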

Thanks

ChrisK
New Contributor II

I'm also experiencing the same issue. The article linked below describes a method for getting access to Unity Catalog through an ODBC linked service that connects via a Databricks SQL Warehouse. The article also indicates that direct Unity Catalog access via a Databricks Delta Lake linked service is limited, depending on the security setup of the storage accounts used by Delta/Databricks. This isn't an optimal setup, IMO, but an acceptable workaround.

Note from the trenches: Securely read or write data from Databricks Unity Catalog using Azure Data F...

StdyFriend1
Databricks Employee

Hi all,

Please try the following syntax in the database field:

<catalog_name>`.`<database_name>

It is important that you use a backtick (`), not a single quote (').

This should force the connector to use a Unity Catalog database.
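
For what it's worth, a hypothetical illustration of why this presumably works: the connector appears to wrap the database field value in backticks when it builds the query it sends to Databricks, so embedding `.` in the value turns it into a fully qualified, quoted catalog.schema identifier. The names below (main, sales, orders) are placeholders, and the generated query is an assumption, not confirmed ADF internals:

    # Hypothetical reconstruction of how the connector quotes the field;
    # "main", "sales", and "orders" are placeholder names, and the exact
    # query ADF generates is an assumption, not a confirmed internal.
    database_field = "main`.`sales"                       # what you type in ADF
    query = f"SELECT * FROM `{database_field}`.`orders`"  # connector adds the outer backticks
    print(query)  # SELECT * FROM `main`.`sales`.`orders`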

ChrisK
New Contributor II

Woo hoo! Thank you so much for this suggestion. It is working now.

This seems like a bug in Azure Data Factory when it constructs the query to send to Databricks; the linked service connector must have been written pre-Unity Catalog. We should be able to specify <catalog_name>, <database_name>, and <table_name> as separate parameters when configuring the linked service.

Jaganmalli
New Contributor II

ChrisK, are you using a system-assigned managed identity or an access token?

That's crazy, but it just works! Thanks so much. Microsoft should include this trick in its documentation.

@StdyFriend1 I have tried your solution, but no luck. I am using a system-assigned managed identity in the linked service created in Azure Data Factory with the Azure Databricks Delta Lake connector.
