04-26-2023 02:01 AM
I have created an Azure Data Factory pipeline with a Copy Data activity to copy data from an ADLS path to a Delta table. In the Delta table drop-downs I can see only the hive_metastore databases and tables; the Unity Catalog tables are not listed. Can anyone please help me with this?
04-26-2023 10:07 PM
@Rama Teja Reddy Damireddy:
To see Unity Catalog databases and tables in the drop-down in Azure Databricks Delta Lake, you need to configure the integration between Unity Catalog and your Databricks workspace.
If you have already done this and are still not seeing the Unity Catalog databases and tables, check whether the user account you are using has the necessary permissions to access Unity Catalog.
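As a hedged illustration of the permissions check mentioned above: in Unity Catalog, the identity used by the connector would typically need USE CATALOG, USE SCHEMA, and SELECT privileges on the objects involved. The catalog, schema, table, and principal names below are placeholders, not values from this thread.

```python
# Hedged example: Unity Catalog privileges the ADF identity would typically need.
# All object and principal names below are placeholders -- replace with your own.
grants = [
    "GRANT USE CATALOG ON CATALOG my_catalog TO `user@example.com`",
    "GRANT USE SCHEMA ON SCHEMA my_catalog.my_schema TO `user@example.com`",
    "GRANT SELECT ON TABLE my_catalog.my_schema.my_table TO `user@example.com`",
]

# In a Databricks notebook (where `spark` is provided by the runtime), these
# would be executed as:
# for g in grants:
#     spark.sql(g)
```

The privileges are hierarchical: SELECT on a table is only effective if the principal also has USE CATALOG and USE SCHEMA on its parents.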
04-28-2023 06:20 AM
I also have the same question, and I don't think the response above addresses what is being asked. Within Azure Data Factory, when creating an Azure Databricks Delta Lake dataset, there does not appear to be a way to select a database (schema) that is not in the hive_metastore catalog. See attached screenshot. Is Unity Catalog not currently supported? I've sent the same question to Microsoft.
07-10-2023 04:17 AM
Hi, were you able to solve the problem? I am also trying to do the same.
04-27-2023 12:49 AM
Hi @Rama Teja Reddy Damireddy
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance!
01-05-2024 03:14 AM
Hi,
Exact same problem here, any updates please?
Thanks
01-10-2024 06:29 AM
Hi,
I have the same issue.
Additional information: the linked service created in Azure Data Factory using the Azure Databricks Delta Lake connector uses a system-assigned managed identity rather than a token.
Could we have an update?
Thank you in advance.
a week ago
Dear community,
I am facing the same issue here.
My workaround is to create an Azure Databricks activity in my ADF pipeline and use code inside a notebook to copy data from tables inside Unity Catalog to ADLS Gen2, but in my opinion this is less ideal than just using the Azure Databricks Delta Lake connector. Another workaround could be a view in the hive_metastore that references tables in Unity Catalog, also not a very pretty solution.
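The notebook workaround described above can be sketched roughly as follows. The catalog/schema/table names and the storage account and container are placeholders (assumptions for illustration), not values from this thread; the Spark calls are shown as comments because they only run inside a Databricks notebook where `spark` is provided by the runtime.

```python
# Minimal sketch of a notebook that copies a Unity Catalog table to ADLS Gen2.
# All names (my_catalog, my_schema, my_table, landing, mystorageacct) are placeholders.

def adls_path(container: str, account: str, folder: str) -> str:
    """Build an abfss:// URI for an ADLS Gen2 location."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{folder}"

target = adls_path("landing", "mystorageacct", "exports/my_table")

# Inside a Databricks notebook:
# df = spark.read.table("my_catalog.my_schema.my_table")
# df.write.format("delta").mode("overwrite").save(target)
```

The ADF pipeline then invokes this notebook through a Databricks Notebook activity, with authentication to the storage account handled by the cluster's credentials.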
Is there some other solution that you would recommend as best practice for this use case?
Thanks,
Bram
a week ago
Hi,
I need to ingest data into Databricks Unity Catalog using ADF, so my workaround is to use object storage (ADLS Gen2) as a landing zone, mounted into Databricks under /mnt. I use ADF to copy data (incrementally) into this landing zone first, then use Auto Loader in Databricks to load it into a Bronze layer managed by Unity Catalog. Of course, one would expect to perform this data ingestion directly into Unity Catalog schemas using ADF, but this seems impossible so far.
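The landing-zone-plus-Auto-Loader flow described above could look roughly like the sketch below. The mount point, file format, checkpoint location, and target table name are all placeholder assumptions; the streaming calls are shown as comments because they require a Databricks runtime with `spark` available.

```python
# Hedged sketch: ADF drops files into a mounted landing zone, then Auto Loader
# (cloudFiles) ingests them into a Unity Catalog Bronze table.
# All paths and names below are placeholders.

landing = "/mnt/landing/orders"                    # written by the ADF copy activity
checkpoint = "/mnt/landing/_checkpoints/orders"    # Auto Loader state + schema location
bronze_table = "my_catalog.bronze.orders"          # three-level Unity Catalog name

# In a Databricks notebook:
# stream = (spark.readStream.format("cloudFiles")
#           .option("cloudFiles.format", "parquet")
#           .option("cloudFiles.schemaLocation", checkpoint)
#           .load(landing))
# (stream.writeStream
#        .option("checkpointLocation", checkpoint)
#        .trigger(availableNow=True)
#        .toTable(bronze_table))
```

Using `trigger(availableNow=True)` lets the notebook run as a batch-style job from ADF: it processes whatever files have arrived since the last run and then stops.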
Thanks