Migrating data from hive metastore to unity catalog. data workflow is handled in fivetran
01-16-2025 11:01 PM
So, in a Unity Catalog (UC) migration project:
We have a Fivetran connection that handles most of the ETL processes and writes data into the Hive metastore.
We have migrated the Fivetran-related schemas to UC.
The workspace where Fivetran was running had its default catalog set to hive_metastore; we have updated it to our UC catalog, and on the compute that runs the Fivetran-related ETL work we have also set the default catalog to our UC catalog.
Do we have to make any changes on the Fivetran side as well? I'm not aware of any. Or is that all we need to do to handle this migration?
01-17-2025 12:18 AM
Hi @ashraf1395
I can think of the following:
- Fivetran needs to be aware of the new catalog structure. This typically involves updating the destination settings in Fivetran to point to the Unity Catalog: navigate to the destination settings for your Databricks connection and set the catalog there.
- Ensure that Fivetran has the necessary permissions to read from and write to the Unity Catalog: it must be able to create, read, update, and delete tables and schemas within the UC catalog.
- You may also need to check that each individual Fivetran connector that writes to Databricks is configured to use the Unity Catalog.
Sample queries for UC (note that Unity Catalog uses the privileges USE CATALOG, USE SCHEMA, and MODIFY, rather than the legacy Hive-style USAGE/INSERT/UPDATE/DELETE, and principal names are quoted with backticks):
-- Grant permissions to the service principal or user account that Fivetran uses
GRANT USE CATALOG ON CATALOG your_uc_catalog TO `fivetran_user`;
GRANT USE SCHEMA, CREATE TABLE ON SCHEMA your_uc_catalog.your_schema TO `fivetran_user`;
GRANT SELECT, MODIFY ON TABLE your_uc_catalog.your_schema.your_table TO `fivetran_user`;
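Once the grants are in place, a quick sanity check is to run a few queries from the compute that Fivetran uses (a sketch, assuming the same placeholder names `your_uc_catalog`, `your_schema`, and `fivetran_user` as above):

```sql
-- Confirm which catalog the session resolves unqualified names to;
-- this should return your UC catalog, not hive_metastore
SELECT current_catalog();

-- Confirm the Fivetran principal's privileges on the catalog
SHOW GRANTS `fivetran_user` ON CATALOG your_uc_catalog;

-- Confirm that Fivetran-managed tables are landing in UC
SHOW TABLES IN your_uc_catalog.your_schema;
```

If `current_catalog()` still returns hive_metastore on the Fivetran compute, the default-catalog setting on that compute (or the workspace) has not taken effect.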

