Issue while migrating from Hive metastore to UCX
12-06-2024 02:24 AM
- Tables: 4 tables in the schema `databricks_log` are not being migrated. The error says they are in the DBFS root location and cannot be managed, while they are actually in a DBFS mount location.
For example, the table model_notebook_logs has its location under dbfs:/mnt, but the table_migration workflow logs still show this error.
I have created external locations for all these tables with the correct paths.
The 3 other tables show the exact same error.
What could be the possible reason for this?
- Labels: Spark
12-06-2024 03:23 AM
@ashraf1395
To migrate managed tables stored at the DBFS root to UC, you can use Deep Clone or Create Table As Select (CTAS). This also means the HMS table data needs to be moved to a cloud storage location governed by UC. Please ensure the target S3/ABFS path is anything other than the metastore root storage path.
Deep clone: https://docs.databricks.com/sql/language-manual/delta-clone.html
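A minimal sketch of both options, assuming a hypothetical source table hive_metastore.databricks_log.model_notebook_logs and a hypothetical UC target main.databricks_log.model_notebook_logs (adjust catalog, schema, and table names to your setup):

```sql
-- Option 1: Deep clone the HMS managed table into a UC managed table
CREATE TABLE main.databricks_log.model_notebook_logs
DEEP CLONE hive_metastore.databricks_log.model_notebook_logs;

-- Option 2: CTAS, which rewrites the data into the UC-governed storage location
CREATE TABLE main.databricks_log.model_notebook_logs
AS SELECT * FROM hive_metastore.databricks_log.model_notebook_logs;
```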
12-06-2024 05:04 AM
Got the idea about locations, though I was talking about external tables.
Also, how do I do this correctly in UCX? The table migration workflow there has specific default methods for migration. Is there any way to add custom strategies to the workflow?
For example, for DBFS root tables it has a default strategy of doing a deep clone into a managed table. But what if I want to do a deep clone where the source table is managed and the destination table is external? We can do it manually by adding the external location in the query (as sketched below), but is there any way to do it in UCX?
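For reference, a minimal sketch of the manual query, assuming hypothetical catalog, schema, table, and storage-path names (not the query UCX itself generates):

```sql
-- Deep clone the HMS managed table into a UC external table by specifying
-- LOCATION; the path must fall under a registered UC external location
CREATE TABLE main.databricks_log.model_notebook_logs
DEEP CLONE hive_metastore.databricks_log.model_notebook_logs
LOCATION 'abfss://<container>@<storage-account>.dfs.core.windows.net/databricks_log/model_notebook_logs';
```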

