How to migrate Git repos with DLT configurations
04-22-2024 05:52 AM
Hi!
I want to migrate all my Databricks-related code from one GitHub repo to another. I knew this wouldn't be straightforward. When I copied the code for one DLT pipeline, I got the error:
org.apache.spark.sql.catalyst.ExtendedAnalysisException: Table 'vessel_battery' is already managed by pipeline <pipeline ID>. A table can only be owned by one pipeline. Concurrent pipeline operations such as maintenance and full refresh will conflict with each other. Please rename the table 'vessel_battery' to proceed.
The tables hold multiple terabytes of data and must not be removed or overwritten. Is it possible to transfer ownership of these tables? All DLT configs are set up using Databricks Asset Bundles.
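For context, a DLT pipeline defined through Databricks Asset Bundles typically looks something like this in `databricks.yml` (the resource, schema, and notebook names here are illustrative placeholders, not taken from the actual repo):

```yaml
# Hypothetical bundle config; names and paths are placeholders.
resources:
  pipelines:
    vessel_battery_pipeline:
      name: vessel-battery-pipeline
      target: main_schema          # schema the pipeline publishes tables to
      libraries:
        - notebook:
            path: ../src/vessel_battery_dlt.py
```

Deploying this from the new repo creates a new pipeline with a new pipeline ID, which is why DLT refuses to manage `vessel_battery`: the table is still registered to the pipeline ID created from the old repo.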
04-22-2024 06:27 AM
Would cloning the tables help your cause? You could try a shallow or deep clone, depending on your requirements.
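For reference, a clone of a Delta table can be created in Databricks SQL like this (the schema and target table names are placeholders):

```sql
-- Deep clone: copies the data files; the clone is independent of the source.
CREATE TABLE IF NOT EXISTS main_schema.vessel_battery_clone
DEEP CLONE main_schema.vessel_battery;

-- Shallow clone: copies only metadata and references the source's data files,
-- so it is much faster but remains dependent on the source table's files.
-- CREATE TABLE main_schema.vessel_battery_shallow
-- SHALLOW CLONE main_schema.vessel_battery;
```

On multi-terabyte tables a deep clone still has to copy the data, so it takes real time, whereas a shallow clone is nearly instantaneous but shares data files with the source.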
04-29-2024 01:24 AM
Does cloning take considerably less time than recreating the tables?
Can I resume append operations to a cloned table?

